Social Networking

Recently in class, we had a visit from social networking/media expert Wayne McPhail. He discussed Web 2.0 and some associated concepts. Web 2.0 is a buzzword I’ve heard a lot recently, but I don’t think I really knew what it was all about until Wayne explained it.

Basically, Web 2.0 is a marketing term that refers to several current trends in web-based media. Web 2.0 encourages community, collaboration and shared content; focuses on single tasks; uses clean, clear interfaces; supports tagging and social bookmarking; and moves data and applications from the desktop to the web.

Social bookmarking moves personal bookmarks into public space. It makes use of tagging, which means creating ad hoc keywords that are non-hierarchical. Social bookmarking creates collective intelligence and collective editing. A prime example of a site dedicated to social bookmarking is del.icio.us, a collection of individual people’s personal favourite articles, blogs, music, reviews, recipes, etc., that can be shared with friends, coworkers, family and the del.icio.us community. The result is that everything on del.icio.us is someone’s favourite, which promotes community: it helps you remember your favourites and share them with others who may benefit from them, and you can access your favourites from any computer on the web.
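Under the hood, a tag-based index is really just a many-to-many mapping with no folder hierarchy. Here is a minimal sketch of that idea (a toy class of my own, not anything from del.icio.us):

```
import java.util.*;

// A minimal sketch of a tag index: every bookmark can carry any ad hoc tags,
// and the same URL can be reached through many tags (no hierarchy).
public class TagIndex {
    private final Map<String, Set<String>> urlsByTag = new HashMap<>();

    public void bookmark(String url, String... tags) {
        for (String tag : tags) {
            urlsByTag.computeIfAbsent(tag.toLowerCase(), t -> new TreeSet<>()).add(url);
        }
    }

    public Set<String> lookup(String tag) {
        return urlsByTag.getOrDefault(tag.toLowerCase(), Collections.emptySet());
    }

    public static void main(String[] args) {
        TagIndex shared = new TagIndex();
        shared.bookmark("http://example.com/pad-thai", "recipes", "thai", "dinner");
        shared.bookmark("http://example.com/poutine", "recipes", "canadian");
        System.out.println(shared.lookup("recipes")); // both URLs show up under "recipes"
    }
}
```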

Another tool that Wayne discussed is Jaiku. Jaiku allows you to create your own activity stream where you can post Jaikus, add icons, customize your design, and share your web feeds. The result is an aggregate of all the flakes of your presence on the web: every time you post a photo or video, or comment on Twitter, it gets collected into one RSS feed. You can also see what your friends are up to; you can see their availability, location, and calendar events if they have Jaiku Mobile on their phone.
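I don’t know how Jaiku actually does this behind the scenes, but the basic idea of feed aggregation – merging items from several sources into one reverse-chronological stream – is easy to sketch. The class and sample items below are made up, and the fetching and parsing of real RSS is omitted:

```
import java.time.Instant;
import java.util.*;
import java.util.stream.*;

// Rough sketch of feed aggregation: items from several sources merged into
// one reverse-chronological stream.
public class Lifestream {
    record Item(String source, Instant posted, String title) {}

    static List<Item> aggregate(List<List<Item>> feeds) {
        return feeds.stream()
                .flatMap(List::stream)
                .sorted(Comparator.comparing(Item::posted).reversed())
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Item> flickr = List.of(new Item("Flickr", Instant.parse("2008-03-12T18:00:00Z"), "New photo"));
        List<Item> twitter = List.of(new Item("Twitter", Instant.parse("2008-03-13T09:30:00Z"), "Off to class"));
        aggregate(List.of(flickr, twitter))
                .forEach(i -> System.out.println(i.posted() + "  [" + i.source() + "]  " + i.title()));
    }
}
```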

Another social media tool that Wayne showed us is the Mogulus player. Mogulus allows you to start a live broadcast from a webcam by creating your own broadcast channel. The prospect of broadcasting live, in real time, directly onto the web is extremely exciting. At any given time there are 26 different live broadcasts running simultaneously within the Mogulus grid. You can even share your channel with others by embedding it on your blog or Facebook profile.

A tool that works really well in conjunction with Mogulus is Qik, which lets you stream live video directly from your cellphone. Currently, they only support Nokia S60 phones; if you have one of those phones, you can stream video directly from your phone to the web. You can even stream directly to friends on Facebook or Twitter, or to your channel on Mogulus.

Wayne also made some interesting points about the rules of social networking and social media. The same basic rules and values that apply to society, such as awareness, education and involvement, also apply to social networking. You should participate in a social network rather than using it to sell stuff. This is where one could criticize perhaps the biggest online social network of them all – Facebook. Many people have signed up on Facebook to share pictures, create events and reconnect with old friends, but there is no doubt that it is being heavily plagued by excessive advertising. You definitely can’t just enjoy a nice game of Scrabulous without seeing a billion ads. The take-home message is that social media is a conversation, not a broadcast. It is like an ecosystem where everyone has a niche, and their footprint in that niche can impact others either directly or indirectly. The ecosystem has a much better chance of proliferating if the organisms within it obey the rules of give and take, so that it does not become a monoculture.

March 13, 2008 at 4:30 pm

Dinosaurs and Hydraulophones

In this post I will be covering everything from the prehistoric age to the post-cyborg age. Once again, my IMM class at Sheridan College took a field trip to Toronto. Our first stop was the Royal Ontario Museum where we were guided through some of their interactive displays by the Director of New Media, Brian Porter.

We spent the first part of the tour in the digital gallery, where we were able to sit at interactive computer stations equipped with touch screens. We could view either the Ancient Egypt exhibit or the Canadian Heritage exhibit, surf through different artifacts from the selected exhibit, read a blurb of info about each one, and zoom in on, pan across, and turn each object. I can see it being quite enjoyable for kids! Also in the digital gallery there was a large film screen with a three-camera projector, and we got to watch the beginning of an educational film about Canadian heritage that was made entirely by staff at the ROM. It was quite professional and had some interesting transitions due to the ability to edit the film differently for each camera.

Next we moved to the Dinosaur Exhibit, which was really quite stunning. Although the intention was for us to examine the interactive computer screens, which displayed video and fun dinosaur facts, personally I found the real bones and fossils on display far more interesting and awe-inspiring. There were no major problems with the interactive screens, but we’re used to seeing dinosaurs digitally; we’re just not used to seeing their real (though dead) physical presence, which has much more impact.

After the dinos, we proceeded to check out the office of New Media staff member Zack. He has the coolest job in the world: he gets to make and record music, write a little ActionScript, and edit films, all in one cozy little office.

After the ROM we went on a slushy walk through the streets of the University of Toronto campus to meet Professor Steve Mann. This man defines the term “mad scientist” – he’s got the workspace of a mad inventor, with all kinds of random stuff cluttered everywhere, including a baby stroller equipped with an eject button! He is well known for being the world’s first cyborg because of the strange eyepiece he wears (see photo below), which allows him to experience computer-mediated reality.

Steve Mann

Steve Mann had some pretty interesting things to say about water. He has built a musical instrument called the hydraulophone, which makes beautiful, keyboard-like sounds through direct physical contact with hydraulic fluid (water). The largest hydraulophone in the world (the FUNtain) can be found outside the Ontario Science Centre; it was made by Steve Mann and his colleagues. Apparently Steve Mann has also been able to make something similar to MIDI, which he calls fluidi, so you can use the hydraulophone to control sounds that would come from different musical instruments. Pretty amazing! He even talked about being connected over the internet and using a hydraulophone here to spray a stream of water in Australia!
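I have no idea how fluidi works internally, but the general idea of turning a water-jet reading into notes on different instruments can be sketched with Java’s built-in MIDI support. The jet readings and note mapping below are entirely made up:

```
import javax.sound.midi.*;

// Sketch: pretend each water jet on a hydraulophone reports how much it is
// blocked (0.0-1.0); map that to a MIDI note and velocity on a chosen instrument.
// This only illustrates the MIDI side, not Steve Mann's fluidi protocol.
public class WaterToMidi {
    public static void main(String[] args) throws Exception {
        Synthesizer synth = MidiSystem.getSynthesizer();
        synth.open();
        MidiChannel channel = synth.getChannels()[0];
        channel.programChange(73); // 73 = flute in General MIDI, a vaguely watery timbre

        int[] jetNotes = {60, 62, 64, 65, 67};        // one note per jet, a C major-ish scale
        double[] blockage = {0.0, 0.8, 0.0, 0.3, 0.9}; // made-up sensor readings

        for (int jet = 0; jet < jetNotes.length; jet++) {
            if (blockage[jet] > 0.2) {                 // only sound jets that are pressed enough
                int velocity = (int) (blockage[jet] * 127);
                channel.noteOn(jetNotes[jet], velocity);
            }
        }
        Thread.sleep(1500);
        channel.allNotesOff();
        synth.close();
    }
}
```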

All in all, it was a pretty fun and inspiring day. We finished off by checking out the lobby of advertising agency Cossette. With a little persistence, perhaps we can hold our IMM open house there!

Below is a short video of some people enjoying the hydraulophone at Nuit Blanche.

February 11, 2008 at 11:17 pm

Mobile Technology

Hello! This is my first post for 2008, and we’ve now begun our second term of the Interactive Multimedia program at Sheridan College. Last term in multimedia pioneering we focused on big, physical interactivity like motion capture, interactive walls and surfaces. This term, the focus has shifted to small interactivity, which leads me to the topic of this post – mobile technology.

We had a visit from internet architect James Eberhardt today. He was a very captivating guest speaker and gave a presentation on some emerging technology with respect to mobile devices. Of course, the most exciting and talked-about mobile device today is the Apple iPhone. Personally, I can’t wait to be gainfully employed again just so I can afford to buy one of these hot gadgets! Its features include a touch screen where you can simply tap on a friend’s name to call them, and it also allows you to sync your entire contacts list with your Mac or PC. You can even choose the order in which you want to listen to your voicemail. Beyond its phone features, you can also listen to music and podcasts and watch videos easily by syncing to your iTunes library. You can even surf the net via a Wi-Fi connection, so it’s simple and cost-effective to check email, browse websites and even upload pictures taken with its built-in camera.

Speaking of uploading pictures, James also showed us a really cool application called ShoZu, which takes the GPS information from your mobile phone and attaches it when you upload a picture from the phone. Then, when you view your pictures on Flickr, you can see on a map exactly where each picture was taken. Very cool for people who are travelling, because it is sometimes difficult to remember where some pictures were taken; this way the application takes care of remembering for you.
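I don’t know ShoZu’s internals, but conceptually the photo just needs to carry the phone’s GPS fix along with it so a map can be drawn later. A toy sketch of that pairing follows; the class is mine, and the real services do this through EXIF data and the Flickr API, which aren’t shown:

```
// Sketch of geotagging: a photo paired with the phone's GPS fix, turned into a
// map link. Names and values below are illustrative only.
public class GeotaggedPhoto {
    final String filename;
    final double latitude;
    final double longitude;

    GeotaggedPhoto(String filename, double latitude, double longitude) {
        this.filename = filename;
        this.latitude = latitude;
        this.longitude = longitude;
    }

    String mapLink() {
        // A generic OpenStreetMap link centred on where the shot was taken.
        return String.format("https://www.openstreetmap.org/?mlat=%.5f&mlon=%.5f#map=15/%.5f/%.5f",
                latitude, longitude, latitude, longitude);
    }

    public static void main(String[] args) {
        GeotaggedPhoto photo = new GeotaggedPhoto("cntower.jpg", 43.64256, -79.38710);
        System.out.println(photo.filename + " was taken here: " + photo.mapLink());
    }
}
```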

In terms of developing and prototyping mobile phone applications, James discussed a few important points. Firstly, much to my chagrin, it turns out that Flash Lite is not AS3-compatible! James demonstrated how useful and cost-effective it can be for prototyping, because you can test your application on several virtual cell phone platforms. However, there are some other programming languages that are more commonly used for development. One of these is the open-source Mobile Processing, a Java-based language with the same design goals as the Processing project. The project was created to teach fundamentals of computer programming within a visual context and to serve as a software sketchbook and professional production tool.
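To give a flavour of those design goals, here is a tiny sketch in standard Processing syntax, which Mobile Processing mirrors. I haven’t run this on an actual handset, so treat it as illustrative:

```
// A tiny sketch in standard Processing syntax (Mobile Processing follows the
// same model): setup() runs once, draw() loops, and key presses move a ball.
int x, y;

void setup() {
  size(240, 320);          // roughly a 2008-era phone screen
  x = width / 2;
  y = height / 2;
}

void draw() {
  background(0);
  fill(255, 200, 0);
  ellipse(x, y, 20, 20);
}

void keyPressed() {
  if (keyCode == LEFT)  x -= 5;
  if (keyCode == RIGHT) x += 5;
  if (keyCode == UP)    y -= 5;
  if (keyCode == DOWN)  y += 5;
}
```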

Perhaps most exciting of all the things James talked about is the potential of QR codes. These are two-dimensional barcodes that store small pieces of text, such as addresses and URLs. They may appear in magazines, on signs, buses, business cards or just about any object that a user might need information about. A user with a camera phone equipped with the correct reader software can scan the image of the QR code, causing the phone’s browser to launch and load the encoded URL. Below is a demonstration of how it works:

Overall, there are some pretty interesting developments in mobile phone technology, but it also has its limitations, mostly because of the size of the screen – what would be the point of watching high-quality video on a tiny screen? And because of the small buttons on the keypad, most people prefer to type as little as possible, which definitely puts some limitations on internet browsing. It appears that QR codes and the iPhone are addressing this problem by creating technology that makes browsing less troublesome for the mobile phone user.
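On the generation side, a QR code like the ones James showed can be produced with the open-source ZXing library. This is only a rough sketch – I’m going from ZXing’s documented Java API (the core and javase modules), not from anything in the presentation, and the URL is a placeholder:

```
import com.google.zxing.BarcodeFormat;
import com.google.zxing.client.j2se.MatrixToImageWriter;
import com.google.zxing.common.BitMatrix;
import com.google.zxing.qrcode.QRCodeWriter;

import java.nio.file.Path;

// Encode a URL into a QR code image; a phone's reader software would decode
// the image and hand the URL to the browser.
public class MakeQr {
    public static void main(String[] args) throws Exception {
        String url = "http://www.example.com";   // whatever you want the phone to open
        BitMatrix matrix = new QRCodeWriter().encode(url, BarcodeFormat.QR_CODE, 300, 300);
        MatrixToImageWriter.writeToPath(matrix, "PNG", Path.of("qr.png"));
        System.out.println("Wrote qr.png - point a camera phone with reader software at it.");
    }
}
```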

January 31, 2008 at 2:23 pm

Flashy Flash

Last week in our multimedia pioneering class, we had a guest speaker – Simon Conlin, an interactive strategist. Mr. Conlin is involved with Flash in the Can and Flash in TO, and he works as a consultant who dreams up innovative interactive initiatives spanning branding, traditional media, the web, mobile, online video/audio content, and new media events.

Mr. Conlin showed us what a number of multimedia pioneers are up to. The most impressive was the work of Zach Booth Simpson, an engineer, artist and molecular biologist! Below is an example of one of his exhibitions:

What impressed me most about Zach Simpson is that he’s combined his knowledge of multiple disciplines to illustrate scientific principles. There’s great potential for interactive multimedia as a teaching tool, and these tools could be extremely valuable to people like yours truly, who learn better from the “hands on” approach.

Mr. Conlin also discussed a German company called MESO, who specialize in complex web applications, multi-sensory environments, on-air design, and interactive stage sets. For example, they built interactive stage sets for George Michael’s Live 25 tour, where the lights would move according to the audio and a position tracker monitored his movements.
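I know nothing about MESO’s actual rig, but the core of audio-reactive lighting is just mapping loudness to intensity. A toy sketch with invented sample values; a real system would read the live mix and send the result out over a lighting protocol such as DMX, neither of which is shown:

```
// Toy version of audio-reactive lighting: compute the loudness (RMS) of a
// chunk of audio samples and map it to a light intensity from 0 to 255.
public class AudioReactiveLight {
    static int lightLevel(double[] samples) {
        double sumSquares = 0;
        for (double s : samples) sumSquares += s * s;
        double rms = Math.sqrt(sumSquares / samples.length);   // 0.0 (silence) to 1.0 (full scale)
        return (int) Math.min(255, rms * 255);
    }

    public static void main(String[] args) {
        double[] quietVerse = {0.05, -0.04, 0.06, -0.05};
        double[] loudChorus = {0.8, -0.9, 0.85, -0.7};
        System.out.println("verse light level:  " + lightLevel(quietVerse));
        System.out.println("chorus light level: " + lightLevel(loudChorus));
    }
}
```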

Mr. Conlin provided our class with a list of links to websites and videos of various multimedia pioneers and their projects. One of these innovations is iBar, an interactive surface. It has integrated video projectors which project any content onto the milky bar surface, and a tracking system which detects all objects touching the surface. Users can interact with the projected content and move things around with their fingers by touching the surface. This is very similar to the interactive surfaces we saw at Gesture Tek, as discussed in my previous post.

Probably the most lasting impression I took from Mr. Conlin’s visit is that the most important thing in interactive multimedia pioneering is teamwork. Very few people can come up with this stuff and run with it all on their own; pretty much all the companies and projects we looked at started with small teams. I have definitely noticed that teamwork is important as we develop our client projects. In my team, we have five people, all with different skill sets, and it’s great to learn from each other’s ideas and experience, and also for each of us to be able to contribute something unique to the group.

Following Mr. Conlin’s discussion, we also got to see a presentation from our in-house multimedia pioneering specialist, Dan Zen. He gave a very animated presentation about Focuso – the art of intentionally shooting out of focus. This is a good example of how a relatively simple concept (shooting out of focus) can be made exciting. What is really nice about it is that anyone can do it – most of the impressive interactive multimedia we’ve seen thus far has involved expensive equipment and elaborate setups, but this is simple, inexpensive and readily accessible. Furthermore, it encourages you to look at the world a little bit differently, perhaps to find the beauty that lies in a crack in the wall.

FYI: Simon is doing a networking bash on Thursday. http://www.flux.to – I’d love to go to this, but alas, I am in Calgary!

December 12, 2007 at 5:45 am

Gesture Technology

Imagine being able to control your computer or other household appliances with a wave of your hand…well, that technology is already here, and if you didn’t realize it, it’s probably just because it isn’t widespread yet.

A couple of weeks ago my Interactive Multimedia class at Sheridan College went on a field trip to Gesture Tek, a pioneering company in the field of gesture technology. Gesture Tek was co-founded in 1984 by Vincent John Vincent, who was a travelling musician/performer at the time and used cameras as interfaces to generate a multimedia experience for his audience. He would play virtual instruments on a big screen as part of his performance. His first prototypes were made on my personal favourite computer of the 1980s – the Amiga!

Over the years, Gesture Tek have expanded their product line to include the following:

Gestpoint – surfaces where you can point at and control images with simple hand movements, which are detected by a camera and then processed by a computer in much the same way as mouse input (a rough sketch of that last step follows the photo below). Similar technology has been used by Microsoft in their new product, Microsoft Surface.

Multimedia Pioneering guru Dan Zen using Gestpoint technology – photo taken by Dwight Brown
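Gesture Tek’s trackers are proprietary, but the final “acts like a mouse” step can be sketched with nothing more than Java’s built-in Robot class. The computer-vision part that actually finds the hand is their secret sauce and is not shown; the coordinates below are hard-coded stand-ins for whatever the camera would report:

```
import java.awt.AWTException;
import java.awt.Dimension;
import java.awt.Robot;
import java.awt.Toolkit;

// Given a hand position reported by a camera tracker as normalized (0.0-1.0)
// coordinates, move the system cursor to the matching screen pixel.
public class HandCursor {
    public static void main(String[] args) throws AWTException, InterruptedException {
        Robot robot = new Robot();
        Dimension screen = Toolkit.getDefaultToolkit().getScreenSize();

        // Pretend the tracker reported these successive hand positions.
        double[][] handPositions = {{0.2, 0.2}, {0.5, 0.5}, {0.8, 0.3}};

        for (double[] p : handPositions) {
            int x = (int) (p[0] * screen.width);
            int y = (int) (p[1] * screen.height);
            robot.mouseMove(x, y);   // the hand now behaves like the mouse pointer
            Thread.sleep(500);
        }
    }
}
```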

Gesture FX – interactive walls and floors that the user can wave at or step on to make things happen. This technology is useful for marketing and advertising because it engages the user more than a print or TV ad would.
Above is an example of a running race game using Gesture FX: players wiggle their feet, the camera captures their motions, and the computer translates this information so that their animated characters run.

Gesture Health – Gesture Tek has used their technology to help patients with physical disabilities rehabilitate using interactive environments tailored to their specific needs and range of motion.

ScreenXtreme – this was my personal favourite. You stand against a green background facing a camera that captures your image and places it into a virtual environment, and your movements are displayed in real time within that environment. I tried a game similar to Harry Potter where I could shoot sparks from my hands and even fly around – it was hilarious!
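The “place you into a virtual environment” part is classic chroma keying. Here is a toy version of the idea; Gesture Tek’s real compositing is obviously far more sophisticated, and the class and pixel values are just for illustration:

```
import java.awt.image.BufferedImage;

// Toy chroma-key: any pixel in the camera frame that is "green enough" is
// replaced with the corresponding pixel from the virtual background.
public class GreenScreen {
    static BufferedImage composite(BufferedImage camera, BufferedImage background) {
        BufferedImage out = new BufferedImage(camera.getWidth(), camera.getHeight(),
                BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < camera.getHeight(); y++) {
            for (int x = 0; x < camera.getWidth(); x++) {
                int rgb = camera.getRGB(x, y);
                int r = (rgb >> 16) & 0xFF, g = (rgb >> 8) & 0xFF, b = rgb & 0xFF;
                boolean isGreen = g > 100 && g > r * 1.5 && g > b * 1.5;
                out.setRGB(x, y, isGreen ? background.getRGB(x, y) : rgb);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        BufferedImage camera = new BufferedImage(4, 4, BufferedImage.TYPE_INT_RGB);
        BufferedImage virtualSet = new BufferedImage(4, 4, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < 4; y++) {
            for (int x = 0; x < 4; x++) {
                camera.setRGB(x, y, x < 2 ? 0x00CC00 : 0x886644);   // left half green, right half "person"
                virtualSet.setRGB(x, y, 0x3366FF);                  // blue virtual environment
            }
        }
        BufferedImage result = composite(camera, virtualSet);
        System.out.printf("top-left pixel is now 0x%06X (taken from the virtual set)%n",
                result.getRGB(0, 0) & 0xFFFFFF);
    }
}
```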

Gesture technology is often used in museums for interactive displays – at the Boston Children’s Museum, users can pretend to conduct an orchestra using a device that resembles a Wii remote, gesturing as if holding a baton. The faster their movements, the faster the tempo of the music.
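That speed-to-tempo mapping is the whole trick. A toy version – the numbers are invented, not the exhibit’s actual calibration:

```
// Toy version of the conductor mapping: the speed of the baton gesture
// (say, metres per second from the motion sensor) scales the tempo.
public class ConductorTempo {
    static double tempoBpm(double gestureSpeed) {
        double bpm = 40 + 60 * gestureSpeed;       // lazy waving ~ largo, fast waving ~ allegro
        return Math.max(40, Math.min(180, bpm));   // clamp to a playable range
    }

    public static void main(String[] args) {
        for (double speed : new double[] {0.2, 1.0, 2.5}) {
            System.out.printf("gesture speed %.1f m/s -> %.0f BPM%n", speed, tempoBpm(speed));
        }
    }
}
```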

Overall, the trip to Gesture Tek was really interesting and super fun. I really think the clubs in Toronto should start incorporating the interactive walls and floors – it would make for a really great dance party! I guess the only downside to this technology is that it is not really affordable for the average person, but since Gesture Tek is moving into the mobile phone realm now, I expect it will become widely accessible in the very near future.

October 23, 2007 at 5:12 pm

Visualization Design Institute

Last Thursday my classmates visited the Sheridan Visualization Design Institute. Unfortunately, I missed this field trip because I was playing with my band, Ohbijou, at the Pop Montreal International Music Festival. However, the rock star life has not excused me from my duties as a student, so below is a summary of my research on the topics of immersive theatres where multiple users control the outcomes, and on face recognition technology.

At the Visualization Design Institute, one of the projects they have developed is Collision Investigation: Skid Marks, a series of animations and simulations designed to let police officers practice determining the speed of a vehicle based on four different kinds of skid marks.
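The physics those simulations are built on is the standard skid-to-stop estimate: speed = sqrt(2 × drag factor × g × skid length). A quick sketch with typical textbook drag factors, not the Institute’s actual training values:

```
// Textbook skid-to-stop estimate used in collision reconstruction:
// speed = sqrt(2 * mu * g * d), where mu is the road's drag factor and
// d is the skid length in metres.
public class SkidSpeed {
    static double speedKmh(double skidMetres, double dragFactor) {
        double g = 9.81;                                     // m/s^2
        double metresPerSecond = Math.sqrt(2 * dragFactor * g * skidMetres);
        return metresPerSecond * 3.6;                        // convert to km/h
    }

    public static void main(String[] args) {
        System.out.printf("25 m skid on dry asphalt (mu=0.7): about %.0f km/h%n", speedKmh(25, 0.7));
        System.out.printf("25 m skid on wet asphalt (mu=0.4): about %.0f km/h%n", speedKmh(25, 0.4));
    }
}
```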

Science museums around the world are embracing immersion theatres as an interactive means of teaching science. One example is at the Miami Science Museum, where students learn about human biology using robots and touch screens, competing to save their team members through a series of virus-fighting games.

One of the first immersion theatres was developed in the Electronic Visualization Laboratory at the University of Illinois at Chicago in 1992. It is known as the CAVE (Cave Automatic Virtual Environment) and uses projectors pointed at some or all of the walls of a cube. The user wears special glasses to see the 3D images projected on the walls, and their movements are tracked by a computer which adjusts the projected images accordingly. Even the audio comes from speakers aimed in different directions to generate a total 3D experience.

Immersion technology is often used to create sports simulation games, in which multiple users can compete against one another and a virtual ball responds to their movements. Some of these interactive environments are so user-friendly that even animals can use them!

Looking towards the future, another application being researched at the Visualization Design Institute is the Facial Animation Communication Engine. It is a real-time tracking system which extracts a set of facial animation control parameters from video input of a human face. This application could be extremely useful for security purposes, but it raises issues of privacy and mistaken identity. It would need to be almost error-proof in order to hold any value in terms of the law. However, it is already in use at XID Technologies in Singapore, controlling access for 6,000 blue-collar workers, in and out of the building, 24/7. I guess no one there can send in a substitute to do their day’s work for them!

October 10, 2007 at 4:32 am

Interface me

Hello! My name is Anissa Hart, and I am currently studying Interactive Multimedia at Sheridan College. This is my blog. I will use it to chronicle my adventures in multimedia pioneering.

September 13, 2007 at 9:28 pm

