neWMW

Posts Tagged ‘Open Source’

Vers Geperst: Fresh juice for the brains


Last week I attended the Vers Geperst meeting at Club 11 to tell the audience about the Masters of Media weblog project. Modeled on the Pecha Kucha presentation format, each presenter had only 11 slides of 12 seconds each to propagate their views.

[Image: fidel twan]

Besides the Masters of Media blog, some other interesting ideas were pitched. What about the already famous ‘Whatever’ button? Are you tired of having to agree, time and again, to useless legal information? Just install the Whatever button Firefox extension and you don’t have to worry about all that nonsense anymore! Although I had already installed it before the presentation, Michael Stevenson’s talk and imagery still gave me stomach cramps from laughing.

Another idea came from the guys behind ToxTox TV. According to its creators, ‘ToxTox is the next generation internet television platform. It allows you to watch video content from your couch, on your tv, using only open software.’ An ambitious idea with lots of opportunities, and I’m anxious to see how it works on my television.

The Open-Search project, presented by Erik Borra, focuses on privacy and the role of search engines. The project steers away from the centralized power of the corporate search engine and provides ‘an exemplary peer to peer, collaborative event, whereby people mutually form a search engine without the intervention of central servers or a central actor.’ Definitely worth checking out.

The last presenter I want to mention is Anne Helmond, who is responsible for the lovely Fidel Castro-esque picture of me in this post. She presented her photography, as well as a project she did on drapes and windows. More photographs of the meeting are available on Anne’s Flickr account.


Google Geoday Benelux 2007 Report


When the invitation for the Benelux Google Geoday 2007 landed in my mailbox, shaped like the Google Maps marker that has risen to fame in the past years, it promised an interesting day at Amsterdam’s EXPO XXI this Thursday: presentations by Bernard Seefeld (Google/Endoxon), Brandon Badger (Google) and Remco Kouwenhoven (Nederkaart) in the morning, and workshops on Google Earth and the Google Maps API in the afternoon. A big thanks to the people at Generation Next, who were responsible for my ticket in the first place.

Google’s geo development (Google Earth and Google Maps) has taken big steps in the past years: with the coming of Earth and Maps there is a definite focus on adding layer after layer of information to the globe. Like graffiti on a wall, everyone can apply meanings to the maps made available by the Google Geo team. As Lev Manovich noted in The Poetics of Augmented Space: Learning from Prada: the nineties were about the virtual, the 2000s will probably be about the physical. Not the infinite internet, but the finite space of the physical land. And it seems that Google eagerly agrees with this prophecy. Below is my account of the day.

Keynote by Bernard Seefeld
In the early days of mapping, cartographers drew dangerous dragons on the parts of the map they did not yet know. This is one of the examples Seefeld takes from early mapping practice, and it stands for a recurring theme: improving the image and filling in the gaps. The early cartographers did not have the information to fill in the holes, so they just drew dragons. Another example given is the map of New Holland, or what is now called Australia. The interesting thing is that the Portuguese probably reached this land first, but the Dutch were the first to draw a map of it and were therefore ‘responsible’ for the discovery of the land (a discovery from a Western perspective, anyway, as the Aborigines were already there).

Seefeld brings the first example, and also the mapping of New Holland (which, as the shape of the land on early maps shows, was not accurate at first, to say the least), into the present. He notices that the Google Geo team faces a similar situation in pasting together the best available satellite imagery to create the globe of Google Earth. Sometimes the information is available, sometimes not, and in that case less detailed imagery has to be used. It is not about drawing dragons anymore, but about improving satellite images.

So now we have an explanation of Google’s base map philosophy: pasting together a globe in a way that perhaps reminds us of the patched-together body of Frankenstein’s monster. Building on top of that base map is the next step, and this is also the core of the Google Geo team’s stated mission: organize the world’s geographic information and make it universally accessible and usable. It is derived from Google’s overall mission statement, which is actually the same minus the word ‘geographic’.

Instead of discovering new land, as in the age of navigation and in the second example given above, the user is now able to discover new information. The emphasis here was on the enhanced content applied to the base: web links, reviews of places, photographs and featured content. Seefeld went as far as to say that the base is nice, but the content makes it great. The base is always the same, just like the physical world; it is the information and meanings applied to physical space that make it what it is. The new idea is that meaning is applied with the use of the virtual, leaving the physical aside. Through the geo applications, new discoveries can even be made in physical space: as an example, Seefeld showed the Google ant, a species that was discovered with the help of geo applications.

But all this information applied to the surface of the Earth requires a way to be searched. This is the territory of the spatial web, which is all about geotagging, KML and more: making the meanings applied to the Earth searchable (a minimal KML example follows below). Nowadays, though, discovering the Earth isn’t as dangerous as it used to be. The dragons are gone; discovery has become a safe practice. Boring? Perhaps. You can always try the navigation option in Google Maps and Earth and follow the directions, even if it asks you to swim across the Atlantic Ocean.
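
For readers who have never seen it: KML is a small XML format that ties a meaning to a coordinate. A minimal placemark looks something like this (the name, description and coordinates are made up for the example, and note that KML lists longitude before latitude):

    <?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://earth.google.com/kml/2.1">
      <Placemark>
        <name>EXPO XXI</name>
        <description>Google Geoday Benelux 2007 was held here.</description>
        <Point>
          <coordinates>4.87,52.39,0</coordinates>
        </Point>
      </Placemark>
    </kml>

Save this as a .kml file, open it in Google Earth, and you have applied a meaning of your own to the globe.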

[Image: dragon map]

What is fascinating is the application of so many meanings to the finite globe with the help of this virtual layer. I asked Seefeld what his views are on potentially conflicting meanings. He emphasized the role of the user and said that it is important to have access to all opinions. Getting everything 100% true is very difficult, but the goal is to fix it again and again, with the help of user opinions, until it is good. This really reminded me of the already infamous Wikipedia wars, although those are more about events, persons, etc., while Google Earth is about space and meaning. As Dorling & Fairbairn say in the chapter Alternative Views from their book Mapping: Ways of Representing the World: ‘Maps have always presented pictures of ‘truth’ and just as many people have many different truths, so there are many maps to be drawn.’

“From API to mashup” by Brandon Badger
The key to presenting all these various views on planet Earth, and to what a website developer can show his or her visitors, is the mashup: using the base map and applying content, meaning, to it. Badger emphasized the essential role of the user, giving us a rather simple and commercial equation: Google’s tools + You = Victory. A more convincing model for the concept of the mashup was that the combination of the two parts makes for something more valuable than just the sum of the parts: 1 + 1 = 2.53542. I guess it is a good thing Time magazine named us persons of the year, but it also keeps us busy supplying content for Web 2.0 applications. When will ‘we’ get too busy supplying content, to the point where we no longer want to? That will probably mark the end of Web 2.0: the death of the user.

Mashups by Remco Kouwenhoven
On his website Nederkaart.nl, Remco Kouwenhoven shows lots of examples of mashups built with the Google Maps API. He showed us some of these on screen, but the one that struck me the most was a map of the air traffic above Schiphol. What it intends to show is the high density of airplanes at Schiphol airport, paired with complaints about the noise.

[Image: Schiphol noise in Google Earth]

This reminded me of a remark by Mark Monmonier in the Dorling & Fairbairn piece I already mentioned above: ‘Cartographic propaganda can be an effective intellectual weapon against an unresponsive, biased, or corrupt bureaucracy.’ These mashups can provide this cartographic propaganda in real time; current issues can be addressed with the help of real-time information gathering. A lot more examples can be seen on Kouwenhoven’s website, and it is definitely worth browsing them and importing some into Google Earth.

The Workshops
After a morning of presentations, the afternoon was reserved for us, the users, to start creating content using the tools supplied by Google, just as Badger pointed out. Although I’m not sure how long these links will stay online, you can check out the small assignments of the workshops at these links: Google Earth workshop and Google Maps API workshop. More technical info is also available through code.google.com. There was one jaw-dropping example in the Google Earth workshop that I didn’t know about: an incredibly detailed 3D city model of Berlin. Definitely a must-see.

[Image: Hauptbahnhof Berlin in 3D]

After spending two hours immersed in the representation of physical space on the screen, the pavement on my way to the train station also had some new meaning applied to it. A strange awareness of how easily meaning can be applied to the physical space we navigate each day, or to the places where we live, while we remain unaware of which meanings have been applied, in the virtual, to the places we call home. What also struck me after this day is Google’s dependence on the user, who is responsible for supplying the content. It makes you think, but for some reason I’m just feeling lucky right now.

Incompatibility in Protocol: E-mail sent from Thunderbird sometimes doesn’t arrive at Hotmail


In my post about incompatibility below, I wasn’t that surprised that some things just don’t work under Linux. But with Windows XP now installed again and with Mozilla Thunderbird for my e-mail, I really didn’t expect anything to be incompatible.

Until I tried to send an e-mail from my student account (in Thunderbird) to my Hotmail.com account. It just didn’t arrive, again and again. This makes you think: mail is supposed to arrive at the receiver’s computer, right? Yet there is a whole business built on that doubt; just think of the confirmation messages for successful deliveries. Did they get it? In this case, I began to doubt whether the mail I had sent to Hotmail addresses over the past month had ever arrived.
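
Part of what feeds that doubt is how mail delivery actually works: when your mail client reports that a message was sent, all that really happened is that the first SMTP server accepted it. You can watch this by speaking SMTP by hand. In the hypothetical session below (server name and addresses are made up), every step returns an OK code, and yet a filter further down the line can still silently delete the message:

    $ telnet mail.example.com 25
    220 mail.example.com ESMTP
    HELO client.example.org
    250 mail.example.com
    MAIL FROM:<me@student.example.nl>
    250 Ok
    RCPT TO:<someone@hotmail.com>
    250 Ok
    DATA
    354 End data with <CR><LF>.<CR><LF>
    Subject: test

    Just a short test message.
    .
    250 Ok: queued
    QUIT
    221 Bye

That final ‘250 Ok: queued’ is as much certainty as the sender ever gets.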

So I started searching the web to find some answers, because I thought a small setting in Thunderbird was going to solve the issue. But it turned out to be something that wasn’t solvable with just the click of a mouse.

Check out this post by Daifne from the MozillaZine forums:

Problem:
Intermittently, can’t send messages to Hotmail.

Solution:
Insert HTML formatting in the message and vary the message contents.

Factors:
It looks like a Bayesian e-mail filter between Bell Sympatico and Hotmail is being used to automatically delete e-mail messages. They DO NOT show up in the Junk E-mail folder on the Hotmail account.

Bayesian filters use a technology that weighs a number of factors to determine if a message will be sent. +1 represents likelihood of success, -1 represents likelihood of failure. If you score low in too many categories, the e-mail will be deleted.

The discussion this post sparked on the MozillaZine forums of course took the form of an angry mob trying to burn down Bill Gates’s house: “This is only an issue when sending to Hotmail accounts and that is exactly what Microsoft is trying to get you to do here. Are you going to fall for their fraudulent business practices?” But where does the cause of incompatibility lie? With the sender or the receiver, or perhaps the signal itself? With the willingness to make compatibility possible? Microsoft’s alleged monopoly position does not speak in the company’s favor. But the ideal of free software, on the other hand, imposes expectations on companies that employ thousands of people, expectations that cannot always be met.

But back to the solution for my problem. The suggested fix sounds a bit strange: if the content of your message is varied enough, it does get through. But if you just want to test your account with a small message, chances are it won’t come through.

Luckily I have a Hotmail/Windows Live account in Thunderbird, and I now send all my e-mail through the localhost Hotmail SMTP account that is created by the Webmail extension. It works, but it is a workaround and not a shining example of compatibility. With thanks to this post from Ambiguity on the DI Forums board. More info is also available on this page by Ian Gregory, called The Black Hole Called Hotmail. Pictures from Wikipedia.

Kubuntu Experiences: Cisco 350 and the grasp of Windows’ compatibility


Yes! I finally have a laptop on which I’m typing this new blog post: the beautifully small, although not very new, ThinkPad X30. Exactly what I was looking for, and it came pre-installed with Kubuntu 6.10, which I had wanted to try out for quite some time. Below is an overview of my experiences with the open source operating system. And to spoil the ending: why I had to go back to Windows XP.

The Wow Effect
When I first booted the laptop and saw Kubuntu with the KDE desktop for the first time, I had a ‘wow’ experience Vista couldn’t top. Not really because it looked so good, although it actually did, but because this wasn’t the effort of thousands of paid employees; it was the effort of the masses. Every working connection, every click I made, was made possible by people who wanted you to click and connect for free: people working together to create a system that is actually free. It is the effect of the GNU Manifesto, the call for the sharing of software rather than keeping it under (distribution) control.

After this first experience I had to change my way of thinking about dealing with an operating system. So far my only experiences were with DOS in the early days and of course Windows, with a little bit of MacOS mixed in. After some clicking and searching the web (the internet worked ‘out of the box’), I learned how to install a program. It felt like I had to learn how to walk again. I felt, well, stupid really. But after reading the Kubuntu documentation I learned how to use Adept to install packages and also manage repositories (the command-line equivalent is sketched below).
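
For fellow newcomers: Adept is a graphical front end to the APT package system that Kubuntu uses, so what it does corresponds roughly to these two commands (the package name is just an example):

    sudo apt-get update            # refresh the package lists from the repositories
    sudo apt-get install amarok    # download and install a package plus its dependencies

The repositories themselves are listed in /etc/apt/sources.list, which is the file that Adept’s repository manager edits for you.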

Next up was how to play the various media file types. Since most are protected formats, these don’t come supplied with players such as AmaroK, but the Seveas package turned out to be a lifesaver. It supplies every codec I need to play media types like DVD and lots more. But since this wasn’t supplied through Adept, I had to look up how to do an install from a .tar package. It took me some time to figure out, but let’s say it comes down to three simple steps: ./configure, make, make install (written out in full below). So far so good.
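
Written out in full, with a made-up archive name, the ritual looks like this:

    tar xzvf some-program-1.0.tar.gz   # unpack the source archive
    cd some-program-1.0
    ./configure                        # check dependencies and generate the Makefile
    make                               # compile the program
    sudo make install                  # copy the result into place

Nearly every source package distributed this way follows these same steps.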

Everything was working fine. I also found a very addictive game to play under Kubuntu, called Battle for Wesnoth, and I really had the feeling that I was part of something special, part of a group of users who are aware of what software in the digital age is really about: sharing.

The Grasp
But then the main problem came up. The bad guy. The pure evil. The one thing that can beat all goodwill of the open source community: Incompatibility.

Let me elaborate. At home my wireless connection worked fine with the Cisco Aironet 350 Mini PCI Wi-Fi card that comes installed. Although it is an older model, 802.11b rather than g, it works great. Until I went to the University of Amsterdam for the first time, because to access the UvA network under Linux/Kubuntu I had to use WPA encryption. For two days I tried to connect, upgrade, install and check again. I tried it all: HostAP, ndiswrapper, wlan-ng, wpa_supplicant, KNetworkManager and more (a typical attempt is sketched below). But they all failed. They were my Kubuntu Waterloo. And deep in my heart I really, really, really wanted it to work, because I wanted to live the completely open source lifestyle.
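
For the curious, a typical wpa_supplicant attempt looked something like this. The SSID, passphrase, interface name and driver backend are all made up for the sketch, and I’m assuming a simple WPA-PSK network here; a university network using 802.1X would need a more elaborate configuration:

    # generate a network block containing the pre-shared key
    wpa_passphrase "some-network" "some-passphrase" > wpa.conf
    # try to associate, naming the wireless interface and a driver backend
    sudo wpa_supplicant -i eth1 -c wpa.conf -D wext

No matter which backend I tried, it could never have worked: the card’s firmware simply didn’t speak WPA.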

But I needed the internet connection at the UvA. Compatibility overruled personal ethics. As I found out, the only way for my Cisco 350 to connect via WPA encryption was to get a firmware upgrade… which is only available through… Windows. So I’m very sorry if you were reading this post hoping I would say that it is possible to get WPA working on your card. I’m sorry…

This incompatibility turned out to be the very struggle that the open source community is fighting. Every time the corporate software distributors copyright a new portion of their software, the open source community has to find a way to make their operating system compatible with those standards. If the protocol does not match, there is no communication. Copyrighted compatibility is a serious issue, and it caused me to leave my newfound glory and go back to that operating system everyone uses. The system that conceals the abilities of the open source community. The code curtain.

Meanwhile… back in XP
As I’m typing this, my eyes can’t escape the returning presence of the XP Start button. The much-criticized five letters didn’t return in Vista, but they’re not so bad. The biggest change in my use of Windows is that the extra programs I use are now almost all open source: VLC media player, GIMP, Firefox, Thunderbird, OpenOffice. The best thing I got out of this is user awareness. We have to be aware of the limited nature of (corporate) controlled distribution of software. It is good to see the alternatives, the margins that fight the giants. And in turn the margins influence those giants. Those see-through menus in Vista look awkwardly familiar, don’t they?