Thursday, December 2, 2010

Evolution of the Human Processor

More and more frequently, I hear people talking about the importance of processing huge datasets. These grand ideas often push the limits of technology. I can understand the urge to pursue them, but I feel that the mentality is misguided. Mother nature worked out a long time ago that processing data in this way wastes precious neurons. Like us, nature faces a limitation of resources, both in storage capacity and processing power. I've heard (admittedly, from a dubious source) that the human brain has a capacity of about 4 terabytes. This may sound like a lot, but it is not when you consider the storage requirement for all the video and audio over your whole lifetime. So how does the human mind deal with this flood of information?

By prioritising knowledge. Think about the structure and operation of the human eye. We have central vision, with which we continuously shift focus between things that we consciously or subconsciously decide are important. We also have peripheral vision, which allows us to detect important information within our current view. Between these two extremes, there is a gradual transition from very high resolution to very low resolution.

What we don't have is a uniform grid of retinal receptors that processes data from all directions equally. Unlike a bug, we have evolved to assign attention to a very small subset of the information that surrounds us, and then to seek out further detail.

Over millions of years of evolution, nature has determined that not all information should be processed equally. It does this by extracting a large amount of information from a small field of view (think hunting), and only processing small amounts from our peripheral vision (sudden movements).

So how can we apply these observations to real-world problems?

This philosophy should be applied to any high-volume data processing scenario. Just remember a few concepts:

1) Devise rules for deciding what characterises valuable information.
2) Formulate efficient methods to extract that valuable information.
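The two concepts above can be sketched as a toy "foveal" filter: run a cheap peripheral-vision test over everything, and spend the expensive central-vision processing only on what passes. The rule, the field names, and the thresholds below are my own invented illustration, not a specific algorithm:

```python
# Toy "foveal" filter: a cheap test on every record decides which
# few records receive expensive, detailed processing.

def looks_important(record):
    # Cheap "peripheral vision" rule: flag large, sudden changes.
    return abs(record["delta"]) > 5

def expensive_analysis(record):
    # Stand-in for costly "central vision" processing.
    return {"id": record["id"], "score": record["delta"] ** 2}

def process_stream(records):
    results = []
    for record in records:
        if looks_important(record):                  # O(1) glance at everything
            results.append(expensive_analysis(record))  # focus only here
    return results

stream = [
    {"id": 1, "delta": 0.3},
    {"id": 2, "delta": 9.0},   # a "sudden movement" worth a closer look
    {"id": 3, "delta": -1.2},
    {"id": 4, "delta": -7.5},  # another candidate
]
print(process_stream(stream))  # only records 2 and 4 are fully processed
```

The point is that the cheap rule runs over the whole stream, while the expensive step only touches the small subset that matters.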

In my next post I will discuss techniques for improving algorithm efficiency.

James.

Tuesday, November 16, 2010

Displaced REST Map Service in Flex Application

The Scenario:
Recently, I've been doing some development with the ArcGIS API for Flex and ESRI's open-source "Flex Viewer" application. The application includes several widgets which work out-of-the-box. After configuring an application with the basic widgets, I got to the point where I needed extra functionality. In this particular situation, I wanted to be able to load external services (e.g. ArcGIS REST or WMS) at runtime. I decided that the most appropriate approach would be to extend the "LayerList" widget.

The Problem:
The modification appeared to be working fine on the development environment. I could load several layers, both REST and WMS, which displayed beautifully.
However, when the Flex app was migrated to the production system, I started to notice some issues. I fired up the application and loaded an external REST service. It seemed to be working, but when I maximised the window, the data no longer aligned with the basemap. At first I thought that my code had modified the spatial reference for the layer.

The Solution:
After some investigation I noticed that the top-left corner of the image was in the correct location, and that the returned image covered the correct extent. It turned out that the image I was requesting was larger than the maximum image size configured for this service. (The default for ArcGIS REST services is 2048x2048 pixels.)

The solution is to modify the service configuration file. On Windows servers this lives at: C:\arcgis\server\user\cfg\service_directory\service_name.cfg
Simply update the values for MaxImageHeight and MaxImageWidth. I set both of mine to 3072, like so:
<ServerObjectConfiguration>
  <Properties>
    ...
    <MaxImageHeight>3072</MaxImageHeight>
    <MaxImageWidth>3072</MaxImageWidth>
    ...
  </Properties>
</ServerObjectConfiguration>


I encountered a similar situation with WMS services, but unlike the REST services, the size-limited (2048x2048) image was positioned correctly.
I assume the WMS services are handled more gracefully because the maximum image size is visible to the client in the GetCapabilities document; ArcGIS REST services do not expose this information.
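For WMS, the limit can indeed be read straight out of the capabilities document (in WMS 1.3.0 it appears as MaxWidth/MaxHeight under the Service element). A minimal sketch, parsing a hand-written, trimmed-down stand-in for a real GetCapabilities response:

```python
import xml.etree.ElementTree as ET

# A trimmed-down, hand-written stand-in for a WMS 1.3.0
# GetCapabilities response (namespaces omitted for brevity).
capabilities = """\
<WMS_Capabilities version="1.3.0">
  <Service>
    <Name>WMS</Name>
    <MaxWidth>2048</MaxWidth>
    <MaxHeight>2048</MaxHeight>
  </Service>
</WMS_Capabilities>
"""

root = ET.fromstring(capabilities)
max_width = int(root.findtext(".//MaxWidth"))
max_height = int(root.findtext(".//MaxHeight"))
print(max_width, max_height)  # 2048 2048
```

A client that reads these values up front can clamp its GetMap requests instead of discovering the limit the hard way.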

More information about ArcGIS Service configuration files can be found here:
http://webhelp.esri.com/arcgisserver/9.3/java/index.htm#cfg_file_service.htm

Thursday, November 4, 2010

Augmented Reality in Commercial Applications

ANZ Bank have recently released an iPhone app for real estate analysis

I have to say, the application looks amazing, one of the best I've used. The user can search for houses on the market by location, price range, etc. Then the results can be displayed in three different ways:
1) A list (boring)
2) As points on a map (good)
3) In an augmented reality view (great)

The map view is cool, but is becoming expected within these kinds of applications. The thing that really impressed me was the AR view. Augmented reality uses the phone's location sensors (GPS + compass + accelerometer) to add information to the camera display. In this implementation, ANZ have presented the real-estate information to the user in a format that they can relate to (more so than a map). It is a bit of a gimmick, mainly because it relies heavily on the compass sensor at this stage, but impressive nonetheless.

What would be really cool would be to see some live GPS data streaming to the display. For example, someone waiting for their bus could open the AR display and see that they have missed it by a few minutes. Or imagine interacting with nearby friends: Twitter would become more like a phone call and less like a messaging service.

It's great stuff and I am looking forward to seeing other implementations. I've already used the Layar app to overlay Wikipedia articles and other sources on the camera display. Let me know if you know of any other good ones!

Wednesday, November 3, 2010

Flex Sandbox Security Error #2048

I came across an annoying limitation of Flex today. After deploying my ArcGIS Flex application to an IIS server, I was having trouble accessing a Web Map Service (WMS). I could run the application locally and access the remote services, but deploying the application resulted in a Sandbox Security Error #2048:

"SecurityError: Error #2048: Security sandbox violation: http://mymachine/myflexapp/index.swf cannot load data from http://myserver.mydomain.com/arcgis/rest/services/myMapService/1?f=json..."


After perusing various ActionScript boards, I was finally directed to the solution in the ArcGIS Server 9.3 online help:

http://resources.esri.com/help/9.3/arcgisserver/apis/flex/help/index.html

"To access data from a different server than the one hosting your Flex application, the remote server needs to have a cross-domain file in the root directory."

As I understand it, this helps to prevent cross-site request forgery. Fair enough, but it's still annoying that Flex has this requirement, while many other client apps do not. My Flex application accesses the WMS service just like any other. The difference is this:
Web applications operate in a common environment, the browser, which shares resources such as cookies, and that sharing is a security concern.

Short of publishing the application as a standalone AIR app, I must rely on configuring the WMS server to allow access. It seems that this goes against some of the objectives of the OGC (seamless interoperability). I suppose JavaScript applications manage this in a similar way...?

Initially I was concerned about opening security holes, so I checked some well-known websites for examples on how to configure the file:
http://www.youtube.com/crossdomain.xml
http://www.google.com/crossdomain.xml

I was able to resolve the problem by asking the WMS Server administrators to set up the crossdomain.xml file at the root level of their website as described on the ESRI site. An example crossdomain.xml might be:

<?xml version="1.0" ?>
<cross-domain-policy>
  <allow-access-from domain="*" />
  <site-control permitted-cross-domain-policies="all" />
  <allow-http-request-headers-from domain="*" headers="*" />
</cross-domain-policy>


This may not always be possible, so an alternative approach is to configure a reflector on my own web server. Basically, this forwards requests to the real destination, but to the web app they appear to come from my own server. Obviously, copyright issues need to be considered when setting up something like this. The benefit of having a reflector is that you only need to configure a single crossdomain.xml file, on your own web server.
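The core of such a reflector is just rewriting a local path onto the remote service URL before forwarding the request. A minimal sketch of that rewriting step (the host name and path prefix are placeholders, not a real deployment):

```python
from urllib.parse import urljoin

# Map a local "reflector" path onto the real remote service endpoint.
# The web app only ever requests /reflector/... from its own origin,
# so no crossdomain.xml is needed on the remote server.

REMOTE_BASE = "http://remote.example.com/"  # placeholder remote host
LOCAL_PREFIX = "/reflector/"                # placeholder local prefix

def rewrite(local_path_and_query):
    """Translate a local reflector URL into the remote URL."""
    if not local_path_and_query.startswith(LOCAL_PREFIX):
        raise ValueError("not a reflector path")
    remainder = local_path_and_query[len(LOCAL_PREFIX):]
    return urljoin(REMOTE_BASE, remainder)

print(rewrite("/reflector/wms?REQUEST=GetMap&WIDTH=512"))
```

In a real deployment, the server-side handler would fetch the rewritten URL (e.g. with urllib) and stream the response body and content type back to the client.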

Monday, September 20, 2010

Amateur Radio and Python GDAL

I have installed the GDAL Python bindings on my Karmic Ubuntu box and my MacBook. I intend to test this Python library on the Landsat imagery which I have downloaded (Canberra and surrounds, of course). There were a number of sample gdal-python scripts included with the Mac OS X download. I still need to check these out.
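For reference while experimenting: GDAL describes a raster's georeferencing with a six-element affine geotransform (the same tuple returned by a dataset's GetGeoTransform()), and the mapping from pixel indices to map coordinates can be written in plain Python. The coefficients below are illustrative, not from a real Landsat scene:

```python
# GDAL-style affine geotransform:
#   (origin_x, pixel_width, row_rotation,
#    origin_y, col_rotation, pixel_height)
# For a north-up image the rotation terms are zero and
# pixel_height is negative (y decreases as row index increases).

def pixel_to_map(gt, col, row):
    """Convert (col, row) pixel indices to map coordinates."""
    x = gt[0] + col * gt[1] + row * gt[2]
    y = gt[3] + col * gt[4] + row * gt[5]
    return x, y

# Illustrative 30 m north-up transform (made-up origin):
gt = (600000.0, 30.0, 0.0, 6100000.0, 0.0, -30.0)

print(pixel_to_map(gt, 0, 0))     # top-left corner of the image
print(pixel_to_map(gt, 100, 50))  # 100 columns east, 50 rows south
```

Keeping this mapping in mind makes the sample gdal-python scripts much easier to follow.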

After a number of Amateur Radio websites caught my attention, I came across the Linux software "GPredict". This software provides a mapping interface which displays the locations of hundreds of satellites. It also includes a window for configuring a radio device. I shall investigate further.