Synching the 2.0 web

Advocates of web 2.0 suggest that we can access nearly all of the services we need from web suppliers. We can edit our documents, store our photos or company data, and run our applications. It sounds great - but what happens when the web is unavailable? Over the last few years I have travelled quite a bit and I've often found myself in places with no wifi connectivity - or at least none at a price I'm willing to pay. So I value having a copy of my data on my laptop, so that I can carry on working.

I've put forward this argument at a couple of events recently. At an excellent session on Web 2.0 and science at the UK e-Science All Hands Meeting, the response was that 3G coverage will soon be sufficient to give us access almost everywhere. The next generation will take it for granted, the way they take GSM talk coverage for granted already. I have to admit that this scenario seems quite likely, although of course there are still places that don't even have talk coverage.

Nevertheless, there are still problems. Cloud services are far from 100% reliable, at least not yet. The word from companies using cloud computing for their business is that we should expect failure and deploy applications on multiple providers. I believe we should do the same with our data. In addition to guarding against technical failures, it would protect us from vendors who go out of business or close down a service. It would also prevent vendors from taking advantage of "lock-in" to increase their prices.

So, we need systems that can replicate data from one data store to another. Fortunately, we know how to do this, whether via Grid or via P2P technologies. Unfortunately, we seem no nearer achieving standards for interoperability, so we will need to build systems that interface to the variety of proprietary systems out there.
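To make the idea concrete, here is a minimal sketch of what such replication might look like, assuming each record carries a stable identifier and a last-modified timestamp. The Record, DataStore and InMemoryStore names are hypothetical illustrations, standing in for the thin adapters we would have to write around each proprietary store; they are not any vendor's actual API.

```python
# A minimal sketch of store-to-store replication. Assumes each record
# carries a stable identifier and a modification timestamp; the classes
# below are illustrative stand-ins, not a real vendor interface.
from dataclasses import dataclass
from typing import Dict, Iterable, Protocol


@dataclass
class Record:
    uid: str          # stable identifier that survives copying between stores
    modified: float   # last-modified timestamp (seconds since the epoch)
    payload: bytes    # the data itself, treated as opaque here


class DataStore(Protocol):
    def list_records(self) -> Iterable[Record]: ...
    def put(self, record: Record) -> None: ...


class InMemoryStore:
    """Stands in for a proprietary store wrapped in a thin adapter."""

    def __init__(self) -> None:
        self._records: Dict[str, Record] = {}

    def list_records(self) -> Iterable[Record]:
        return list(self._records.values())

    def put(self, record: Record) -> None:
        self._records[record.uid] = record


def replicate(source: DataStore, target: DataStore) -> None:
    """Copy records from source to target; the newer version of each wins."""
    existing = {r.uid: r for r in target.list_records()}
    for record in source.list_records():
        current = existing.get(record.uid)
        if current is None or record.modified > current.modified:
            target.put(record)
```

The point of the sketch is the shape, not the code: once every store can be wrapped in a small adapter exposing "list" and "put" over identifiable records, replication across providers becomes a routine job rather than a bespoke integration project.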

Ideally, the data should be self-describing, so that two copies can be synchronised by a different application from the one that actually created them. I'm put in mind of the apparently simple problem of syncing my calendar between my PDA and my PC. When I migrated my PC calendar to a new application, the next synchronisation created two copies of each event. You'd have thought that each event would carry a persistent UID (the iCalendar format does define one) so that multiple copies could easily be reconciled, but evidently it wasn't preserved across the migration. Let's make stable identifiers a ground rule for storing data in the cloud.
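If the UIDs did survive export and import, reconciling two copies of a calendar would be little more than keying events on that identifier. The sketch below is only a toy: a naive line-based reading of VEVENT blocks that ignores line folding, escaping and time zones, where a real tool would use a proper iCalendar library. The events_by_uid and merge helpers are mine, introduced purely for illustration.

```python
# Toy illustration of UID-based reconciliation of two calendar copies.
# Not a real iCalendar parser: line folding, escaping and time zones
# (RFC 5545) are deliberately ignored.
from typing import Dict, List


def events_by_uid(ics_text: str) -> Dict[str, str]:
    """Extract VEVENT blocks and key them by their UID property."""
    events: Dict[str, str] = {}
    block: List[str] = []
    inside = False
    for line in ics_text.splitlines():
        if line.startswith("BEGIN:VEVENT"):
            inside, block = True, [line]
        elif line.startswith("END:VEVENT") and inside:
            block.append(line)
            uid = next((l.split(":", 1)[1] for l in block
                        if l.startswith("UID") and ":" in l), None)
            if uid is not None:
                events[uid] = "\n".join(block)
            inside = False
        elif inside:
            block.append(line)
    return events


def merge(copy_a: str, copy_b: str) -> Dict[str, str]:
    """Merge two copies of a calendar: events sharing a UID are one event."""
    merged = events_by_uid(copy_a)
    merged.update(events_by_uid(copy_b))  # same UID means same event, no duplicate
    return merged
```

With stable identifiers, the merge is a dictionary update; without them, the best a sync tool can do is guess by comparing titles and dates, which is exactly how my duplicated events came about.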

I'll leave the last word to a panellist at the Cloud Computing event in Newcastle. When I explained that I wanted my data on my laptop so that I could work on the plane, he suggested that perhaps I'd be better off using the time to read a good book.
