Boy, some amazing comments in the talkbacks to E3G. I am just overwhelmed with some of the ideas, and I am trying to fit them somehow into the framework in a way that enhances the picture before I move on to the next part of my E3G thesis. I realized as I was explaining the blog entry to some of you that I was really describing a polarization of each layer - from a "one size fits all" model of E2G into a "fit for need" hybrid model in E3G. In a sense we moved from this graph
on every component layer.
Let me illustrate by looking at the data layer. Instead of putting all the data in a single relational store in house (too expensive for exceptions, and too slow for Google-like fast access to large collections), we will see a shift to remote storage for exceptions and shared data, a la S3, combined with fast local access to an in-memory data warehouse, which is still cheaper than relational (when you account for all the DBA human operational cost). Similarly, when you look at every layer you see the same polarization:
- Cloud computing in conjunction with high-availability local appliances
- As I just explained, cheap remote stores next to in-memory, high-performance dedicated stores
- Open XML business event networks routing into an internal highly-optimized (as in XML bypass) bus for object routing and handling in the network itself
- Desktop with a smart browsing client next to a smart mobile device with personal awareness and intelligent routing
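To make the "fit for need" data layer concrete, here is a minimal sketch of a store that routes records to either a fast local in-memory tier or a cheap S3-like remote tier. All the class names (`InMemoryStore`, `RemoteBlobStore`, `HybridStore`) and the size-based routing rule are hypothetical illustrations, not any real product's API:

```python
class InMemoryStore:
    """Fast local tier for hot, frequently accessed data."""
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data.get(key)

class RemoteBlobStore:
    """Stand-in for an S3-like remote tier: cheap, slower, durable."""
    def __init__(self):
        self._blobs = {}
    def put(self, key, value):
        self._blobs[key] = value
    def get(self, key):
        return self._blobs.get(key)

class HybridStore:
    """'Fit for need' routing: small hot records stay in memory,
    large or exceptional records go to the remote tier."""
    def __init__(self, hot_size_limit=1024):
        self.hot_size_limit = hot_size_limit
        self.local = InMemoryStore()
        self.remote = RemoteBlobStore()
    def put(self, key, value):
        if len(value) <= self.hot_size_limit:
            self.local.put(key, value)
        else:
            self.remote.put(key, value)
    def get(self, key):
        # Check the fast tier first, fall back to the remote tier.
        return self.local.get(key) or self.remote.get(key)

store = HybridStore(hot_size_limit=16)
store.put("order:1", b"small hot record")   # stays in memory
store.put("archive:1", b"x" * 1000)         # routed to the remote tier
```

The point of the sketch is only the polarization itself: one interface, two very different cost/latency profiles behind it.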
Now, one of the smartest comments started building the next axis of the framework, which is the experiences we see across these components. I just want to quote some of the text from that talkback before I post the full view on the second axis (mostly because I have not formulated it all yet - I am busy on the alt.energy side of the house these days...). Here goes, from Armando -
"...I was picturing your layers on a slide with four blocks, one on top the other, Store, Compute, Messaging, Presentation. There is one block that to me is just crying out to be drawn vertically alongside: People..."
and
"Gen3 moves even more power and control into the hands of end users. Distributed computing is enabled and complete business objects live on the network. Tools like Visual Composer evolve and live up to their full potential allowing the creation of ad-hoc applications without compromising the core platform integrity. The user interface is split completely from the underlying rules and data. Beside new user interfaces I can also imagine the development of specialized enterprise bots that interact autonomously (or semi-autonomously) with the system."
Read the rest of his comment - it's worth the time. I think this is probably an aggregation of the view I have about the second axis. My view currently breaks the second axis of the framework across multiple different experiences: as users end up tackling a much more complex world (a physical world with a high volume of events and less time to process them), the systems are getting more proactive, with more complexity in IT but with a lot more power and resources available.
- In that sense, users are going from data entry to exception handling and co-authoring
- Systems are going from aggregation to harmonization and, more importantly, automation
- As structure is added to information, its ability to route intelligently increases by orders of magnitude, hence the value-add of data moves away from the store and into "right-time routing"
- All of this forces developers to go from transactional programming to event-centric resolution
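The shift from transactional programming to event-centric resolution, with people handling only the exceptions, can be sketched in a toy event bus. Everything here (`EventBus`, the escalation queue, the approval limit) is an illustrative assumption, not a description of any real system:

```python
class EventBus:
    """Toy event-centric runtime: handlers subscribe to event types,
    and failures are routed to a human queue instead of aborting."""
    def __init__(self):
        self.handlers = {}
        self.exception_queue = []  # events awaiting human co-authoring

    def subscribe(self, event_type, handler):
        self.handlers.setdefault(event_type, []).append(handler)

    def publish(self, event_type, payload):
        for handler in self.handlers.get(event_type, []):
            try:
                handler(payload)
            except Exception as exc:
                # Exceptions are routed, not raised: a person resolves them.
                self.exception_queue.append((event_type, payload, str(exc)))

bus = EventBus()

def reserve_stock(order):
    if order["qty"] > 100:
        raise ValueError("quantity exceeds automatic approval limit")
    order["reserved"] = True

bus.subscribe("order.created", reserve_stock)
bus.publish("order.created", {"id": 1, "qty": 5})    # handled automatically
bus.publish("order.created", {"id": 2, "qty": 500})  # escalated to a human
```

The user's job in this sketch is exactly the second-axis claim above: not data entry, but working through `exception_queue`.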
As you can see, this is not as formulated as the previous axis, but you triggered some great questions - so keep shouting.
Shai
Hi Shai,
good stuff. For the "people axis" I think we'll see a completely new fragmentation of skills and roles alongside those blocks, especially in the upper area:
- Low End: People who use services like del.icio.us, Freebase etc. and want direct, "command line" style access to their systems.
- Middle: People who don't want to deal with the inner workings but customize their workflow using things like Visual Composer or Teqlo (http://www.teqlo.com ) that harmonize and aggregate (currently the Excel macro aficionados).
- Upper: The users that use that cool new app/widget from their coworker to get their job done in a better way.
All the necessary tools are in place or just around the corner (shameless self-plug: we (http://www.systemone.at/screencast/eng ) use S3 as storage for larger files, an in-memory triple store on appliances for metadata, and EC2 plus cheap redundant European boxes for various application components like a 3 TB news clipping index; yet no matter whether you use the web, mobile, or soon desktop client, you don't notice that you're in fact dealing with a secure system spanning dozens of locations).
But the most important question was also raised by Armando: what does it mean that we have to start taking "machines" seriously as stakeholders as they become an active part of the whole ecosystem? What does it mean for businesses when a company (http://infoproc.blogspot.com/2005/11/simons-thorp-and-shannon.html ) is successful that on some days is single-handedly responsible for a double-digit percentage of the whole trading volume on some global financial exchanges, and at the same time is proud that no human being makes a buy/sell decision? If it can be done for such a complex system, what could the role in SCM, marketing / demand prediction, etc. be? If you start imagining what BMW would look like if it were run like Renaissance Technologies, and use your model as the foundation, with people intervening at some new and select points, the result is an amazing load of new thoughts.
It's great to read you, best,
Bruno
Posted by: Bruno | May 10, 2007 at 08:55 AM
I have a slightly different take on how to divide up the problem space. I see computing problems dividing into two kinds:
- Continuous Computing
- Ad-hoc Computing
Allow me to explain. Continuous computing is a set of functions that need to be performed on an ongoing basis. This includes data routing, business object routing, search indexing, RFID & sensor tracking, etc. Continuous computing will be spread out over the network (done in the 'dispersed cloud') by computers (chips, storage, etc.) that are more or less autonomous. And this type of computing, although largely invisible to end users except as it impacts them indirectly, will be a large fraction of computing.
Ad-hoc computing, where we look to solve a problem in a smaller time window - an obvious example being pulling up a BI report on the last 3 quarters' revenue data - will require a more concentrated computation experience (or a perceived single computer). This includes search, email, and business applications like SCM, ERP, etc. In this domain, the E3G model would perhaps apply better.
But I do think that in order to come up with the 3G of computing, we need to think of both these classes of computing.
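The two classes can be contrasted in a few lines: continuous computing is a long-lived ingest loop over a stream of readings, while ad-hoc computing is a one-off query over the accumulated state. The names and data shapes below are illustrative assumptions only:

```python
from collections import deque

class ContinuousIndexer:
    """Continuous computing: ingest every sensor event as it arrives,
    keeping a rolling window rather than answering any one question."""
    def __init__(self, window=100):
        self.readings = deque(maxlen=window)
    def ingest(self, reading):
        self.readings.append(reading)

def adhoc_report(indexer, sensor_id):
    """Ad-hoc computing: a one-off BI-style query over the stream,
    needing a concentrated view of the data at a point in time."""
    values = [r["value"] for r in indexer.readings if r["sensor"] == sensor_id]
    return sum(values) / len(values) if values else None

indexer = ContinuousIndexer()
for v in (20.0, 21.0, 22.0):
    indexer.ingest({"sensor": "rfid-dock-3", "value": v})
indexer.ingest({"sensor": "other", "value": 99.0})

print(adhoc_report(indexer, "rfid-dock-3"))  # → 21.0
```

The design split matters because the two halves scale differently: the ingest loop wants to be dispersed across the network, while the report wants a single consistent snapshot.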
My 29 cents.
Posted by: Anshu Sharma | May 11, 2007 at 11:22 AM
I couldn't help but smile when I saw your chart above. A few years ago, I was trying to explain to a standards committee that application boundaries weren't as clear and consistent as they assumed. The diagram I used was very similar to yours, but was actually the top of Bart Simpson's hair. It came to be known as the "Bart Simpson diagram". ;-)
One other possible outgrowth of the "loosely coupled" and "ad-hoc" model is that applications will need to become more resilient. In many programming paradigms, exception handling code encompasses about 60-80% of the logic. Part of the empowerment will dictate that more of this be transparent and handled by the infrastructure and platform, rather than the application.
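Rick's point about pushing exception handling down into the platform can be sketched with a retry decorator: application code stays free of recovery boilerplate, and the wrapper absorbs transient failures. The decorator and all names here are illustrative, not any specific product's API:

```python
import functools
import time

def resilient(retries=3, backoff=0.0):
    """Platform-level retry wrapper: transient failures are retried
    transparently, so the application omits its own recovery logic."""
    def decorate(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            last_error = None
            for attempt in range(retries):
                try:
                    return fn(*args, **kwargs)
                except Exception as exc:
                    last_error = exc
                    time.sleep(backoff * attempt)  # optional delay between tries
            raise last_error  # escalate only after exhausting retries
        return wrapper
    return decorate

calls = {"n": 0}

@resilient(retries=3)
def flaky_service():
    """Simulates a service that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network failure")
    return "ok"

print(flaky_service())  # succeeds on the third attempt, prints "ok"
```

In the loosely coupled world the post describes, this kind of policy would live in the bus or runtime rather than in a decorator, but the division of labor is the same: the application states intent, the infrastructure handles the 60-80%.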
Extending Bruno's analysis of the "people axis" - it is somewhat like the layering of developers who write device drivers, vs those who write OS's, vs development tools, vs applications, vs reports, and so on.
The skills and "mission criticality" change dramatically as you move up and down that stack.
In our 3G world, the other skill segmentation that I hope we'll see is that of the "service authors" - whether they're data services or functional services, there will be a distinct "art" and knowledge base needed to design for consumability.
Party on.
Posted by: Rick Bullotta | May 12, 2007 at 05:11 AM
Shai,
It took me a day to assimilate what you were talking about. My ideas on these planes are ill-formed as well. But let me take a shot at responding to your post, hopefully in a meaningful way.
The 'fit-to-need' concept is definitely an interesting idea. Yes, it is where we are going. But I also believe that assuming a blueprint for a 'fit-to-need' architecture will come to be some day would be fundamentally wrong. I've made that error myself. I think there will be as many 'fits' as there are viable markets for 'fitting' products.
So, now if we assume that 'fit-to-need' is where the market is going, how do we tap into this opportunity? I think a logical first step is to take a step back and investigate how the nature of information systems has evolved (or is changing) from E2 to E3.
E3 information systems are characterized by a number of unique traits that have never been encountered before. The challenges and opportunities being tackled today by the Web 2.0 community arise as a consequence of these traits. I can at least point to the following (incomplete) set of traits that characterize these new systems:
* Adaptive Architectures - As components with myriad characteristics become available on the internet and the cost of integrating with/consuming these components falls, architectures that are, as you call it, 'fit-to-need' become meaningful.
* Migratory Infosets - As the representation of information becomes self-describing and complete, they will increasingly have a tendency to replicate and migrate. In sum, data will have a life of its own.
* Virtual or Discontinuous Processes - As internal processes become exposed through the internet via standard interfaces, process boundaries become meaningless. The notion of the intranet-extranet-internet will give way to a 'ubiquitous net' or ubinet.
* Participatory User-experience - As access to information becomes 'free-for-all', the user experience becomes co-creative, collective and collaborative. In sum, the user experience is democratized.
I don't think the list of traits here is in any way complete. So I urge the readers to contribute their 10c or critique on these…
A next step would be to identify gaps/limitations/hurdles in existing systems that prevent these traits, and to find new ways to innovate around these challenges. What would also be useful is a blueprint of the ecosystem of the E3 market that could possibly exist in the future.
... some interesting ideas evolving here. Keep the blog going guys!
--
SS
Posted by: SS | May 13, 2007 at 10:35 AM
One other macrotrend that we may see having a substantial effect on this computing evolution is the so-called "internet of things" combined with extending existing apps with richer real-world awareness.
Whether it is a manufacturing planning module with live status of machines, your coffee maker automatically reordering pods, your vehicle interacting with your PDA/planner and the dealer to schedule service, or longer term, autonomous robots unburdening us of some of the realities in the atom world vs the information world (driving our cars, fighting our battles, running our factories), there are countless opportunities for enabling/extending people and applications with enhanced sensory and processing capabilities, real or virtual.
These are not new concepts by any stretch, but their omnipresence in our day-to-day lives will be transformational in ways we cannot yet imagine.
A brave new world, indeed...
Posted by: Rick Bullotta | May 14, 2007 at 05:26 AM
Rick: Agree with you - the internet of things in my model would fall under "continuous computing", and would by its very nature be dispersed. In fact, sensor-based computing would include not just an internet of things but also an internet of 'observations' like temperature, humidity, etc. So the cloud can be queried not only for things and their properties but also for their environment and location.
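A toy sketch of that queryable cloud: a registry of things joined to a stream of observations about their environment, filterable by location or observation kind. All the identifiers and data shapes here are made up for illustration:

```python
# Registry of "things" and their properties (location, contents).
things = {
    "pallet-17": {"location": "warehouse-A", "contents": "coffee pods"},
    "truck-04":  {"location": "route-9",     "contents": "spare parts"},
}

# Stream of "observations" about those things' environment.
observations = [
    {"thing": "pallet-17", "kind": "temperature", "value": 4.5},
    {"thing": "pallet-17", "kind": "humidity",    "value": 61.0},
    {"thing": "truck-04",  "kind": "temperature", "value": 18.2},
]

def query(location=None, kind=None):
    """Join things to their observations, filtering by either axis."""
    results = []
    for obs in observations:
        thing = things[obs["thing"]]
        if location and thing["location"] != location:
            continue
        if kind and obs["kind"] != kind:
            continue
        results.append((obs["thing"], obs["kind"], obs["value"]))
    return results

print(query(location="warehouse-A", kind="temperature"))
# → [('pallet-17', 'temperature', 4.5)]
```

The dispersed/continuous half of the model would keep `observations` flowing in from sensors; the ad-hoc half is the `query` call.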
Posted by: Anshu Sharma | May 15, 2007 at 11:58 AM