With everyone talking about Web 2.0, I was looking at what the next generation for the enterprise space is, and how far we are from it. My view is that we are on the verge of the third generation of enterprise software – the first being mainframes, the second three-tier client/server, and the third not here yet. A lot of people consider the web versions of three-tier C/S, or Software as a Service, to be the 3rd Gen; I view those more as Gen 2.5. The generations have a lot in common with the wireless industry: you can get much of the promise of 3rd Gen with 2.5 Gen, but to do it right the underlying infrastructure needs to shift to the new architectures.
I will try to take this week to discuss my views of this next generation of enterprise software. As a general framework I looked at the following four layers in the stack: Store, Compute, Messaging, Presentation; that said, I am happy to hear from you about any layer of great importance I have missed. More important than the layers themselves (which I have talked about many times in the past), I am interested in the experience that results from the right mix of those components.
With that in mind, Gen 1 was about storage on tape, computing on mainframes, communication over proprietary internal networks, consumed through green screens. We all learned how to optimize, because all resources were scarce and the problems we were solving were bigger than the compute platform could comfortably handle. In Gen 2 we moved to the dominance of hard disks abstracted through a relational data store; the compute platform got distributed and abstracted by an open operating system, with a TCP/IP network stack and messaging queues on top; and finally a GUI, which over time got the Windows interface into the user's muscle memory.
What is coming in the 3rd generation? Well, my first approximation shows that we are moving away from a single model for each layer, and into a combination of remote and local solutions for each of the layers. In a sense we are getting more domain specific, optimizing for usage instead of for a unified architecture. My current thinking revolves around the following:
- Storage moves to some combination of network storage (think S3) and in-memory stores.
- Compute will be disrupted on the server by cloud computing (an evolution to a higher level of distribution than C/S) together with local grids, as represented by appliances.
- Networks will move from packets to events – in a sense sending business objects around, with the network aware of the content and able to route them through a local enterprise bus that understands the relationships between roles and resources, using rules.
- Finally, on the client side we will see smarter browsing – able to scale up the experience and fit it to the user and the content. At the same time, mobile clients with a higher fidelity of experience but a lower level of resources (screen and keyboard) will overtake the desktop as the dominant platform for user interaction.
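To make the third bullet concrete, here is a minimal sketch of the kind of rule-driven enterprise bus described above: business objects travel as events, and the bus routes each one to the right role using rules. All names here (`EventBus`, the purchase-order event, the `credit_manager` role, the 10,000 threshold) are illustrative assumptions, not anything from an actual product.

```python
class EventBus:
    """Toy content-aware bus: rules relate event content to subscriber roles."""

    def __init__(self):
        self.rules = []          # list of (predicate, role) pairs
        self.subscribers = {}    # role -> list of handler callables

    def subscribe(self, role, handler):
        self.subscribers.setdefault(role, []).append(handler)

    def add_rule(self, predicate, role):
        # Route any event matching `predicate` to subscribers holding `role`.
        self.rules.append((predicate, role))

    def publish(self, event):
        delivered = []
        for predicate, role in self.rules:
            if predicate(event):
                for handler in self.subscribers.get(role, []):
                    handler(event)
                    delivered.append(role)
        return delivered


bus = EventBus()
bus.subscribe("credit_manager", lambda e: print("review:", e))

# Rule: purchase orders above 10,000 get routed to the credit-manager role.
bus.add_rule(
    lambda e: e.get("type") == "purchase_order" and e["amount"] > 10_000,
    "credit_manager",
)

bus.publish({"type": "purchase_order", "amount": 25_000})
```

The point of the sketch is the shift in abstraction: the bus inspects what the event *is* (a purchase order over a threshold) rather than where a packet is addressed, which is what distinguishes this layer from a plain TCP/IP stack.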
A much more interesting question in my mind is what meaningful new experiences you get from these innovations when they come together. What is E3G going to solve that the previous generations could not address? I am going to start digging into that question over the next few posts.
In the meantime, if you think that I have missed something fundamental in these layers – shout!
Hi Shai,
I follow your blog closely and find your insights valuable. Regarding your last post where you mention 4 layers, viz. Store, Compute, Messaging and Presentation, you might want to consider mapping these technical layers to a business layer using a Virtual Value Chain (VVC) model (http://en.wikipedia.org/wiki/Virtual_Value_Chain). The creation of value involves a series of five events: gathering, organization, selection, synthesis, and distribution of information. This is important, as information is a key ingredient of a company's competitive advantage (and I see ERP systems as information processing systems).
Posted by: JustForKicks | May 07, 2007 at 09:10 PM
Optimising for usage - that's very well said. Couldn't agree more.
Enterprise software has evolved significantly from the days of yore, when it required "train-the-trainer" workshops followed by a couple of weeks of "module training" for process practitioners. However, it hasn't reached the point where it is simply intuitive to use.
How to take a complex business process that may be spread across multiple applications, and make it simple enough to use with very little training, is a core enterprise software issue. Without addressing this issue, enterprise software does not become mainstream but remains limited to the hands of a few process practitioners.
- Sangeeta Patni
Extensio Software
Posted by: Sangeeta Patni | May 07, 2007 at 09:13 PM
Hi Shai,
Enterprise Software has indeed evolved a lot and keeps evolving as we speak.
I think you missed one layer that I would call "Collaborate" or "Coordinate".
I would put this between Compute and Messaging, although one could argue it is between Messaging and Presentation.
This layer would be responsible for coordinating the activities of different "compute" units, using "messaging" mechanisms to communicate between them.
Without this layer you cannot really compose a service based on several other services, unless you have the composition logic embedded in the "compute" units, which therefore cannot be reused.
Thoughts?
Romuald Restout
Talent Technology Corporation
Posted by: Romuald Restout | May 07, 2007 at 09:59 PM
Shai,
the "Enterprise 3rd Generation" tricked me, or perhaps that's where the clue is?
What if "Enterprise software 3rd Generation" would be a result of "Enterprise 3rd Generation" or vice versa?
Enterprise software models the Enterprise, but the Enterprise as we know it is itself a working model of reality designed pre-IT. Isn't it about time to "remodel the model we base software models on"?
The enterprise model of today might not be as good as it can be... and if so, no new model of the model will ever get better than the underlying model.
Yep, why not the vice versa of above, focus on software that could shift the enterprise model?
Sorry about all the "models", just had to :)
Posted by: sig | May 08, 2007 at 04:24 AM
Shai - you're supposed to be focusing on bigger issues, not the mundane world of software! ;-)
Seriously though, I like your thinking - but does it mean Google + Cisco + ???.
The missing piece, I think, is a new application development paradigm, which needs to be a maturation of the mashup concept (probably a bit too "open loop" from a security and CM perspective for big-time enterprise apps), with a blurring of the line between app development and app customization. You and I have also talked in the past about how there needs to be a rethinking of what "services" are and how they need to be created, cataloged, and consumed.
All in good time, I suppose.
Now about my beta test request for your first EV...
Rick
Posted by: Rick Bullotta | May 08, 2007 at 08:37 AM
Shai,
I think what you are talking about is the evolution of what I would call the 'WebOS', an agnostic platform of foundational services that form the components for web-scale applications.
You mentioned four aspects of the evolving architecture: Store, Compute, Messaging, Presentation. Yes, you are right to the extent that these are the necessary foundational components. However, I believe that at least two 'meta' layers have to exist:
* Management - an autonomous or semi-autonomous set of agents and services that optimizes and protects the foundational services.
* Transformation - A set of services that enable 3rd-gen applications to interface in an agnostic fashion.
* Semantic Services - This may be the most far-fetched, but Tim Berners-Lee has been prophesying this for a while, and I think it may be just around the corner: a set of programmable semantic agents, built around messaging, that provide autonomous responses to changes in the environment. This I expect to be the 3rd-gen evolution of what we call workflow systems today.
--
SS
Posted by: Sreekant | May 08, 2007 at 08:45 PM
Shai,
Wie geht's? No more German? Ach so...
I have to agree with your concepts of 3rd Gen, but I am not quite sure we ever made 2nd Gen. 2nd Gen was meant to be a move away from C/S and introduce N-tier with ALL of the N-tier distributed architecture promises.
The N-tier model that we have today presupposes that the central application server has full control of all transactions, because we never fulfilled the distribution requirements of the original N-tier model. In order to move to 2.0, or in your words 3.0, we need distributed services to be aware of each other. These services have to be transparent not only to the end user but, more importantly, to the developer.
An N-tier model must deliver on the promises of a distributed model to be considered fully mature. I don't believe we fully achieved this with our current application architecture. We stopped short due to network and computing constraints and thus we are currently stuck at a weak N-tier model or a 1.5 generation model that requires a single Application Server to control the transactions between the tiers.
-Todd
Posted by: Todd Paterson | May 08, 2007 at 11:01 PM
I was picturing your layers on a slide with four blocks, one on top of the other: Store, Compute, Messaging, Presentation. There is one block that to me is just crying out to be drawn vertically alongside: People.
Gen1 was about IT holding the keys to their own specialized and proprietary domain. The platform was the application and very much out of the reach of mere mortals.
Gen2 moved the applications out into business as the platform was abstracted away - there were different domains for IT and business which often led to conflict.
Gen2.5 started moving data manipulation more into the hands of business - users can leverage their knowledge of existing applications (Outlook, web browsers, Excel, Java) to extend their power without disrupting the platform.
Gen3 moves even more power and control into the hands of end users. Distributed computing is enabled and complete business objects live on the network. Tools like Visual Composer evolve and live up to their full potential, allowing the creation of ad hoc applications without compromising the core platform's integrity. The user interface is split completely from the underlying rules and data. Besides new user interfaces, I can also imagine the development of specialized enterprise bots that interact autonomously (or semi-autonomously) with the system.
Posted by: armando | May 09, 2007 at 03:10 AM
Hi Shai,
You are right about e3g, but I think we have to see possible innovations in the existing layers. For example, researchers at MIT and CMU are already working on Internet2; the internet as we know and use it was the result of R&D people wanting to share data over a private network. The world has definitely moved on since those days, so I feel that next-generation networks will play a major role in the e3g architecture (which you have also covered in your vision...).
I hope the point is clear...
Akash Mavle
Solversa
Posted by: Akash Mavle | May 09, 2007 at 08:29 AM
Hello Shai -
I have a perspective for you to consider. I recently interviewed Van Jacobson of Xerox PARC on Content-Centric Networking for the enterprise. This is a 'rethink' of how we make the experience significantly easier for the user, bridging the walled gardens we create with so many VPN environments. Securely sharing information (content) is difficult with the current TCP protocols. You might want to listen to the interview and contact Mr. Jacobson, since he's up your way.
http://www.enterpriseleadership.org/listen/podcast-jacobson/
All the best to you
Tom Parish
host EnterpriseLeadership.org
host Talking Portraits show at ITConversations
Posted by: Tom | May 10, 2007 at 09:55 AM
I think there must also be a re-think in how code is developed. The idea that critical compute logic - business logic - can be owned by programmers (http://www.edmblog.com/weblog/2006/11/the_problem_wit.html) has to go away, and cannot be replaced by everyone programming (http://www.edmblog.com/weblog/2007/01/does_everyone_e.html). The business decisions being automated must be managed by the business and IT collaboratively (http://www.edmblog.com/weblog/2006/03/business_and_it.html). Decisions are different enough from other kinds of code to require a new approach (http://www.ebizq.net/blogs/decision_management/2006/04/why_manage_decisions_1.php).
Keep up the thinking!
JT
Posted by: James Taylor | May 10, 2007 at 01:02 PM
Whether it's a formal layer or whether it might alternatively be called a veneer, a metadata or "data identity management" layer needs to exist so that each of the other layers may understand more about what to do with a given piece of information. Some might tie these pieces together with business logic/rules, but a key piece that few organizations have figured out - because it's so difficult and steeped in silos and individual areas of ownership - is the metadata or semantic layer that identifies information such that it can be addressed by other capabilities. You could stretch this a bit further and add a "policy layer" that also knows what to do with a given piece of information, whether it's on its way down the stack to be stored/archived or on its way up to be viewed, distributed or consumed.
This magic policy layer is also siloed today, and pretty much every vendor has its own take on policy. Without an arbiter - and without the ability for any given piece of data to be identified by this policy layer as to its inherent value or risk, it can't be disposed of/put to use optimally. Optimally here means in the appropriate context or with respect to the appropriate rules of disposition for that piece of info.
Posted by: David Yockelson | May 11, 2007 at 05:43 AM
I think one part of 3Gen will be to replace the "client" with the "customer", in the proper sense of those terms. It fits with the in-memory stores, clouds and events. Couldn't agree more with you.
Thanks.
Posted by: Padmanabha Rao | May 18, 2007 at 12:01 PM