
API / Microservices Business Innovation and Solution Enablement Strategies

Or, in other words, what does a Post-Modern ERP strategy look like?

The Hunter Becomes the Hunted Image at 180717

Blog by Tim Main – IBM Information Management and ERP – Technical Director

17th July 2017

Executive Summary – The Question?

A number of months ago, one of my experienced and senior colleagues in the IBM SAP implementation and solutions technical community asked me the question: what might "A Post-Modern ERP Solution Landscape" look like?

This followed a similar prior question from the CIO of a significant European pharmaceutical concern, and the recent DSAG SAP CIO investment study of 269 German-speaking CIOs, conducted from November 2016 to January 2017.

Around 50-60% of these CIOs identified increased strategic IT investment in digital innovation as a key priority, whilst up to 50% did not currently consider SAP S/4 an alternative to their existing, often customised and broadly deployed, SAP Business Suite / ERP landscapes.

Consequently, linking back to my prior blogs in this area, I decided to explore what a "Post-Modern ERP Landscape" may look like, leveraging an open API / Microservices architectural integration construct that seeks to "innovate with new + integrate + leverage existing" ERP Systems of Record applications.

It may be suggested by some ERP vendors that a migration, upgrade and/or remediation of prior customised, business-process-aligned SAP Business Suite functionality onto SAP S/4 Enterprise Management is a prerequisite for an enterprise's ability to digitally innovate.

Recently Philip Howard, Research Director for Information Management at Bloor Research, published a paper looking at the myths of SAP HANA, which can be found here.

From my point of view, and likely more importantly from the view of a number of enterprise client chief enterprise / IT strategy architects I have spoken to, we simply don't see this firm or direct linkage; in fact the reverse seems to be true in an Open Source / open-enabled digital innovation world.

The focus is correctly on enabling the integration of new front-office digital innovation IT solutions (enhanced Systems of Engagement, Systems of Insight, Systems of Innovation), rather than on the remediation of prior Systems of Record / ERP solution investments.

If we accept this fundamental strategic assumption, we can then go on to consider a number of the “layers of the cake” in terms of Business Innovation into IT technology strategy and prerequisite capabilities.

The Resulting High-Level Strategy – What Might It Look Like?

An Open Business Domain : IT Transformation Viewpoint v2 180717

This also links back to my prior analysis of "Factory IT" and "Innovation IT" following a Harvard Business Review (HBR) business-into-IT strategy review paper in 2008/9; this is now more commonly referred to by IT analysts such as Gartner and Forrester as "Bi-Modal" or "Dual Speed" IT.

Further details on “The Layers of the Cake”

Open Business : IT Transformation View - Details v1 140717

From my point of view, one of the most important and strategic aspects of an effectively implemented API / Microservices strategy is to enable, deploy and govern a layer that acts like graphene or graphite: gearing and lubrication between the rapid pace of innovation and change demanded by the business from Innovation IT, and the slower-moving Factory IT beneath it.

This pragmatically recognizes that Factory IT needs to operate at a very different speed from a change and release perspective, whilst the two layers still communicate and pass data between each other effectively.
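To make this "gearing layer" idea a little more concrete, below is a minimal sketch (in TypeScript) of the kind of stable, versioned API contract and adapter that can sit between the two speeds: front-office Innovation IT codes against the published contract, while the Factory IT / ERP side changes only on its own release cadence. The type names, field names and status mapping are purely illustrative assumptions, not any particular product's API.

```typescript
// Hypothetical sketch: a stable, versioned API contract (the "gearing layer")
// that front-office Innovation IT consumes, decoupled from the shape of the
// underlying Factory IT / ERP record. All names and mappings are illustrative.

// The published contract: changes here are versioned and governed.
interface OrderSummaryV1 {
  orderId: string;
  customerId: string;
  status: 'OPEN' | 'SHIPPED' | 'INVOICED';
  totalValue: number;   // in minor currency units
  currency: string;     // ISO 4217 code, e.g. "EUR"
}

// The (assumed) shape of the back-office ERP record, with cryptic field names,
// which changes only on the Factory IT release cadence.
interface ErpSalesDocument {
  VBELN: string;  // sales document number
  KUNNR: string;  // customer number
  GBSTK: string;  // overall processing status code
  NETWR: number;  // net value
  WAERK: string;  // currency key
}

// Adapter: the only place that knows both worlds.
function toOrderSummary(doc: ErpSalesDocument): OrderSummaryV1 {
  const statusMap: Record<string, OrderSummaryV1['status']> = {
    A: 'OPEN', B: 'SHIPPED', C: 'INVOICED', // illustrative mapping only
  };
  return {
    orderId: doc.VBELN,
    customerId: doc.KUNNR,
    status: statusMap[doc.GBSTK] ?? 'OPEN',
    totalValue: Math.round(doc.NETWR * 100),
    currency: doc.WAERK,
  };
}
```

The point of the sketch is simply that front-office services compile against `OrderSummaryV1` only; as long as the adapter keeps that contract stable, the two layers can release at their own speeds.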

A "Back to the Future" example: IBM SOA Solutions for SAP from 2008

Whilst researching in preparation for this blog, I was repeatedly drawn back to a prior IBM SAP SOA client white paper on Viessmann, which describes SOA (Service-Oriented Architecture) solution enablement to complement the client's prior and significant SAP Business Suite / ERP investments.

This paper essentially described the principles of a flexible, IBM SOA-enabled front-office application integration strategy for client- and channel-facing and line-of-business applications, helping Viessmann to deliver increased business flexibility and enhanced customer service and productivity to support the needs of a growing business.

Being a firm believer in "Back to the Future" scenarios within the IT industry, I was naturally happy to look back for a proof point in order to then look forward again.
The summary paper describing this project can be found in the sources listed at the end of this blog.

In its simplest terms, this case seemed to cover a number of the key aspects for a client to consider in a "Post-Modern ERP" scenario.

However, this said, I'm often then challenged by existing EAI (Enterprise Application Integration) teams with the question "but we already have a deployed and working ESB, so why do we need an 'Inner Ring / Outer Ring' hybrid API / ESB architecture?", which I've attempted to explain in the following two diagrams.

The first diagram comes from an excellent "Integration Throughout and Beyond the Enterprise" IBM Redbook that can be found here.

Figures 1.1 and 1.2 in particular nicely summarize the differences between the prior SOA focus and the SOA + API economy focus.

Figure 1.2 from API Redbook

Additionally, in the following diagram, after reviewing and digesting the "APIs for Dummies" Wiley book that can be found here, I've attempted to summarize the differences and positioning of this dual-ring API / Microservices and ESB / EAI enabled strategy.

High Level Strategy - API Microservices Enablement at 180717

I have then pulled together a couple of diagrams (on the basis that a picture is worth a thousand words) that consider the key factors, from both a business and a technical point of view, in the positioning of API-enabled Microservices vs ESB-enabled Enterprise Application Integration (EAI). Whilst a little busy, they are both self-explanatory.

The Wikipedia description of Microservices seems to nicely summarise the combination of loosely coupled, fine-grained services that enable agile and flexible development initiatives, combined with the "re-factoring" and/or re-facing of existing systems into the post-modern ERP world.

"In a Microservices architecture, services should be fine-grained and the protocols should be lightweight. The benefit of decomposing an application into different smaller services is that it improves modularity and makes the application easier to understand, develop and test. It also parallelizes development by enabling small autonomous teams to develop, deploy and scale their respective services independently.[1] It also allows the architecture of an individual service to emerge through continuous refactoring.[2] Microservices-based architectures enable continuous delivery and deployment."
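To make the "fine-grained, lightweight protocol" point concrete, here is a minimal, hypothetical sketch in TypeScript (using the widely adopted Express framework) of a single-purpose microservice that owns one small capability and exposes it over HTTP/JSON. The endpoint, SKU data and port are invented for illustration only.

```typescript
import express from 'express';

// A deliberately small, single-purpose service: product availability only.
// It owns its own data and can be developed, deployed and scaled independently.
const app = express();

// Illustrative in-memory data; a real service would own its own datastore.
const stock: Record<string, number> = { 'SKU-1001': 42, 'SKU-2002': 0 };

app.get('/availability/:sku', (req, res) => {
  const units = stock[req.params.sku];
  if (units === undefined) {
    res.status(404).json({ error: 'unknown SKU' });
    return;
  }
  res.json({ sku: req.params.sku, available: units > 0, units });
});

app.listen(8080, () => console.log('availability service listening on :8080'));
```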

Recently the University of Manchester has been doing some very innovative work looking at the layered properties of graphene for water filtration, as described and summarised in a BBC Science & Environment item on 3rd April 2017.

Graphene Image from the BBC web site - 3rd April 2017

For me this provided a nice analogy for what we are seeking to do, securely, with an API / Microservices IT architecture: "Inner / Outer Ring" layers of the business and application integration cake that enable loosely coupled clusters of fine-grained, "SOA-like" services to work.

Solutions like IBM's API Connect provide proven, secure, "appliance-based" strategies for the outer ring, whilst integrating with, and safely filtering and passing fine-grained API-enabled data to and from, the client's existing inner-ring ESB.
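As a purely illustrative sketch of that "outer ring" role (in real deployments this policy enforcement would live in a gateway product such as IBM API Connect rather than hand-written code), the TypeScript fragment below shows an externally facing endpoint that authenticates a caller, filters the payload down to a whitelisted set of fields, and only then passes the sanitised, fine-grained data on to an assumed inner-ring ESB endpoint. The URLs, headers and field names are invented.

```typescript
import express from 'express';

const app = express();
app.use(express.json());

// Assumed inner-ring ESB endpoint; in practice this would be configuration.
const INNER_ESB_URL = process.env.INNER_ESB_URL ?? 'http://esb.internal.example/orders';

app.post('/api/v1/orders', async (req, res) => {
  // 1. Outer-ring policy: authenticate the caller (illustrative API-key check).
  const expectedKey = process.env.EXPECTED_API_KEY;
  if (!expectedKey || req.header('x-api-key') !== expectedKey) {
    res.status(401).json({ error: 'unauthorised' });
    return;
  }

  // 2. Filter: forward only the whitelisted, fine-grained fields.
  const { customerId, sku, quantity } = req.body ?? {};
  if (typeof customerId !== 'string' || typeof sku !== 'string' || !(quantity > 0)) {
    res.status(400).json({ error: 'invalid order payload' });
    return;
  }

  // 3. Pass the sanitised payload to the inner-ring ESB and relay the result
  //    (uses the global fetch available in Node 18+).
  try {
    const esbResponse = await fetch(INNER_ESB_URL, {
      method: 'POST',
      headers: { 'content-type': 'application/json' },
      body: JSON.stringify({ customerId, sku, quantity }),
    });
    res.status(esbResponse.status).json(await esbResponse.json());
  } catch {
    res.status(502).json({ error: 'inner-ring ESB unavailable' });
  }
});

app.listen(3000);
```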

The diagram below attempts to summarize the differences in nature between an API / Microservices appliance and a typical ESB for Enterprise Application Integration (EAI).

Integration Topologies Inner Outer Ring v3 170717

And the following looks at it from a business-into-technology point of view, in terms of grouping and use-case alignment.

Example Mapping Functional Capabilities at 180717

Then we simply need to layer in capabilities from the "Two Triangles" worlds of data ("SQL schema before" and "SQL schema after", big and little data) and we have a foundation from which to build.

Supporting Data Management and Information Management Strategies.

In my prior blogs I've described the supporting "Two Triangles" (SQL schema before and SQL schema after) data worlds that need to be developed in parallel with a viable API / Microservices strategy. This is very important to avoid the API / Microservices enabled business solutions becoming "islands of information" that are isolated from each other.

A critical objective of an API / Microservices economy is to leverage information insights for strategic and competitive advantage, including both prescriptive and predictive analytics in addition to the more common descriptive analytics.

At the risk of a slightly longer blog item, this architectural approach is summarized and described below, where the "Insight into Action" activities may be both API / Microservices enabled and linked to aligned cognitive / intelligent business process optimization tools and IT capabilities.

The Two Triangles Information Strategy v2 170717

Inhibitors and Enablers

In this diagram, I’ve attempted to summarize the key enablers and inhibitors for a successful API / Microservices deployment strategy:

Key inhibitors and enablers for an API Strategy 170717

Critical Success Factors

Ultimately, a successful API / Microservices strategy starts with the business digital innovation agenda and strategy and then flows down into the enabling IT capabilities. Whilst bottom-up API / Microservices projects are initially a way to start small and scale fast, ultimately a top-down, strategic, investment-led approach will be required.

Critical Capabilities - Executive IT Architectural PoV at 180717

Also I would refer readers to an excellent IBM Institute of Business Value study that looks at Innovation in an API Economy which can be found here.

Open Source and Standards driven enablement and business process optimisation

In my view, for an API / Microservices strategy and economy to succeed, it requires a clear and long-term commitment to Open Source solutions and the definition of open / published API / SOA messaging formats and standards.
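As a small, hypothetical illustration of what "published" means in practice, the TypeScript sketch below declares a versioned event contract plus a runtime guard that any producer or consumer can apply, so both sides validate against the same published definition. In a real programme this would be expressed as a governed OpenAPI / JSON Schema / AsyncAPI artefact; the event shape and names here are invented.

```typescript
// Illustrative published message contract for an "order status changed" event.
// The shape and names below are invented for the purposes of the sketch.
export interface OrderStatusChangedV1 {
  eventType: 'order.status.changed';
  version: 1;
  orderId: string;
  newStatus: 'OPEN' | 'SHIPPED' | 'INVOICED';
  occurredAt: string; // ISO 8601 timestamp
}

// Runtime guard so any consumer can reject messages that do not
// conform to the published contract before acting on them.
export function isOrderStatusChangedV1(msg: unknown): msg is OrderStatusChangedV1 {
  const m = msg as Partial<OrderStatusChangedV1>;
  return (
    m?.eventType === 'order.status.changed' &&
    m.version === 1 &&
    typeof m.orderId === 'string' &&
    (m.newStatus === 'OPEN' || m.newStatus === 'SHIPPED' || m.newStatus === 'INVOICED') &&
    typeof m.occurredAt === 'string'
  );
}
```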

IBM has a very significant and clear track record in this area, including, in recent times, the open-source enablement and contributions made through Node-RED in the IoT (Internet of Things) area, which was the subject of a prior blog item.

Conversely, any ERP vendors who attempt to impose aggressive licence terms and conditions that essentially prevent the enablement of a successful API / Microservices economy, through the application of "indirect access" licence terms, will likely become increasingly isolated islands in time.

Which takes us back full circle to the beginning of this blog: what does a "Post-Modern ERP Application" look like in a world where the ability to digitally innovate, and to successfully integrate traditional Systems of Record / ERP systems with innovative Systems of Innovation, Insight and Engagement, becomes a critical, business-survival and differentiating, strategic IT capability?

PS A recent example of front-office business process optimization, automation and integration at Carlsberg can be found here, along with the enclosed YouTube video link.

Sources of further information that are referenced or were researched for this blog include:

Understanding there is a very broad and deep pool of information sources in this area, my principal challenge for this blog was what to leave out rather than what to include.

For example, I left out a pool of material on client SOA / API maturity and capability analysis and stepwise development, which is very interesting and critical for most clients, in addition to IBM's Data First method, whilst understanding that Rome was not built in a day.

IBM’s API Connect Overview can be found here:

https://www.ibm.com/support/knowledgecenter/en/SSMNED_5.0.0/com.ibm.apic.overview.doc/api_management_overview.html

..and further technical details here:
IBM Redbook – Getting Started with IBM API Connect: Concepts and Architecture Guide

http://www.redbooks.ibm.com/abstracts/redp5349.html?Open

APIs for Dummies – Claus T. Jensen

https://public.dhe.ibm.com/common/ssi/ecm/ws/en/wsm14025usen/WSM14025USEN.PDF

Plus, a recent demonstration of integrating simulated data from a back-office ERP system with weather data to dynamically re-route deliveries from ACME Co to retail pharmacies and distributors, leveraging StrongLoop capabilities:

The Evolution of the API Economy – IBM Institute of Business Value

IBM Redbook – Integration Throughout and Beyond the Enterprise

https://www.redbooks.ibm.com/Redbooks.nsf/RedbookAbstracts/sg248188.html?Open

The prior Viessmann IBM White Paper re SAP and IBM SOA Solution Enablement

http://www-05.ibm.com/de/solutions/references/download/SPC03045DEEN-Viessmann_Final_EN.pdf

White Paper – IBM SOA Foundation: providing what you need to get started with SOA, 2005.

ftp://ftp.software.ibm.com/software/soa/pdf/SOA_g224-7540-00_WP_final.pdf

Wikipedia Entry re Microservices

https://en.wikipedia.org/wiki/Microservices

Finally, as an example, an IBM Watson IoT architectural point of view (pass the cursor over the various ABBs) in a landscape which combines integration and information building blocks.

https://www.ibm.com/devops/method/content/architecture/iotArchitecture

The opinions within this blog are the author's; they do not represent a formal IBM corporate point of view. Copyrights are respected and/or sources referenced.


Innovation that matters – Node-RED

IoT Innovation that matters – Node-RED

node-red-simple-v2
With 33+ years of Enterprise IT solutions and architectural experience, it’s not often that I come across innovations, ideas and solutions that are truly new and/or transformational.

However, I have to say a recent IBM OpenWorks webinar on Node-RED and IoT solution / device integration really made me sit up and take note; the event replay can be found here.

node-red-simple

In the webinar, the team of presenters and founding Node-RED developers from IBM's Hursley development labs, Nick O'Leary and Dave Conway-Jones (IBM Senior Inventor), in conjunction with Dr Mike Blackstock and Dr Rodger Lea from Sense Tecnic Systems, Inc (and the University of British Columbia), really knocked it out of the park for me.

It looks like an excellent combination of a logical, simple, effective idea with ease of use and flow execution. It combines a set of API-driven device integrations and data inputs/outputs into "flows", leveraging a flow-based programming model that was originally defined by J. Paul Morrison at IBM in the early 1970s.

Node-RED has now been adopted by the JS Foundation (js.foundation), a Linux Foundation project.

This model essentially combines a network of asynchronous processes communicating by means of streams of structured data chunks or elements, where each process is in effect "a black box". Each step does not need to know what went before, or indeed what comes after: it just acts on the data it receives and passes the result on.
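As a toy illustration of that flow-based model (this is not the Node-RED runtime, just a sketch of the principle), the TypeScript below wires a few "black box" nodes into a flow: each node acts on the message it receives and passes the result on, knowing nothing about its neighbours.

```typescript
// Toy illustration (not the Node-RED runtime) of flow-based programming:
// each node is a black box that receives a message, acts on it, and passes
// the result on, without knowing what came before or what comes after.
type Message = { payload: unknown };
type FlowNode = (msg: Message) => Promise<Message>;

// Wire independent black-box nodes into a flow.
const flow = (...nodes: FlowNode[]) => async (msg: Message) => {
  for (const node of nodes) {
    msg = await node(msg);
  }
  return msg;
};

// Three illustrative nodes: inject a reading, transform it, and output it.
const injectSensorReading: FlowNode = async () => ({ payload: { tempC: 21.5 } });
const toFahrenheit: FlowNode = async (msg) => {
  const { tempC } = msg.payload as { tempC: number };
  return { payload: { tempF: tempC * 9 / 5 + 32 } };
};
const debugOutput: FlowNode = async (msg) => {
  console.log('debug:', msg.payload);
  return msg;
};

// Run the flow end to end.
flow(injectSensorReading, toFahrenheit, debugOutput)({ payload: null });
```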

In this respect it is similar in concept to one of the founding principles of IBM's MQSeries (deliver reliably, once and only once), and subsequently MQTT, the lightweight pub/sub messaging protocol that sits behind Facebook chat and the like.

It is reminiscent of childhood games of "pass the parcel", where participants could leave (or indeed re-join) the game whilst executing their specific step until the music stopped.

node-red-flow-overview

The Node-RED solution (which has now been adopted and is being further developed by the Node.js open-source community) looks like a very elegant, cost-effective and simple answer to complex IoT / Manufacturing 4.0 device and data integration requirements.

I really enjoyed the IBM OpenWorks talk and would commend it to IT strategy teams and architects that are engaged in front-office / IoT digital innovation solution deployments and platform strategy definitions.

You might also call this Node-RED solution the "yin" to a blockchain / IBM Hyperledger "yang": both are now founded on significant Open Source principles, whilst one does not mind what went before or after, and the other enforces principles of immutability in a trusted network of asset flows (fiscal or physical).

the-yin-and-yang-bigger-picture
Also this week I really enjoyed the annual British Computer Society Alan Turing lecture by Dr Guru Banavar, vice president and chief science officer for cognitive computing at IBM.

Guru is responsible for advancing the next generation of cognitive technologies and solutions with IBM's global scientific ecosystem, including academia, government agencies and other partners. He leads the team responsible for creating new AI technologies and systems in the IBM Watson family, designed to augment (and not replace) human expertise across a broad range of industries.

For me it was an interesting and logical coincidence that Guru was previously the chief technology officer (CTO) for IBM's Smarter Cities initiative, where Node-RED also has very logical use cases for building climate control systems and solutions, as described by Dr Mike Blackstock and Dr Rodger Lea in the second part of the Node-RED update.

Guru Banavar designed and implemented big data and analytics systems to help make cities, such as Rio de Janeiro and New York, more livable and sustainable.

When I put together Node-RED IoT / Manufacturing 4.0 / IBM Watson IoT enabled innovation with IBM's Bluemix DevOps platform and the IBM Watson cognitive, analytical and Hyperledger capabilities, in a secure Hybrid Cloud, API / Microservices enabled "Choice A" scenario, the opportunity for Open Source enabled digital innovation seems truly significant!

The opinions within this blog are the author's; they do not represent a formal IBM corporate point of view.

Developing flexible, business aligned IT innovation and capability delivery strategies

After my prior LinkedIn item on when SAP S/4 HANA makes sense for your business, I received a question through a colleague in respect of a large European client who was effectively defining a "Choice A over Choice B" forward IT strategy.

it-innovation-choices-level-1-02022017

This prompted me to sit down and put some further thought into an IT executive view of a business-into-IT strategy model and approach that could result after an enterprise client makes Choice A.

A Business Domain / IT Transformation Viewpoint

Getting straight to the point, I then sketched out the following high-level architectural thinking and strategic business-into-IT building-block approach (having further researched various recent IBM C-Suite and IBM Institute of Business Value (IBV) studies), which places an effective API and Enterprise Service Bus "inner / outer ring" strategy as the key enabling capability.

high-level-business-into-it-strategy-300117
It was also interesting to read a recently published (31/01/17) summary of a study by the DSAG (German-Speaking SAP User Group), which, in addition to discussing various S/4 HANA adoption rates amongst the surveyed SAP DACH clients, highlighted plans to significantly increase forward IT investment and focus on front-office "digital transformation and business / IT enabled innovation" projects.

Within the surveyed German- and Swiss-based companies (with companies similar to Panalpina leading in this area), up to 60% and 70% respectively of funding is being targeted at these critical, "competitive advantage" or simply "stay in business", IT transformation and investment requirements, in the face of disruptive adjacent-industry or new, asset-lite, digitally disruptive competitors (Netflix vs Blockbuster retail outlets comes to mind).

It is fair to say that a number of industries, including global logistics companies, naturally already have complex, extended digital supply chains, facilitated by standards-based messaging, APIs and data exchange, with supporting back-office EAI (Enterprise Application Integration) platforms and web-enabled consumer and partner channels.

After creating this high-level model and approach, I naturally started to think further about the critical business-into-IT capabilities that are required to ensure success when implementing a business-aligned IT innovation strategy of this type.

When I net these out, they seem to distill into six or seven strategic IT capabilities and imperatives, in combination with the profile of the individual business's appetite for forward risk / reward and its rate of IT innovation and management / adoption of change.

These included the business aligned development, governance and practical implementation of:

  1.  An effective API development, brand, portfolio management and delivery strategy
  2.  An effective data governance and management strategy to pool, integrate and actively manage data for trusted, timely and accurate business insight
  3. An existing, defined and working ESB (Enterprise Service Bus) application messaging platform or platform/s (I refer to this as an “inner ring / outer ring” ESB / EAI strategy)
  4. The appropriate and targeted selection of buy (& Integrate) vs build (Dev/Ops) and run application capabilities
  5. Aligned Business and IT Executive sponsorship, funding and organization / cultural factors
  6. Structured, thoughtful analysis to select the most appropriate IT capability building blocks
  7. A plan to explore, target and integrate cognitive computing, ML / AI capabilities

Crucial also is timing, as various new and emerging technologies typically traverse the Gartner "technology hype" curves (or Forrester Waves) at different speeds, with specific vendors, technologies and/or platforms emerging to become a, or the, de-facto standard or dominant provider in a particular function or area.

Within this context it’s clear that some fundamental and basic “table stakes” still apply including:

  1. A real, demonstrable, sustained commitment to Open Source offerings and capabilities
  2. The selection and integration of at scale, viable “top right” quadrant platforms, products and/or technology partners
  3. Appropriate Business into IT funding to phase delivery – Start small, prove, then scale fast
  4. Understanding when to buy vs build – The Factory IT vs Competitive Advantage IT question
  5. Understanding that multi source, highly cost optimized, outsourced IT strategies are relatively unlikely to provide a firm foundation for the agile delivery of new business into IT capabilities – as Business into IT Driven “Digital Innovation Requirements” become ever more critical

This in turn relates to the phased "evolution vs revolution" question that is described and nicely summarized in the G. A. Moore technology adoption model below.

ga-moore-technology-adoption-model-300117
Often, within individual global enterprises, various business-into-IT delivery programmes will sit in different segments under the "crossing the chasm" curve.

In a prior Imperial College business innovation course it was clear that the most successful and effective business innovation strategies and platforms (including Apple's iPhone) were, in the majority of successful cases, actually combining proven, prior, individual technology components and ecosystem building blocks in new and innovative ways.

It was the innovative new combination of these proven capabilities and technology building blocks, often within new value-based networks, that created the greatest and/or most disruptive business value, often not brand-new or immature technology.

It was also interesting to observe a recent joint Schaeffler Group / IBM Watson IoT / Manufacturing 4.0 partnership announcement and YouTube video that is grounded on a number of these principles, as described below:

schaeffler-strategy-external

In addition, a recent LinkedIn CIO / Data Management forum item nicely described effective data management and IoT strategies as the "King and Queen" partners of aligned IT innovation capabilities in the complex game of chess that is successfully implementing viable, long-term IT strategies.

If these are the King and Queen I’d also then say that Hyperledger and blockchain represent the Castles in chess terms, enabling swift directed movement combined with protection and security.

Additionally, in my view, this is well described in a short YouTube video about IBM's Hyperledger blockchain pilot system within IBM Global Financing (IBM IGF processes $44 billion of transactions within a network of 4,000 partners, suppliers, shippers and banks).

The implemented open-source-based IBM Hyperledger solution provides an "individual client ledger" that is neutral, secure, immutable and auditable across the digital asset / document and transaction supply chain, without seeking to force change on the participating partners' back-office platforms, which is costly, risky and typically has extended cycle / lead times vs speed to value.
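To illustrate the immutability and auditability property being described (this is a deliberately simplified conceptual sketch, not Hyperledger's actual API or consensus model), the TypeScript below builds an append-only ledger in which each entry embeds a hash of the previous one, so any later tampering with history is detectable by every participant holding a copy.

```typescript
import { createHash } from 'node:crypto';

// Conceptual sketch only: an append-only, hash-chained ledger in which each
// entry embeds the hash of the previous one, making tampering detectable.
interface LedgerEntry {
  index: number;
  timestamp: string;
  data: string;        // e.g. an asset transfer or invoice reference (illustrative)
  previousHash: string;
  hash: string;
}

const hashEntry = (e: Omit<LedgerEntry, 'hash'>): string =>
  createHash('sha256')
    .update(`${e.index}|${e.timestamp}|${e.data}|${e.previousHash}`)
    .digest('hex');

const ledger: LedgerEntry[] = [];

function append(data: string): LedgerEntry {
  const previousHash = ledger.length ? ledger[ledger.length - 1].hash : 'GENESIS';
  const partial = { index: ledger.length, timestamp: new Date().toISOString(), data, previousHash };
  const entry = { ...partial, hash: hashEntry(partial) };
  ledger.push(entry);
  return entry;
}

// Verification: recompute every hash and check that the chain links hold.
const verify = (): boolean =>
  ledger.every((e, i) =>
    e.hash === hashEntry(e) && e.previousHash === (i === 0 ? 'GENESIS' : ledger[i - 1].hash));

append('Invoice 4711 financed for partner A'); // invented example entries
append('Shipment 0815 released to carrier B');
console.log('ledger intact?', verify()); // true unless an earlier entry is altered
```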

Approximately 10 years ago a number of my IBM colleagues in our consulting, hosting and global services teams invested significant time and effort in a structured review of large-scale, complex IBM / client project deliveries, both successful (and a few unsuccessful), to help better inform future joint projects and joint IBM / client success.

The output of that study is as valid now, if not more so, than before (in our Hybrid Cloud, cognitive world), in that it logically identified and confirmed what we all know to be true, but which is unfortunately often ignored or lost in the heat of an early project life cycle.

In particular, these structured approaches become even more important as many large, medium and small enterprise clients seek to successfully deploy and manage relatively complex "Hybrid Cloud" scenarios.

hbr-hybrid-cloud-factory-it-vs-innovation-it-300117

The success of any significant IT initiative crucially depends upon the initial business-aligned requirements definition, and on closely defining and managing the interfaces and hand-offs between the different partners and functions that are described in a 10-box IT operating and innovation model.

The first and most crucial box is the initial terms-of-reference and requirements-definition box, prior to developing a 9-box "design, build and run" model in three layers:

  1. The business transformation requirements, value into the application delivery layer
  2. The business application, integration and data management layer
  3. The IT Infrastructure, platform and IT service delivery layer/s

The success of this model and approach is then defined by the success (or otherwise) of carefully defining and managing the interfaces between the 9 boxes in people, culture, technology, funding, capability teamwork and strategic terms as follows:

10-box-model-v1a-300117

One of my client IT architect colleagues working in the retail and consumer products industry also recently highlighted that it has never been more important to manage these interfaces effectively, to avoid the unwelcome emergence of the "IT to IT gaps" that will inhibit successful delivery. This sits alongside the critical success factor of selecting and assembling proven building blocks in new, innovative ways, which is at the heart of the most successful business-into-IT innovations:

it-innovation-requirements-flow-300117

The basic rule applies more often than not: assembling proven capabilities and building blocks (using a Lego-like analogy) will typically yield more predictable and effective outcomes.

enterprise-architecture-methods-300117

I hope this item is helpful, in highlighting the requirements and prerequisites for successful “Choice A vs Choice B” business into IT innovation delivery.

IBM provides a combination of proven, scalable, virtualized building blocks for enterprise-scale SAP Hybrid or Private Cloud platform delivery, including DB2 v11.1 LUW, IBM POWER8 with AIX, PowerVM and Linux, IBM System z with DB2 and/or LinuxONE, and IBM System i with DB2.

Disclaimer: The views expressed by the author in this blog reflect 33 years of experience in enterprise IT and ERP / application platform delivery; they are my own and do not represent formal IBM views or strategies. Vendor trademarks are respected.


Will Open Source Enabled Big Data, IoT / API Enabled Innovation prevail – YES or No?

In this section (4) we consider the question: will Open Source enabled Big Data and IoT / API enabled innovation prevail – yes or no?

It is also clear that unless IT functions embrace and lead in an API / IoT enabled economy, we will continue to see the development of "Shadow IT" capabilities that are closely aligned to, and embedded within, the individual lines of business (sales, marketing, supply chain, manufacturing, distribution, multi-channel, partner enablement).

Indeed, I believe we will continue to observe a switch from Business to Consumer (B2C) towards Business to Individual (B2I) insight-based, targeted enablement built on location, weather, preference and event insights (which follows IBM's acquisition of The Weather Company), in addition to prior IBM alliances with Twitter, Apple and, more recently, Cisco in the IoT / edge and data analytics area.

Indeed, IBM already delivers solutions in this area with our Metro Pulse solution for Consumer Products industry clients, where multiple sources of unstructured or semi-structured Big Data (SQL schema after) and/or little data (SQL schema before) are seamlessly combined with location, weather, preference, local event, historical POS and promotional data to increase sales and product availability in "metro" city locations like London, New York or Singapore.
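As a purely illustrative sketch of that kind of data combination (this is not the Metro Pulse product, and the data shapes, uplift factors and store names are invented), the TypeScript below joins point-of-sale history with a per-store weather signal to produce a simple replenishment suggestion.

```typescript
// Illustrative sketch only: combine invented point-of-sale history with a
// weather signal per store to suggest replenishment quantities.
interface PosRecord { storeId: string; sku: string; avgDailyUnits: number; onHand: number; }
interface WeatherSignal { storeId: string; forecast: 'HEATWAVE' | 'RAIN' | 'NORMAL'; }

// Assumed demand-uplift factors by forecast; in reality these would be modelled.
const uplift: Record<WeatherSignal['forecast'], number> = { HEATWAVE: 1.6, RAIN: 0.9, NORMAL: 1.0 };

function replenishmentSuggestions(pos: PosRecord[], weather: WeatherSignal[]) {
  const forecastByStore = new Map(weather.map(w => [w.storeId, w.forecast]));
  return pos
    .map(p => {
      const expectedDemand = p.avgDailyUnits * uplift[forecastByStore.get(p.storeId) ?? 'NORMAL'];
      return { storeId: p.storeId, sku: p.sku, orderUnits: Math.max(0, Math.ceil(expectedDemand - p.onHand)) };
    })
    .filter(s => s.orderUnits > 0);
}

// Example: a London store facing a heatwave with low stock of a chilled product.
console.log(replenishmentSuggestions(
  [{ storeId: 'LON-01', sku: 'ICE-CREAM-500', avgDailyUnits: 40, onHand: 30 }],
  [{ storeId: 'LON-01', forecast: 'HEATWAVE' }],
));
```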
A high rate of innovation (and change) is also currently being observed in the Big Data platforms and analytics solutions area, where it seems that the majority of the enterprise IT architects and clients I've spoken to are firmly committed to Open Source aligned Big Data solutions and platform choices. This then naturally raises the following question:

Will Open Source aligned Big Data solutions eventually prevail?

In my view the answer is 100% YES, although I also believe that a balance between open-source-driven innovation and large, enterprise-scale IT non-functional requirements is needed, as summarized below:

Open Innovation

Indeed, in the area of proprietary vs open systems (at one time these used to be defined as Unix-based client / server systems vs the IBM mainframe), IBM previously tried a relatively closed and proprietary approach when the IT market was rapidly transitioning towards Unix or "open" distributed client / server platforms in the early 1990s.
IBM subsequently suffered a near-death experience in business terms, as prior continued mainframe MIPS capacity growth rapidly switched towards these alternative distributed / client-server platforms. Indeed, SAP delivered SAP R/3 (vs the prior R/2 on the mainframe with DB2) to align with this "open systems" choice and market trend.

Although it is also true to say that in more recent times IBM mainframe MIPS capacity growth (combined with open-platform mainframe Linux enablement) continues at pace, often for mission-critical systems of record / big batch scenarios.

Something rather similar happened in the PC market, where IBM developed the technically superior but incompatible IBM PS/2 MCA (Micro Channel Architecture) as a follow-on to the original IBM PC and PC/AT IO adapter architecture.

Just as we technically turned right, the rest of the market turned left with an ISA (Industry Standard Architecture) PC input/output (IO) adapter and bus strategy. The rest is history: IBM's PC Company went from a largely dominant "IBM PC" market share to a significantly smaller share over time. Is the same thing happening now in the core ERP / Systems of Record market?

As a direct result, IBM's subsequent commitment and contribution to Open Source driven projects and innovation has been second to none amongst the major IT vendors; in summary, IBM essentially learnt a very hard business-into-IT lesson in IT innovation and industry-change terms.

This commitment includes significant investments and technical alignment to the following:

  • The Apache Software Foundation (1999), subsequently Eclipse (2001)
  • Linux (2007), OpenStack (2012), Cloud Foundry (2014)
  • jS (2014), Docker and the very significant Apache Spark in-memory analytics operating system investment (2015)
  • In addition to the more recent innovative blockchain-based Hyperledger project (2016).

These commitments, in addition to the ODPi (Open Data Platform initiative) for Hadoop, are now both pervasive and very significant within IBM. Indeed, IBM recently published a paper that summarises this commitment and the resulting rates of open-source-driven innovation which, in the longer term and in the view of the author, will always eventually prevail over proprietary-aligned alternatives, no matter how big a single vendor or aligned partner ecosystem commitment.

Hence, in my view, it's not really a case of if, simply a case of when, Open Source based innovation prevails.

Indeed, in support of this viewpoint, Vinnie Mirchandani (in SAP Nation v1.0) mentions the success and growth of the cloud integrator Appirio, with BoB / SaaS integration solutions and a large TopCoder community, in addition to the rapid growth IBM is experiencing in the Bluemix and/or API Connect areas.

Of course, this Open Source commitment does not mean clients will not require trusted solution partners to help them safely bridge between their existing systems of record and planned front-office API enabled strategic IT platform investments, whether Public Cloud, Hybrid Cloud or indeed prior Private Cloud / on-premise, often for mission- and business-critical data protection and/or privacy / IP reasons. It all starts with the data!

The above-mentioned paper can be found here; it nicely summarises the evolution of various Open Source platforms over time.

https://www.ibm.com/developerworks/cloud/library/cl-open-architecture-update/

More recently, one of the potentially most significant and innovative Open Source projects is the rapidly emerging blockchain "Hyperledger" distributed ledger project, which will in my view be truly transformative for many clients and industries.

Indeed, I'd also recommend a rather detailed report published in December 2015 by the UK Government Office for Science and its Chief Scientific Adviser, Mark Walport, called:

Distributed Ledger Technology: beyond block chain, which can be found at:

https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/492972/gs-16-1-distributed-ledger-technology.pdf

Adoption will commence initially in the Financial Services industry, but will likely extend rapidly into other industries like Consumer Products and/or Discrete Manufacturing, where complex, extended and distributed supply chains, and the resulting financial transaction flows and ledger entries, are the norm.

I'd also recommend the following short YouTube video that describes the future impact of this project, in addition to this item on the Financial Times (ft.com):

https://next.ft.com/content/eb1f8256-7b4b-11e5-a1fe-567b37f80b64

https://www.youtube.com/watch?v=hMUNfxcmyEE

Having covered some of the strategic IT investment choices above, let's now dive back into the details of some of the "hype" and the largely commercially driven pressure to migrate to SAP S/4 HANA and/or undertake HANA OS/DB migrations (vs prior Oracle, DB2, MS SQL and other SAP AnyDB platform choices).

Now we move on to the final section in this series of blog entries – in-memory marketing hype vs reality, section 5.

Disclaimer – The views expressed in this blog are the author's own and do not represent a formal IBM point of view.

They do, however, represent an aggregate of many years (20+) of successful ERP / SAP platform deployment and IT strategy development experience, supplemented with many hours of reading respective DB2 and/or SAP HANA roadmap materials and presentations at various user conferences and/or user groups, in addition to carefully reading input from a range of respected industry / database analyst sources (these sources are respected and quoted).