Category Archives: ERP Choices and Costs

Avoiding the Vortex: Successful Postmodern ERP & Information Management Strategies


Or, in other words, leveraging both:
Digital Innovation and greater Business Insights = i Squared

Blog by Tim Main – IBM Analytics – Information Management and ERP – Technical Director

24th January 2018

Executive Summary – The challenge facing CxOs, CIOs, CTOs and Chief IT Architects

It is clear that businesses across multiple industries are having to manage unprecedented rates of technology change, whilst concurrently having to innovate to manage and address new “Digital Platform” challengers and/or adjacent-industry competitors.

Prior physical-asset-centric “leverage our global scale and reach” business models are being challenged by asset-light digital partner ecosystems.

Businesses need to effectively balance global scale and reach with local flexibility and agility, whilst still being responsive to insight-driven innovation and adaptability; in the past this was described as the “GLOCAL challenge”.

Senior business leaders (CxOs) and aligned senior IT executives (CIOs, CTOs, CDOs, Chief Architects, etc.) have to make and balance complex strategic Digital Innovation and IT investment priorities, choices and trade-off decisions.

Within this context it is relatively easy to be drawn back towards what would previously have been considered safe “major ERP platform” ecosystem-aligned solution investments: “we are an SAP shop” or “we are an Oracle shop”, etc.

However, in the practical reality of a postmodern ERP and information-enabled age, these previously safe choices can rapidly turn into a sink-hole “vortex of IT investments”.

Unless carefully managed, they carry a significant risk of diverting finite strategic IT investment away from front-office, insight-driven digital innovation.

Such front-office investments are firmly targeted at customer- and channel-enabled relationship and revenue growth opportunities, which typically generate faster real rates of return (ROI) with a significantly lower “start small, prove (or fail fast), then scale fast” risk profile.

If we accept the principle that “Factory IT” and “Innovation IT” (some call this bimodal IT) naturally operate at very different rates of change, often with disparate supporting technology stacks and architectural building blocks (ABBs)…

Then, as a direct consequence, enterprise clients need to develop strong and viable Enterprise Application Integration (EAI), DevOps, and Information Architecture and Information Management (IA & IM) architectures in parallel.

Let’s call this DevOps plus the “other eAI” and IA + IM for a moment, or indeed the eAI and IA + IM “twins”, plus the AI/ML ladder as described below.

We then end up with a Business into IT Strategy construct that could be summarized as follows:

Figure 1 – Essential Capabilities in a Postmodern ERP environment.

Within this context we then have a framework to prioritize capability development in critical Digital Innovation (and/or disruption) and information-driven insight areas, helping respond to challenging business-into-IT innovation requirements at viable rates of change.

Some traditional or hybrid postmodern ERP vendors and their implementation partners (CSIs) will typically strongly recommend that re-implementation and simplification of the typically deeply customized core ERP (transactional) application template and MRPII planning systems is a prerequisite for effective Digital Innovation in the front office, in addition to relatively tight coupling of “Edge ERP” / SaaS portfolios with the prior core ERP / SAP NetWeaver functional portfolios.

Personally, I do not subscribe to this point of view, for the following reasons:

  • Whilst essential, typical ERP processes are no longer differentiating capabilities; they are simply “must have, must work” table stakes that need to be cost-efficient and optimized where practical, relative to the required effort and/or ROI, not re-engineered from one set of ledger structures to another.

    A number of clients I talk to are actually deciding to “digitally innovate” around the edge of prior core ERP investments in a hub-and-spoke or two-tier ERP model. Indeed, the DSAG investment survey from late 2016 into 2017 was a clear pointer in this respect, as is the latest DSAG 2018 survey that can be found here.

  • We have observed the strong emergence (and acquisition and/or integration) of complementary SaaS-based “Edge” ERP providers leveraging SaaS / cloud-native delivery models; examples include:
    • Workday (HR), NetSuite (ERP), (CRM+) and Anaplan all come to mind.
  • Even the “mega ERP” vendors like SAP and/or Oracle have typically acquired “bolt on” complementary SaaS solutions (for example SuccessFactors, Concur, Fieldglass, Ariba) which are built on disparate information management and platform architectures. Hence these also depend on effective API/microservices integration and flexible information integration, management and governance processes and flows back to a given ERP core.
  • Additionally, when we examine typical new DevOps- and/or open-source-enabled front-office Digital Innovation and insight-driven workloads, these tend towards being clearly back-office “ERP system and ledger neutral” and open-standards enabled, as the majority need to seamlessly integrate with extended ecosystems of partners, channels and supply chains. In this case a homogeneous ERP+ platform strategy actually becomes a Digital Innovation inhibitor rather than an enabler.
    One thing is clear in ERP platform terms: these ecosystems will simply not be uniform, or indeed dominated by a single ERP provider; examples include IoT, blockchain, extended collaborative logistics and Manufacturing 4.0 supply chains, Big Data, API-enabled weather data, etc.
  • Modern, consumable user interfaces (UIs) like SAP Fiori (vs the prior, very network-efficient SAP GUI) are, and/or should be, independent of application logic and data platform choices (a layered SAP NetWeaver and/or fundamental SOA design and separation-of-duties principle).
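The “ERP system and ledger neutral” integration point above can be sketched as a canonical-message pattern: the edge SaaS payload is mapped once into an ERP-agnostic model, and only a thin adapter layer knows the specifics of the core ERP in place. This is a minimal, hypothetical Python illustration; the payload shape, field names and SAP IDoc-style field codes are illustrative assumptions, not a real vendor API.

```python
from dataclasses import dataclass

@dataclass
class CanonicalInvoice:
    """ERP-agnostic invoice model shared by all integrations (hypothetical)."""
    invoice_id: str
    supplier: str
    amount_cents: int
    currency: str

def from_edge_saas(payload: dict) -> CanonicalInvoice:
    """Map a (hypothetical) edge-SaaS payload into the canonical model."""
    return CanonicalInvoice(
        invoice_id=payload["docId"],
        supplier=payload["vendorName"],
        amount_cents=round(payload["total"] * 100),
        currency=payload["ccy"],
    )

def to_sap_idoc_fields(inv: CanonicalInvoice) -> dict:
    """Per-ERP adapter: only this layer knows ERP-specific field names."""
    return {"BELNR": inv.invoice_id, "LIFNR": inv.supplier,
            "WRBTR": inv.amount_cents / 100, "WAERS": inv.currency}

edge_payload = {"docId": "INV-001", "vendorName": "ACME", "total": 123.45, "ccy": "EUR"}
canonical = from_edge_saas(edge_payload)
assert to_sap_idoc_fields(canonical)["WRBTR"] == 123.45
```

Swapping the core ERP then means writing a new adapter, not re-plumbing every edge integration.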

A number of these topics are discussed in a Gartner Paper (G00311163) “Schrodinger’s Cat: How ERP is both Dead and Alive” Published on the 28th of June 2016.

I would recommend it as further reading material; there are also a number of papers from Forrester, SAP and other providers that describe two-tier ERP strategies and solutions, mostly revolving around a hybrid strategy of complementary SaaS / ERP / best-of-breed capabilities around an existing deployed, customized and working ERP core.

Now let’s move on to consider what are the essential strategies and capabilities a typical large Enterprise client needs to develop to be successful in a postmodern ERP environment and information age.

A number of these data management capabilities are described in a “stretch the three Cs” context (Culture, Competency and Capability) in a recent Forrester paper by Brian Hopkins, published on January 10th 2018: “Stretch Your Data Management Capabilities”.

I would also recommend the following blog by Rob Thomas, General Manager of IBM’s Analytics business, which uses a motor industry analogy to describe the logical steps and prerequisites required to successfully navigate the AI (Artificial Intelligence) and Machine Learning (ML) ladder.

With a guiding principle of gaining increased business insight, value and/or process automation (RPA): simply put, there is no AI or ML without a clear Information Architecture (IA) and Information Management (IM) strategy, plus a stepwise, logical approach to developing capabilities and maturity up the AI/ML maturity ladder.

Hence a recommended course of action for Line of Business and/or IT executives in this context would be an appropriate combination of the following:

  1. Avoid being drawn down the spiral, or vortex, of complex re-implementation of back-office ERP functions.

    Unless it is a natural time for a complete refresh of your ERP platform / application template and aligned back-office business processes, prefer an “optimize what you already have” strategy.

  2. Find ways to increase the efficiency and optimization of your existing ERP, SAP Business Suite / NetWeaver platform investments, to free up IT investment for:
  3. Increased Front Office, System of Engagement, Insight and Digital Innovation investments.
  4. Enabling new value-creating cognitive solutions via IBM Watson API services integration.
  5. Build out a clear Information and Data Management Strategy as a foundation for:
    • Artificial Intelligence (AI) and Machine Learning (ML) enabled Business Insight & Value

      following a logical, stepwise AI/ML “ladder strategy” and capability development.

    • Also as a sound information foundation for continued Process Automation (and/or RPA)
    • Develop an effective Information Architecture (IA), Information Management (IM) and Data Governance and Security (DG&S) strategy

      Understand, for example, that data lake projects without a parallel, sound information management and governance strategy typically fail to deliver the expected ROI unless this aspect is properly addressed upfront (in particular within regulated industries and/or where GDPR clearly applies).

    • Look closely at providers within your information management ecosystem: do they offer a combination of flexible choice and scale with platform integration, versus point solutions?
    • Enable a clear, open-source-enabled and -leveraged strategy to effectively combine both “big and little data” sources (data at rest + data in motion, SQL schema before and after)
    • Employ tools and methods (like IBM’s Data First Method) to define existing Information management maturity and required target capabilities aligned to specific Line of Business (LoB) sponsored and value creating use cases.
    • Develop parallel Application API / Microservices and Information Integration and Management Strategies and architectures (the twins represented by the top and bottom half of figure 1)
    • Ensure that ERP-vendor-driven proposals to re-engineer often-working ERP solutions have realistic and accurate ROI vs risk vs cost/investment profiles. Refer to Note * below.

    • Look closely at solutions that balance standardized core ERP processes with locally optimized front-office process automation, as per the following diagram (apologies for the old-style SAP logo, but this is not a new strategy; see also the prior Viessmann IBM SOA + SAP white paper I previously referenced).

      Figure – The “Pendulum” balance of core standardization vs local optimization

      There is a good case study / example of striking this balance at Carlsberg

  6. Enable a clear, flexible DevOps platform management strategy and API/microservices integration capability (inner/outer-ring ESB / API integration hubs). Refer to my prior blog on this topic.

Note *: We have observed that significant ERP re-engineering projects can become all-engulfing in focus and IT budget terms, and may take and cost two to three times the initial budget and time estimates.

Sources of information researched for this Blog item include, but are not limited to:

  • The road to AI leads through information architecture – Rob Thomas IBM GM Analytics
  • Gartner Paper (G00311163) “Schrodinger’s Cat: How ERP is both Dead and Alive”, published on the 28th of June 2016.

    Authors: Denise Ganley, Coral Hardcastle plus various related Gartner ERP platform and market related papers.

  • SAP S/4 HANA as the Digital ERP Core for organisations on a Digital Reinvention journey – Mark Dudgeon, IBM CTO SAP, IBM Global Services – January 5th 2018

    IBM offers an S/4 HANA Impact Assessment Service.

  • Forrester – Stretch Your Data Management Capabilities, by Brian Hopkins, Jan 10th 2018 (Continuous Improvement: The Data Management Playbook)
  • IBM’s Institute for Business Value – The Software Edge to drive competitive advantage
  • SAP S/4 HANA: From 2 Tier ERP to the N Tier Enterprise – Fall 2016, Joshua Greenbaum
  • Forrester – It’s time to clarify your Global ERP Strategy. Dec 2010, George Lawrie

I would also refer readers to an excellent IBM Institute for Business Value study that looks at innovation in the API economy, which can be found here.

Open-source- and standards-driven enablement and business process optimization

My prior Blogs, LinkedIn items on:

API / Microservices “Inner / Outer Ring” ESB / Messaging and Application Integration

Via IBM API Connect, Node-RED and IBM Watson IoT capabilities

Including the following related Dual Speed IT Summary approach

Figure – Dual Speed IT Strategy summary (v1, 17/07/17)

The opinions within this blog are the author’s; they do not represent a formal IBM corporate point of view. Copyrights are respected and/or sources acknowledged and referenced as and where applicable.



Developing flexible, business aligned IT innovation and capability delivery strategies

Following my prior LinkedIn item, “When does SAP S/4 HANA make sense for your business?”,

I received a question through a colleague regarding a large European client who was effectively defining a “Choice A over Choice B” forward IT strategy.


This prompted me to sit down and put some further thought into an IT executive view of a business-into-IT strategy model and approach that could result after an enterprise client makes Choice A.

A Business Domain / IT Transformation Viewpoint

Getting straight to the point: as a direct consequence I sketched out the following high-level architectural thinking and strategic, business-into-IT, building-block-based approach (having further researched various recent IBM C-Suite and IBM Institute for Business Value (IBV) studies), which places an effective API and Enterprise Service Bus “inner ring / outer ring” strategy as the key enabling capability.

Figure – An Open Business Domain : IT Transformation Viewpoint (v2, 18/07/17)
It was also interesting to read a recently published (31/01/17) summary of a study by the DSAG (German-Speaking SAP User Group), which discussed various S/4 HANA adoption rates amongst the surveyed SAP DACH clients.

The study also highlighted plans to significantly increase forward IT investment and focus on front-office “digital transformation and business/IT-enabled innovation” projects within the surveyed German- and Swiss-based companies (with companies similar to Panalpina leading in this area), with up to 60% and 70% respectively of funding being targeted at these critical “competitive advantage” or simply “stay in business” IT transformation and investment requirements, in response to disruptive adjacent-industry or new “asset-lite” digital competitors (Netflix vs Blockbuster retail outlets comes to mind).

It is fair to say that a number of industries, including global logistics companies, naturally already have complex, extended digital supply chains, facilitated by standards-based messaging, APIs and data exchange, with supporting back-office EAI (Enterprise Application Integration) platforms and web-enabled consumer and partner channels.

After creating this high-level model and approach, I naturally started to think further about the critical business-into-IT capabilities required to ensure success when implementing a business-aligned IT innovation strategy of this type.

When I net these out, they seem to distill into six or seven strategic IT capabilities and imperatives, in combination with the individual business’s appetite from a forward risk/reward and rate-of-IT-innovation and change-adoption point of view.

These included the business aligned development, governance and practical implementation of:

  1.  An effective API development, brand, portfolio management and delivery strategy
  2.  An effective data governance and management strategy to pool, integrate and actively manage data for trusted, timely and accurate business insight
  3. An existing, defined and working ESB (Enterprise Service Bus) application messaging platform or platform/s (I refer to this as an “inner ring / outer ring” ESB / EAI strategy)
  4. The appropriate and targeted selection of buy (& Integrate) vs build (Dev/Ops) and run application capabilities
  5. Aligned Business and IT Executive sponsorship, funding and organization / cultural factors
  6. Structured, thoughtful analysis to select the most appropriate IT capability building blocks
  7. A plan to explore, target and integrate cognitive computing, ML / AI capabilities

Crucial also is timing, as various new and emerging technologies typically traverse the Gartner “technology hype” curves (or Forrester Waves) at different speeds, with specific vendors, technologies and/or platforms emerging to become the “de facto” standard or dominant provider in a particular function or area.

Within this context it’s clear that some fundamental and basic “table stakes” still apply including:

  1. A real, demonstrable, sustained commitment to Open Source offerings and capabilities
  2. The selection and integration of at scale, viable “top right” quadrant platforms, products and/or technology partners
  3. Appropriate Business into IT funding to phase delivery – Start small, prove, then scale fast
  4. Understanding when to buy vs build – The Factory IT vs Competitive Advantage IT question
  5. Understanding that multi-source, highly cost-optimized, outsourced IT strategies are relatively unlikely to provide a firm foundation for the agile delivery of new business-into-IT capabilities, as business-driven “digital innovation requirements” become ever more critical

This in turn relates to the phased evolution-vs-revolution question, described and nicely summarized in the Geoffrey A. Moore technology adoption model below.

Often, within individual global enterprises, various business-into-IT delivery programmes will sit in different segments under the “crossing the chasm” curve.

In a prior Imperial College business innovation course, it was clear that the most successful and effective business innovation strategies and platforms (including Apple’s iPhone) were, in the majority of cases, actually combinations of proven, prior individual technology components and ecosystem building blocks assembled in new, innovative ways.

It is the innovative new combination of these proven capabilities and technology building blocks, often within new value-based networks, that creates the greatest and/or most disruptive business value; often it is not brand-new or immature technology.

It was also interesting to observe a recent joint Schaeffler Group / IBM Watson IoT / Manufacturing 4.0 partnership announcement and YouTube video that is grounded in a number of these principles.


In addition, a recent LinkedIn CIO / data management forum item nicely described effective data management and IoT strategies as the “King and Queen” partners of aligned IT innovation capabilities in the complex game of chess that is successfully implementing viable, long-term IT strategies.

If these are the King and Queen, I would also say that Hyperledger and blockchain represent the castles (rooks) in chess terms, enabling swift, directed movement combined with protection and security.

Additionally, my view is described in a short YouTube video about IBM’s Hyperledger blockchain pilot system within IBM Global Financing (IBM IGF processes $44 billion of transactions within a network of 4,000 partners, suppliers, shippers and banks).

The implemented open-source-based IBM Hyperledger solution provides an individual-client-ledger-neutral, secure, immutable, auditable digital asset/document and transaction supply chain, without seeking to force change on the participating partners’ back-office platforms, which is costly, risky and typically has extended cycle/lead times versus speed to value.
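The immutable, auditable ledger property described above can be illustrated with a minimal hash-chain sketch: each entry’s hash covers the previous entry’s hash, so tampering with any recorded transaction breaks the chain and is caught on audit. This is a conceptual illustration only, not IBM Hyperledger code, and the transaction fields are invented for the example.

```python
import hashlib
import json

def entry_hash(entry: dict) -> str:
    # Deterministic hash over the entry's content (sorted keys for stability).
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def append(ledger: list, transaction: dict) -> None:
    # Each new entry commits to the hash of the previous one, forming a chain.
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {"tx": transaction, "prev_hash": prev}
    entry["hash"] = entry_hash({"tx": transaction, "prev_hash": prev})
    ledger.append(entry)

def verify(ledger: list) -> bool:
    # Recompute every hash and link; any tampering breaks verification.
    prev = "0" * 64
    for e in ledger:
        if e["prev_hash"] != prev:
            return False
        if e["hash"] != entry_hash({"tx": e["tx"], "prev_hash": e["prev_hash"]}):
            return False
        prev = e["hash"]
    return True

ledger = []
append(ledger, {"shipment": "A123", "partner": "supplier-1", "value": 50_000})
append(ledger, {"shipment": "A124", "partner": "bank-2", "value": 75_000})
assert verify(ledger)

# Tampering with an earlier transaction is detected on audit:
ledger[0]["tx"]["value"] = 1
assert not verify(ledger)
```

Real permissioned blockchains add consensus and replication across parties, so no single participant can rewrite the chain; the hash-linking above is the core immutability idea.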

Approximately 10 years ago, a number of my IBM colleagues in our Consulting, Hosting and Global Services teams invested significant time and effort in a structured review of large-scale, complex IBM/client project deliveries, both successful (and a few unsuccessful), to help better inform future joint projects and joint IBM/client success.

The output of that study is as valid now, if not more so (in our Hybrid Cloud, Cognitive world): it logically identified and confirmed what we all know to be true, but which is unfortunately often ignored or lost in the heat of an early project life cycle.

In particular, these structured approaches become even more important as many large, medium and small enterprise clients seek to successfully deploy and manage relatively complex “Hybrid Cloud” scenarios.


The success of any significant IT initiative crucially depends upon the initial business-aligned requirements definition, and on closely defining and managing the interfaces and hand-offs between the different partners and functions described in a 10-box IT operating and innovation model.

The first and most crucial box is the initial terms-of-reference and requirements-definition box, prior to developing a 9-box “design, build and run” model in three layers:

  1. The business transformation requirements, value into the application delivery layer
  2. The business application, integration and data management layer
  3. The IT Infrastructure, platform and IT service delivery layer/s

The success of this model and approach is then defined by the success (or otherwise) of carefully defining and managing the interfaces between the 9 boxes in people, culture, technology, funding, capability, teamwork and strategic terms.


One of my client IT architect colleagues working in the retail and consumer products industry also recently highlighted that it has never been more important to manage these interfaces effectively, to avoid the unwelcome emergence of “IT-to-IT gaps” that inhibit successful delivery, in combination with the critical success factor of selecting and assembling proven building blocks in new, innovative ways that is at the heart of the most successful business-into-IT innovations.


The basic rule applies more often than not: assembling proven capabilities and building blocks (to use a Lego-like analogy) will typically yield more predictable and effective outcomes.


I hope this item is helpful in highlighting the requirements and prerequisites for successful “Choice A vs Choice B” business-into-IT innovation delivery.

IBM provides a combination of proven, scalable, virtualized building blocks for enterprise-scale SAP Hybrid or Private Cloud platform delivery, including DB2 v11.1 LUW, IBM POWER8, AIX, PowerVM, Linux, IBM System z with DB2 and/or LinuxONE, and IBM System i with DB2.

Disclaimer: the views expressed in this blog reflect 33 years of experience in enterprise IT and ERP/application platform delivery; they are my own and do not represent formal IBM views or strategies. Vendor trademarks are respected.






In-memory marketing hype vs reality – Hype Busting

In this section (section 5), let us briefly look at in-memory marketing hype vs reality.

We will see if these claims really stack up, and what alternatives exist for clients who are worried about the disruption, maturity, risks and commercial lock-in of the new SAP S/4 HANA, Suite on HANA (SoH) and/or SAP BW HANA platform strategy.

This section could also be called a degree of “hype busting”, as we need to clearly separate the excellent and pervasive marketing from the technical and deliverable solution reality.

Is SAP HANA your destination ?
For the more technically minded reading this item, we shall now drop into some relatively technical discussion of relational databases and systems design. I make no apologies for doing this, as it is important to help reset, or gently correct, a number of the relative benefits and themes normally associated with SAP HANA and/or S/4 HANA “Digital Core” presentations, including those at recent SAPPHIRE and/or SAP TechEd conferences.

Where are we now, in my view, with respect to SAP S/4 HANA adoption rates versus a Gartner-type hype curve:
Figure – Gartner Hype Curve


In this case I will use IBM’s DB2 SAP-optimized data platform as a point of reference. It is not that Oracle 12c SAP “AnyDB” platform choices do not share a number of similar capabilities (I would naturally say we do it better, more efficiently, etc.); it is just that it would be technically presumptuous of me to try to represent Oracle 12c’s in-memory cache capabilities without sitting down with Oracle to understand those capabilities, and their ongoing development roadmap vs SAP HANA, in greater detail for SAP NetWeaver and/or SAP BW 7.x workloads.

That also assumes SAP SE commercially want to best leverage and/or enable these AnyDB capabilities (or not); hence I will not attempt that comparison in this item.

 “In-Memory” Columnar Myth / Hype Busting – Number 1

Firstly, I know it sounds obvious, but all databases run in computer memory. We are really discussing whether the database is organized in a columnar relational form (ideal for analytical/OLAP, multi-row-select, read-oriented SQL workloads) or in a row relational form, typically used for demanding transactional (OLTP) workloads with higher volumes of single-row select, insert, update and/or delete statements and often row-based batch updates; let’s call these the more traditional read/write OLTP workloads.

Read/write ratios of 70/30, 80/20 or 90/10 are common, with higher write ratios typically observed for demanding OLTP, batch, planning (SCM) and/or MRP manufacturing workloads.

Indeed, the IBM DB2 10.5 BLU “in-memory” columnar capabilities are named after an IBM Research project called “Blink Ultra”, run at IBM’s US West Coast Almaden labs in 2007/8, which effectively observed that by converting prior relational rows to columns in memory, SQL query/reporting speed-ups of up to 80 times could be achieved for more demanding OLAP / analytical SQL queries.

A detailed research paper from Guy Lohman and his team at IBM Almaden, from 2007/8, can be found here if required.
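The row-store vs column-store trade-off above can be illustrated with a small, hypothetical Python sketch: an analytical query that aggregates a single column only needs to touch that column in a columnar layout, whereas the row layout drags every field of every record through memory. The table and field names are invented for the example; real column stores add compression and vectorized scans on top of this layout difference.

```python
import random

random.seed(42)
N = 100_000

# Row store: each record keeps all its fields together.
# Good for OLTP: fetch or update a whole order in one place.
rows = [{"order_id": i, "customer": i % 500, "amount": random.uniform(1, 100)}
        for i in range(N)]

# Column store: each field is a contiguous array.
# Good for OLAP: scan just the "amount" column and skip everything else.
columns = {
    "order_id": [r["order_id"] for r in rows],
    "customer": [r["customer"] for r in rows],
    "amount":   [r["amount"] for r in rows],
}

# The same analytical query ("total revenue") against both layouts:
total_row_store = sum(r["amount"] for r in rows)  # touches every field
total_col_store = sum(columns["amount"])          # touches one column only

assert abs(total_row_store - total_col_store) < 1e-6
```

Both layouts return the same answer; the columnar scan simply reads a fraction of the data, which is the essence of the OLAP speed-ups quoted above.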

It is also true that with DB2 LUW (and/or DB2 on z/OS), IBM has spent many years optimizing the use of relatively moderate amounts of DB2 database cache (DB2 buffer pools) and system memory to provide optimal throughput with justifiable levels of platform memory investment, whilst persisting data to disk / SAN storage and sustaining ACID database transactional consistency.

Hence the idea that any one vendor has a unique technology in this area is largely marketing hype from my point of view; for sure, a particular vendor has marketed this capability very effectively, whilst IBM has been less effective with the marketing and likely more effective with an evolutionary, non-disruptive deliverable.

For examples of this DB2 + SAP BW deliverable, refer to a couple of summary YouTube videos from Yazaki (a large, privately owned Japanese manufacturer of custom automotive wiring looms) and Knorr-Bremse (a large manufacturer of advanced braking systems for trains, etc.):

Yazaki and Knorr Bremse – SAP BW plus DB2 10.5 BLU videos

In-Memory “Commodity Computing, Multi Core is cheap” Myth / Hype Busting – Number 2

DB2 10.5 LUW (Linux, Unix, Windows) has been optimized to take advantage of more recent multi-core processor architectures, including both Intel Xeon and POWER (AIX, Linux, IBM i) based architectures, whilst offering a choice of operating system support with ongoing SAP ERP / SAP NetWeaver 7.40 and 7.50 certification, optimization and support through to 2025.

If, for example, we consider the proven and mature Simultaneous Multi-Threading (SMT) capabilities of the IBM PowerVM hypervisor with either AIX/Unix and/or Linux: these capabilities have been extended over time to provide options to switch between one, two, four or eight threads per core, to best match the application workload instruction flows that are then assigned and executed on multiple CPU cores (up to 12 per socket).

This helps to both increase application throughput and increase IT asset utilization levels.

Indeed, in recent IBM Boeblingen lab tests with DB2 and BLU, we tested the relative benefits of SMT 1, 2, 4 and 8 for a SAP BW 7.3 and/or 7.4 analytical workload. It was clear during these tests that, for this particular workload, SMT 4 provided the optimal balance of throughput and server/IT asset utilization (CPU capacity, cycle and thread utilization), whilst avoiding the excessive time-slice-based hypervisor thread switching that can significantly hamper the throughput of alternative, less efficient hypervisors serving the Intel/Linux or “WINTEL” market.

Typically with IBM POWER, for both DB2 10.5/v11.1 and/or HANA on POWER, we observe roughly 1.6 to 1.8 times greater throughput per POWER8 core (vs alternative Intel processors), supported in balanced-systems-design terms by roughly 4 times the memory and/or I/O throughput compared to alternative processor architectures.

For example, if you have a demanding SAP IS-Utilities daily, monthly or quarterly billing batch run for tens of thousands of utility customers on SAP ERP 6.0 / SAP NetWeaver, the combination of DB2 10.5 and POWER8 with AIX 7.1 (and/or Linux) is really very hard to beat in batch throughput, availability, reliability and delivered IT SLA / data centre efficiency terms.
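As a hedged, back-of-envelope illustration of what the per-core throughput ratio above implies for sizing: all the workload numbers below are invented for the example, not benchmark results, and only the 1.6 to 1.8 ratio is taken from the text.

```python
import math

required_saps = 120_000       # hypothetical total SAP workload (SAPS units)
intel_saps_per_core = 2_000   # hypothetical per-core baseline throughput
power_ratio = 1.7             # midpoint of the 1.6-1.8x range quoted above

intel_cores = math.ceil(required_saps / intel_saps_per_core)
power_cores = math.ceil(required_saps / (intel_saps_per_core * power_ratio))

print(f"Intel cores needed:  {intel_cores}")   # 60
print(f"POWER8 cores needed: {power_cores}")   # 36
```

The point of the arithmetic is simply that a higher per-core ratio compounds into materially smaller core counts, which matters for per-core software licensing as well as hardware sizing.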

In parallel, considerable ongoing DB2 development lab effort has resulted in DB2 10.5 SAP platform solutions that also fully leverage modern “commodity” Intel-based multi-core CPU architectures; hence this is not a SAP HANA RDBMS-unique capability by any means.

During mixed SAP or other ISV application workload testing, it is true to say that some ERP/ISV applications exploit multi-threaded CPU architectures and modern OS hypervisors better than others.

This remains as true for the various SAP Business Suite / SAP NetWeaver (or indeed S/4 HANA) workloads as for other ISV workloads, where multi-threaded application re-engineering and optimization typically takes many months and/or many person-years of effort. Indeed, at one SAPPHIRE (2014), Hasso Plattner (co-founder and chair of the SAP supervisory board) reflected on the significant and ongoing effort to re-optimize many millions of lines of ABAP code in the existing SAP NetWeaver core platform for S/4 HANA, in addition to the subsequent CDS “push down” initiatives briefly mentioned before.

Also, as previously mentioned in my prior “Walldorf to West Coast” blog, I am rather reserved about the later upgrade complexity and costs I previously observed in a Retek / Oracle Retail scenario, which pushed retail merchandising replenishment (RMS) functionality down from the client’s specific application configuration, through the Oracle application tier, into the Oracle 10g RDBMS tier, leveraging PL/SQL stored procedures.

For sure, this helped to speed up key replenishment batch runs vs prior IBM DB2- or IMS-based mainframe platforms, but with the later penalty that the overall Retek RMS or WMS solution stack became very tightly coupled and interdependent in version terms.

It also essentially limited (like SAP HANA) the Oracle Retail / Retek platform RDBMS choice to one only, where later application version upgrades were really very significant “re-implementations”; conversely, the prior segregation and separation of application and RDBMS duties in SAP IS-Retail / NetWeaver helped to reduce or mitigate this issue.

Hence, in this case, the structured development and enablement of SAP’s Core Data Services (CDS) interface between the application and deeper database functionality becomes vital for SAP clients.

It is also true to say that the functional depth and breadth of capabilities being built into SAP HANA is very impressive; however, this does mean a high rate of change, patching and version upgrades, which in turn need to be aligned to Vora / Hadoop platform versions.

In an Intel environment, DB2 10.5 and/or 11.1 LUW also naturally leverages Intel / Linux and/or Windows "Hyper-Threading" (typically two threads per physical processor core).

In my view the myth here is that Intel multi-core architectures are per se inherently cheaper than alternative mature Type 1 (or Type 2) hypervisor implementations on IBM POWER or IBM System z (refer to this item for a summary of the difference between Type 1 and Type 2 hypervisors).

For example, within IBM we internally consolidated many thousands of prior distributed Unix / AIX / Linux systems and applications onto a limited number of large IBM System z servers running Linux with a highly efficient and mature Type 1 hypervisor; this was in fact significantly cheaper, and considerably more efficient in Green IT and DC PUE terms, than alternative distributed computing options.

I'm not saying here that Intel / VMware ESX or Linux based hypervisor solutions don't also provide considerable IT efficiency and platform virtualization opportunities; they do. It's just that I rarely favour a "one size fits all" binary IT platform strategy.

In my experience a single platform strategy rarely works for the largest global enterprises (it's likely rather different for small and medium sized enterprises).

Typically, implementing a "one size fits all" strategy forces rather uncomfortable compromises for very large enterprise scale clients, who often naturally both virtualize and tier out their server and storage platforms (increasingly also in hybrid cloud deployment patterns) to match the requirements of different workloads, business driven IT SLAs, and real life practical delivered TCA / TCO and cost / benefit outcomes.

For sure it's relatively easy to compare an older or partially virtualized Unix / Oracle environment with a fully virtualized x86 Intel VMware / Linux scenario (or Intel / Linux cloud) and demonstrate TCO / TCA savings; however, these often tend towards being rather misleading "apples & pears" comparisons. Comparing one RDBMS platform under load against another, on the same platform and operating system for the same set of OLAP or OLTP workloads, is a much more balanced comparison.

The intense IBM focus is really on the most efficient use of the available system resources (cores, memory and IO), in combination with increased IT agility and responsiveness, to help optimise enterprise data centre efficiency (some call this Green IT) whilst minimizing the required input power (often measured in megawatts) for larger DCs, as measured by the data centre efficiency ratio (PUE).

In this area, with the consumption of many GB or TB of RAM and/or many thousands of cores (for large enterprise SAP landscape deployments), the SAP HANA architecture can be very costly indeed in DC efficiency terms, in particular given the limitations currently associated with the virtualization of on-premise HANA production environments.

In-Memory “Data Compression Rate and TCO Savings” Myth Busting – Number 3

With DB2 10.5 "Adaptive and Actionable" compression we often observe and sustain 75-85% rates of DB2 data compression (call it a 5:1 compression ratio).

In particular, with DB2 BLU columnar conversion of targeted SAP BW tables we leverage advanced Huffman encoding, in addition to significantly reducing the requirement for aggregates and indexes, resulting in compression rates of 85-90% or more (vs prior uncompressed baselines), depending on the specific nature of the client's SAP BW 7.x tables.

For SAP BW with HANA, ratios of 3.5:1 to 4:1 vs uncompressed may typically be observed (depending on client data).

Hence in these scenarios, clients implementing SAP HANA columnar strategies will actually likely observe a reduction in compression rates if they are already using either DB2 10.5 Adaptive and/or DB2 10.5 BLU Actionable compression with SAP BW 7.x.

This is in addition to "doubling up" the required memory for SAP HANA working space, whilst sizing combined SSD / HDD (solid state or hard disk drive) storage at FIVE times the compressed data for HANA database persistence (x4) and HANA logs (x1).

In these scenarios the client will actually observe a significant net increase in SSD / HDD or SAP HANA TDI SAN based storage capacity, not a reduction as often claimed in SAP HANA marketing presentations and brochures, in particular when these differences are then multiplied up over the multiple SAP environments of a real life SAP landscape (Dev, QAS, Production, Dual Site DR, Pre-Production, Training etc, operating in either a dual or single track landscape on the path to production from Sandpit / Development through QAS, Pre-Production and Regression to Production).
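To make the storage and memory arithmetic above concrete, here is a back-of-envelope sketch. The 3.5:1 HANA and 5:1 DB2 compression ratios, the doubled HANA working memory and the 5x persistence/log storage factor follow the figures quoted above; the 10 TB example and the assumption that a DB2 BLU working set of roughly the compressed size is RAM resident are illustrative only, and all ratios are client-data dependent.

```python
# Back-of-envelope sizing sketch using the ratios quoted above.
# All figures are illustrative assumptions, not vendor sizing guidance.

def hana_footprint(raw_tb, compression_ratio=3.5):
    """HANA sizing: memory ~2x compressed data (working space doubled),
    persistent storage ~5x compressed data (4x persistence + 1x logs)."""
    compressed = raw_tb / compression_ratio
    return {"memory_tb": 2 * compressed, "storage_tb": 5 * compressed}

def db2_blu_footprint(raw_tb, compression_ratio=5.0):
    """DB2 BLU sizing: data stays on disk at the compressed size; assume
    (illustratively) a working set of ~1x compressed data held in RAM."""
    compressed = raw_tb / compression_ratio
    return {"memory_tb": compressed, "storage_tb": compressed}

raw = 10.0  # TB of uncompressed SAP BW table data (example figure)
print(hana_footprint(raw))     # roughly 5.7 TB RAM, 14.3 TB storage
print(db2_blu_footprint(raw))  # roughly 2.0 TB RAM, 2.0 TB storage
```

Multiplied over a full landscape (Dev, QAS, Production, DR etc.), the gap between the two storage figures is what drives the net capacity increase described above.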

Naturally, for both SAP HANA and DB2 10.5 BLU with SAP BW, clients can complete housekeeping on older data and/or BW NLS archiving: in the case of DB2 BLU using a common BLU NLS archiving capability; for SAP HANA + BW it's currently Sybase IQ based BW NLS archiving, potentially Hadoop / Vora in the future.

This diagram is often presented at SAP Sapphire and/or TechEd conferences to summarize the target, potential SAP HANA storage savings with Suite on HANA for SAP’s own ERP deployment.

HANA Storage Reduction Screen Shot

However, what is rarely mentioned is that ~3.x TB of older table data from a prior Oracle to DB2 (over HP-UX Superdome) SAP ERP migration was removed ("cleaned up") in housekeeping terms in advance of the ERP on HANA migration. This gives a very different picture of the realized data compression rates vs the prior partially compressed, older DB2 environment; however, that would really spoil a good SAP HANA marketing chart!

Naturally, the significant additional new hardware capacity investment required is of considerable commercial interest to the multitude of Intel / SAP HANA platform providers (either HANA appliance and/or consolidated TDI SAN based).

For example, I created this simple chart to reflect the relative SAP DB2 BLU and SAP HANA memory sizing ranges, understanding that for both technologies sizing is also dependent on the individual client's table data.

Relative SAP HANA vs BLU memory sizing

In my experience this seems to have turned into a bit of a server and storage hardware vendor feeding frenzy, with multiple h/w vendors rushing to endorse the whole SAP S/4 HANA adoption story for obvious reasons, whilst largely ignoring prior existing, proven, incremental SAP platform solutions!

In-Memory “Future Optimization, SAP Roadmap” Myth Busting – Number 4

In many SAP enterprise client engagements I receive the following comment: "but we have been advised that we will miss out on future SAP application optimizations if we don't migrate to a SAP HANA RDBMS and/or S/4 HANA Digital Core sooner rather than later".

These comments are often made irrespective of the actual, real life S/4 HANA adoption rates, which remain a very small fraction (around 1%) of the installed SAP Business Suite / NetWeaver base; such is the largely sales incentive driven pressure on SAP sales and technical sales teams.

At best this is only partially true, as SAP continues to enable and develop a "Core Data Services" (CDS) RDBMS abstraction layer that creates a logical structure for the push down and optimisation of SAP HANA "re-optimized" application code to the RDBMS database tier.

Consequently and logically, IBM with DB2 (and indeed Oracle with 12c) continues to develop, optimize and align DB2 capabilities to SAP NetWeaver CDS functionality, which incidentally is supported and certified with SAP NetWeaver 7.40 and 7.50 with DB2 10.5 (and above) through to 2025.

Additionally, for IBM Financial Services clients, CDS has typically been deployed in conjunction with Fiori transactional applications to significantly improve SAP usability whilst protecting the client's investment in IBM System z and/or DB2 over z/OS, with Linux or AIX SAP application server capacity.

In practical terms this means that ongoing SAP HANA based ABAP code re-engineering and optimization efforts (there are many millions of lines of single stack ABAP and/or prior dual stack ABAP / Java code) are aligned via CDS to optimized RDBMS alternatives like DB2 and/or Oracle 12c in the near and mid term IT investment and planning horizon.

At Sapphire NOW 2016 I picked up a number of initial comments that the "Suite on HANA" SAP HANA Compatibility Views would only be developed and sustained for a finite period (until 2020), giving clients a more limited time to migrate to the new, simplified SAP S/4 HANA Enterprise Management code streams and table structures (the new Universal Ledger in Simple Finance, as an example).

From a personal point of view, deploying an existing, deeply customized regional or global SAP NetWeaver / ECC application template, one that has been "read / write" optimized for existing RDBMS platforms over many years, on top of HANA (SoH) is likely an application and RDBMS platform mismatch.

It's likely more logical to implement a new, simplified S/4 HANA Digital Core "read optimised" application template over a HANA columnar RDBMS platform. This assumes the required application functionality is available and that the business is willing to remove or remediate prior customizations to align to a forward SAP S/4 HANA digital core roll out and transition strategy.

However, it is also becoming clear that in addition to the prior SAP Business Suite / NetWeaver code line (and the various PAM defined OS/DB supported combinations), the SAP HANA initiatives have created at least four different SAP S/4 HANA "simplified" code lines or releases, including:

  1. Simplified S/4 HANA solutions hosted on the HANA Enterprise Cloud
  2. The prior S/4 HANA Simple Finance (sFin v1) code, maintenance and release line
  3. S/4 HANA Enterprise Management and Simple Finance v2 “On Premise” code & release line
  4. The S/4 HANA Enterprise and Simple Finance “On Premise” code line but HEC hosted

The clear risk for both SAP SE and SAP enterprise clients is that there is simply a switch from developing, managing, testing and releasing multiple "AnyDB" OS/DB choices over a SAP Business Suite / SAP NetWeaver code stream to managing, aligning and releasing multiple S/4 HANA editions and code lines (on or off premise). This is just a different set of complexities to manage, but with a new restriction of prior client "AnyDB" choices; this is not, in my view, SAP HANA "simplification".

S/4 HANA Simplification?

In-Memory “Commodity / Cloud Based TCO Reduction” Hype Busting – Number 5

In our industry we are observing the convergence of multiple significant structural changes, where previously we would typically deal, relatively speaking, with a single significant structural change every 3-5 years (desktop computing, client / server, distributed, the emergence of Eclipse, Java, Linux open source etc).

Today we have to manage and prioritize limited IT investment resources over multiple concurrent significant structural changes (mobile devices, IoT, public / hybrid cloud, Big Data, significant cyber security threats). Some of us older folks with many years in IT (and a few grey hairs) might suggest some of these themes are a little "over hyped" in IT industry fashion terms; hence we tend to take a cautious view, asking the harder "but, so what?" questions to help sort out material delivered benefits, ROI and progress from the considerable IT industry hype (it is a bit of a fashion industry also!).

In my view it's perfectly possible to architect, build and deploy an "at scale", fully virtualized SAP private cloud that is every bit as efficient (if not more so in data centre efficiency / PUE terms) as a hybrid public / private cloud based on AWS (Amazon Web Services) and/or MS Azure platforms built on commodity Intel ODM (Original Design Manufacturer) 2 or 4 socket servers.

Indeed, the author was directly involved in and responsible for the successful deployment of a fully virtualised IBM DB2 SAP private cloud supporting ~8 million SAPS and 600+ strategic SAP environments, with ~12 petabytes of fully virtualised, tiered SAP storage capacity spread over dual global data centres, and WAN acceleration to support prior SAP GUI, SAP Portal and/or Citrix enabled SAP clients, leveraging DB2 and PowerVM / AIX. In practical terms it remains a highly efficient, flexible and scalable SAP platform in support of a 50+ bn euro (~$75 bn annual turnover) consumer products business.

In this case, as briefly mentioned in a prior blog section, we completed detailed modelling of a SAP HANA appliance based deployment over 4 regions and 4 at-scale workloads / SAP landscapes (ECC, APO/SCM, BW, SAP CRM), with dedicated production appliance capacity, VMware ESX / Intel virtualized capacity for smaller non production SAP HANA instances, and a shared, common TDI based storage strategy. This carried a DC TCA (Total Cost of Acquisition) premium over the existing virtualised, tiered IBM DB2 SAP and IBM POWER deployment strategy of between 1.5 and 1.6 times.

On one SAP HANA video a 10x landscape capacity reduction was indicated; however, this really did not correlate in any way with the actual worked example mentioned above.

For sure, I would not debate the agility, flexibility and initial responsiveness (assuming the required VPN links, security and data encryption needs are met) of AWS, MS Azure and/or indeed IBM's own SoftLayer cloud offerings for rapid provisioning of DevOps enabled front office, Big Data and/or next generation mobile enabled application workloads, including S/4 HANA or indeed SAP NetWeaver with DB2 10.5 and/or CDS, which is also available on MS Azure, AWS and/or IBM's SoftLayer / CMS4SAP platforms.

The crucial factor here is a proper baseline and measurement of the "before & after" environments, and avoiding the considerable temptation to compare different "apples & pears" generations of SAP platforms, which rather mixes up the whole TCO analysis and results equation.

I consistently observe cloud TCO comparisons of prior "legacy", partially virtualized, older generations of Unix / RDBMS systems with fully virtualised Intel x86 cloud environments; these types of old vs new comparisons can be rather misleading and should, in my view, be taken with a large and rather cynical pinch of salt.

Any TCA / TCO comparisons should really use "like generation" CPU / virtualization platforms and virtualised, tiered storage, combined with current generation RDBMS platform choices. For example, comparing an older version of Oracle (or indeed DB2) over a prior Unix platform generation with a fully virtual x86 cloud running an initial development SAP HANA + SAP BW scenario (including any risks of noisy neighbours, unless dedicated capacity is deployed) can be very misleading, whilst potentially creating impressive but equally misleading headlines during cloud vendor marketing events and presentations.

In-Memory “IT Agility, Sizing, Solution Responsiveness” Hype Busting – Number 6

After many years of SAP and/or ERP platform sizing experience, we all understand that sizing complex SAP systems landscapes is partly science: user input on expected user and transaction volumes, data volumes and expected user and data growth rates, expected roll out rates and planning horizons, workload scalability testing, client specific PoCs etc.

This is then combined with detailed prior experience and judgement on the likely system sizing variation and future growth rates after SAP application configuration and customization, along with catering for the typically changing business requirements and/or fluid ERP / SAP roll out schedules by country or region, over different SAP ERP and/or related non SAP systems alignment and integration requirements.

In this context it really nets out to one of two sizing strategies, in particular if SAP HANA appliance vs TDI strategies are being considered:

  1. The "appliance based" model: define the target environment and future growth horizon, then add a safety margin for errors and unexpected changes in inbound demand (an increasingly frequent issue). You then deploy the targeted 2, 4, 8 or more socket / server appliance building blocks with the appropriate data compression rates and GB / TB of RAM sizing methods.
  2. An "on demand" model (in IBM we call it Capacity Upgrade on Demand, "CUoD"): size a scalable platform with active live and/or "dark" CUoD capacity that is then activated on demand when the actual workload requirement is known, vs the initial SAP ERP sizing estimates.
  3. On top of these two models you then consider the realistic IT / ERP platform technology and capacity refresh cycle vs expected roll out schedules, workload and data growth rates, to ensure you don't break the target capacity building blocks for peak vs average demand over a typical 3-5 year IT asset write down cycle.
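The two sizing models above can be sketched numerically. This is a hypothetical illustration rather than a vendor sizing method: the 15% growth rate, 20% safety margin and SAPS figures are invented for the example.

```python
# Illustrative sketch of the two sizing models described above.
# All figures are hypothetical assumptions, not vendor sizing guidance.

def appliance_sizing(initial_saps, annual_growth, horizon_years, safety_margin=0.2):
    """Model 1 (appliance): buy capacity up front for the whole growth
    horizon, plus a safety margin for sizing errors / unexpected demand."""
    peak = initial_saps * (1 + annual_growth) ** horizon_years
    return peak * (1 + safety_margin)

def cuod_sizing(initial_saps, annual_growth, year):
    """Model 2 (CUoD): start with a scalable platform and activate
    "dark" capacity year by year as the real workload becomes known."""
    return initial_saps * (1 + annual_growth) ** year

# e.g. 100,000 SAPS growing 15% p.a. over a 5-year write-down cycle:
print(round(appliance_sizing(100_000, 0.15, 5)))  # capacity bought on day one
print(round(cuod_sizing(100_000, 0.15, 1)))       # capacity active in year one
```

The gap between the two printed figures is exactly the up-front commitment the appliance model asks for, and what makes an initial sizing error expensive.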

These rules mostly apply irrespective of the SAP solution cloud deployment model (hybrid, public, private) selected to match the various development and roll out phases; remembering a chart I defined back in Feb 2005 (below) to describe this typical enterprise SAP ERP workload and roll out cycle, just to prove some things don't really change as much as we might imagine!

Dynamic Infrastructure Sizing

One of my very experienced SAP platform solution architect and sizing colleagues said he felt that sizing SAP HANA appliance based landscapes (vs fully virtualized System p + DB2) was a bit of a "back to the future" experience in SAP / IT platform sizing, server capacity and life cycle / refresh terms.

For example, there are significant issues and penalties in capacity, disruption and building block upgrade terms if the initial SAP HANA sizing is incorrect, in addition to the typical 24-36 month refresh frequency on commodity Intel x86 platforms.

This means that selecting the wrong sized SAP HANA appliance typically leads to later, rather uncomfortable conversations at the CIO, CTO and/or CFO level when these need to be refreshed, often in advance of typical 4-5+ year enterprise IT asset write down cycles and system of record technology refresh terms.

In my view, it's very important for these technology refresh cycles to be factored into any SAP platform TCO / TCA analysis. In one prior large retail scenario we used 3-4 years for Intel / Linux, 4-6 years for POWER / DB2 and 6-8 years for mainframe System z DB2 (with either Intel Linux or POWER AIX application server capacity), which aligned to the client's scenario and two of their 5 year fiscal write down / budgeting processes.
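One simple way to factor refresh cycles into such an analysis is to count how many hardware refreshes each platform tier needs over the planning horizon. The cycle lengths below take the short end of the example ranges above; the 10-year horizon is illustrative.

```python
# Counting hardware refreshes over a planning horizon. The initial
# purchase is not counted as a refresh; cycle lengths are illustrative.

def refresh_count(horizon_years, refresh_cycle_years):
    # Ceiling division gives the number of cycles needed to cover the
    # horizon; subtract one because the first cycle is the initial buy.
    cycles = -(-horizon_years // refresh_cycle_years)
    return max(0, cycles - 1)

# Short end of the example refresh-cycle ranges quoted above:
for platform, cycle in [("Intel/Linux", 3), ("POWER/DB2", 4), ("System z", 6)]:
    print(platform, refresh_count(10, cycle))
```

Each extra refresh carries not just hardware cost but migration, testing and outage effort, which is why the refresh cadence belongs in the TCO model alongside acquisition price.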

If you end up repeatedly refreshing "commodity" technology, or with a proliferation of different appliance based solutions (with large volumes of cores in the data centre to install, manage, power, cool and maintain, at typical DC power to cooling ratios of 1.5-1.7 times), this can quickly become a rather costly and inflexible SAP platform strategy.

Personally I prefer to deploy a proven, scalable, flexible virtual platform upfront and then scale as required through Capacity Upgrade on Demand (CUoD) options. This helps to effectively manage business driven changes in requirements, unexpected mergers / acquisitions etc.

However, if you have an existing workload that is stable, with clear growth rates, and can deploy it over an appropriate appliance building block after a detailed PoC to help with sizing, this can also work. It's then really all about unexpected workload growth, which is often driven by mergers, later acquisitions, disposals and/or business driven SAP platform consolidation activity.

Indeed, only last weekend I was reading about the continued significant rates of merger, acquisition and consolidation activity ongoing in the FMCG / consumer products industry.

In these scenarios, suddenly finding your core SAP ERP "system of record" platform needs to scale by a factor of 3 or 4 times (vs 1.5-2 times) is actually not that uncommon, as the back office functions of two substantive businesses need to be merged into a single SAP instance / template and platform to realize prior or committed merger / acquisition savings and economies of scale.

It's for sure a case of buyer beware: the age old golden rule of making sure your target ERP platform has at least 2x capacity headroom has never been more true, and if you "tight size" it, it will for sure hurt later. Please refer to the following "SAP HANA – 7 Tips and resources for Cost Optimizing SAP Infrastructure" blog.

For sure, cloud / IaaS based models can help with initial project agility and responsiveness, and even to help size "model" configured environments, but per se it's still important not to simply assume a "cloud / commodity" model is always cheaper than an effectively designed and deployed, virtualised private cloud or hosted private / hybrid cloud model, in particular if you are implementing at scale over a 4-5+ year write down cycle vs 12-36 months.

Disclaimer – This blog represents the author's own views vs a formal IBM point of view

The views expressed in this blog are the author's and do not represent a formal IBM point of view.

They do represent an aggregate of many years (20+) of successful ERP / SAP platform deployment and IT strategy development experience, supplemented with many hours of reading respective DB2 and/or SAP HANA roadmap materials and presentations at various user conferences and user groups, in addition to carefully reading input from a range of respected industry / database analyst sources (which are quoted).

SAP NetWeaver Core + Best of Breed / SAAS Strategic IT Alternative Investment Choices?

In this case the focus is on speed to value and business into IT driven competitive advantage, bringing us full circle back to Choice A vs Choice B again.

This leaves Enterprise IT decision makers and Enterprise IT architects facing the following choice:

SAP Digital Core Propensity v2
Whilst this diagram looks complex and multi-dimensional in terms of its various axes and considerations, it's really relatively simple.

The enterprise client maps their planned "as is" and "to be" positions onto the "two cheeses": the evolutionary, hybrid green area, or the more revolutionary and potentially more disruptive, all SAP S/4 HANA Digital Core orange area.

Essentially the client has to decide where and when they map into these choices, from an "as is" today and "to be" future perspective.

Another, similar way of looking at the choice now facing enterprise IBM SAP clients is as follows; this is essentially another view of the initial Choice A vs Choice B diagram:

Simplified Choice ?

Basically, SAP NetWeaver enterprise ERP clients are now being asked to make a rather complex and difficult choice about investing their typically limited IT resources in a SAP S/4 HANA "read optimised" back office custom template remediation, simplification and transformation, typically combined with the integration of SAP centric front office / SaaS solutions (SuccessFactors, Ariba, Hybris, Concur, Fieldglass) via SAP HANA Cloud Integration (HCI) and/or the HANA Cloud Platform (HCP).

For some SAP / IBM enterprise clients this is a logical and good choice; in essence their IT strategy is then SAP S/4 HANA Digital Core and extended, "all Orange" aligned (in effect as summarised by the CIO of Nestlé at Sapphire Now, Orlando 2016 in the day 2 keynote with Rob Enslin, although he did comment that they were still having to pressure SAP SE into developing better integration of the S/4 HANA and new HEC / HCP hosted portfolio solutions).

Are you going for an "all Orange" SAP S/4 Digital Core or a SAP NetWeaver core + best of breed?

However, in conversations with various CIOs, CTOs and/or chief / enterprise SAP architects, I've noticed that some enterprise clients, often with constrained IT budgets, actually prefer to leverage their existing SAP NetWeaver ERP 6.0 template and prior significant regional or global roll out investments, whilst integrating "best of breed" front office hybrid cloud / SaaS based solutions via SOA based standards and API enabled integration buses, appliances and/or cloud based API integration services or vendors, to increase the delivered IT speed to value.
In these cases I've observed a noticeable switch from prior "SAP first" back office IT investment priorities towards leveraging the existing SAP ERP NetWeaver core (to realise the ROI from prior significant SAP roll out investments) plus API integrated best of breed / SaaS cloud based alternatives, to deliver the speed to value increasingly demanded by "digitally aware" line of business (LoB) users, who typically have rather limited interest or time for large scale, complex back office "systems of record" transformation project IT investments.

Whilst facing these choices, existing SAP clients may choose to read the SAP Nation 2.0 book published by Vinnie Mirchandani, author of The New Polymath.

On page XII of the first edition he describes, with further segmentation and detail, the various choices SAP clients are now making: the un-adopters, diversifiers, pragmatists and the committed.

The views and choices expressed in my summary view in Section 1 really describe a combined diversifier / pragmatist position as Choice A and the committed "all Orange" position as Choice B.

Here I'm assuming that practically reversing out of often significant prior SAP ERP / Business Suite investments (in sunk ERP platform investment terms) is as painful as going fully committed "all Orange" with Choice B, in terms of the loss of future commercial leverage and the risk of IT vendor lock-in vs faster open source technology innovation.

This prioritisation of strategic IT investments aligns with recent commentary from Philip Howard at Bloor Research, who effectively summarised the relative CEO IT investment priorities from the 18th annual PwC CEO survey as follows (there are similar surveys from Gartner and from IBM's Institute for Business Value CxO / CIO studies):

Strategically Important Technologies - Bloor and PwC

Additionally, the following IoT / API hybrid cloud architectures are now emerging to integrate prior systems of record (SoR) platforms into API / IoT open platform enabled hybrid cloud architectures, with Docker containers rapidly and strongly emerging as an open source container based technology to practically enable these architectures from an IT platform deployment and management perspective.

The trends driving the integration market and the aligned IT architecture strategies are summarized below:

Trends driving the Integration Market

With an example of an architectural approach to addressing these trends as follows:

An architecture for Digital Business

Which aligns to trend number three as follows:

Integration Trend 3 Digital Transformation

In effect, in this latter case the enterprise clients are really betting on the higher rate of innovation that is typically observed over time in an open source environment and an API enabled SaaS / hybrid cloud scenario and/or community.



Will Open Source Enabled Big Data, IoT / API Enabled Innovation prevail – YES or No?

In this section (4) we consider the question:

– Will Open Source Enabled Big Data, IoT / API Enabled Innovation prevail – YES or No?

It is also clear that unless IT functions embrace and lead in an API / IoT enabled economy, we will continue to see the development of "shadow IT" capabilities that are closely aligned to, and embedded within, the individual lines of business (sales, marketing, supply chain, manufacturing, distribution, multi-channel, partner enablement).

Indeed, I believe we will continue to observe a switch from Business to Consumer (B2C) towards Business to Individual (B2I) insight based targeted enablement, based on location, weather, preference and event insights (following IBM's acquisition of The Weather Company), in addition to prior IBM alliances with Twitter, Apple and/or, more recently, Cisco in the IoT / edge and data analytics area.

Indeed, IBM already delivers solutions in this area with our Metro Pulse solution for consumer products industry clients, where multiple sources of unstructured or semi structured Big Data ("SQL schema after") and/or little data ("SQL schema before") are seamlessly combined with location, weather, preference, local event, historical POS and promotional data to increase sales and product availability in "metro" city locations like London, New York or Singapore.
Also, a high rate of innovation (and change) is currently being observed in the Big Data platforms and analytics solutions area, where it seems the majority of the enterprise IT architects and clients I've spoken to are firmly committed to open source aligned Big Data solutions and platform choices. This then naturally raises the following question:

Will Open Source aligned Big Data solutions eventually prevail?

In my view the answer is a 100% yes, although I also believe that a balance between open source driven innovation and large enterprise scale IT non functional requirements is required, as summarized below:

Open Innovation

Indeed, in the area of proprietary vs open systems (at one time defined as Unix based client / server systems vs the IBM mainframe), IBM previously tried a relatively closed and proprietary approach when the IT market was rapidly transitioning towards Unix or "open" distributed client / server platforms in the early 1990s.
IBM consequently suffered a near death experience in business terms, as prior continued mainframe MIPS platform capacity growth rapidly switched towards these alternative distributed / client server platforms. Indeed, SAP delivered SAP R/3 (vs the prior mainframe based R/2 with DB2) to align with this "open systems" choice and market trend.

Although it is also true to say that in more recent times IBM mainframe MIPS capacity growth (combined with open platform mainframe Linux enablement) continues apace, often for mission critical system of record / big batch scenarios.

Something rather similar happened in the PC market, where IBM developed the technically superior but incompatible IBM PS/2 MCA (Micro Channel Architecture) as a follow on to the original IBM PC and PC AT IO adapter architecture.

Just as we technically turned right, the rest of the market turned left with an ISA (Industry Standard Architecture) PC input / output (IO) adapter and bus strategy. The rest is history, as IBM's PC Company went from a largely dominant "IBM PC" market share to a significantly smaller share over time. Is the same thing happening now in the core ERP / systems of record market?

As a direct result, IBM's subsequent commitment and contribution to open source driven projects and innovation has been second to none amongst the major IT vendors; in summary, IBM essentially learnt a very hard business into IT lesson in IT innovation and industry change terms.

This commitment includes significant investments and technical alignment to the following:

  • The Apache Software Foundation (1999), subsequently Eclipse (2001)
  • Linux (2007), OpenStack (2012), Cloud Foundry (2014)
  • Node.js (2014), Docker and the very significant Apache Spark in-memory analytics operating system (2015) investment
  • In addition to the more recent innovative Blockchain based Hyperledger project (2016).

These commitments, in addition to the ODPi (Open Data Platform) Hadoop initiative, are now both pervasive and very significant within IBM. Indeed IBM recently published a paper that summarises this commitment and the resulting rates of Open Source driven innovation, which in the longer term, in the view of the author, will always eventually prevail over proprietary aligned alternatives, no matter how big a single vendor or aligned partner eco system commitment.

Hence in my view, it's not really a case of if, simply a case of when, Open Source based innovation prevails.

Indeed in support of this viewpoint, Vinnie Mirchandani (in SAP Nation v1.0) mentions the success and growth of the Cloud Integrator Appirio with BoB / SaaS integration solutions and a large TopCoder community, in addition to the rapid growth IBM is experiencing in the Bluemix and/or API Connect areas.

Of course this Open Source commitment does not mean clients will not require prior trusted solution partners to help them safely bridge between their existing systems of record and planned front office API enabled strategic IT platform investments, whether Public Cloud, Hybrid Cloud or indeed prior Private Cloud / On-Premise, often for mission and business critical data protection and/or privacy / IP reasons. It all starts with the data!

The above mentioned paper can be found here; it nicely summarises the evolution of various Open Source platforms over time.

More recently one of the potentially most significant and innovative Open Source projects is the rapidly emerging Blockchain “Hyperledger” distributed ledger project that will in my view be truly transformative for many clients and industries.

Indeed I’d also recommend a rather detailed report published by the UK Government Office for Science Chief Scientific Advisor, Mark Walport, in December 2015, called:

Distributed Ledger Technology: beyond block chain, which can be found at:

Adoption will commence initially within the Financial Services industry, but then likely rapidly extend into other industries like Consumer Products and/or Discrete Manufacturing, where complex extended and distributed supply chains and the resulting financial transaction flows and ledger entries are the norm.

I’d also recommend the following short YouTube video that describes the future impact of this project, in addition to this item in the Financial Times:

Having covered some of the strategic IT investment choices above, let’s now dive back into the details of some of the “hype” and largely commercially driven pressure to migrate to SAP S/4 HANA and/or HANA OS/DB migrations (vs prior Oracle, DB2, MS SQL etc SAP AnyDB platform choices).

Now we move onto the final section in this series of blog entries – In-memory marketing hype vs reality, section 5.

Disclaimer – This blog represents the author’s own views vs a formal IBM point of view

The views expressed in this blog are the author’s and do not represent a formal IBM point of view.

They do represent an aggregate of many years (20+) of successful ERP / SAP platform deployment and IT strategy development experience, supplemented with many hours of reading respective DB2 and/or SAP HANA roadmap materials and presentations at various user conferences and/or user groups, in addition to carefully reading input from a range of respected industry / database analyst sources (these sources are respected and quoted).



Does a SAP S/4 HANA “Digital Core” destination make sense for your business, and if so, when?

Executive Synopsis

At the SAP SAPPHIRE NOW 2016 conference in Orlando I was approached by a number of large IBM / SAP Enterprise clients and business partners who essentially asked a similar and, in theory, relatively simple question, which unfortunately has both a simple and a more complex answer.

HANA Bay, Maui – maybe not exactly the tropical beach and bay that you had initially imagined!

HANA Bay Picture

What was the synopsis of the question I was repeatedly asked, and of my initial and more detailed answer?

I’ve subdivided the answer into a series of interrelated blog topics for ease of consumption, according to particular interests, concerns and questions.

Section 1 – Does SAP S/4 HANA make sense for you, client Choice A vs Choice B ?

Section 2 – IBM DB2 BLU and/or DB2 10.5 Optimization for SAP – Evolution vs Revolution

Section 3 – Stable SAP NetWeaver Core + Best of Breed / SaaS Edge, Hybrid Cloud Strategy

Section 4 – Will Open Source enabled API, IoT and Big Data prevail over proprietary – Yes or No?

Section 5 – In-Memory IT Hype vs reality – Some “In-Memory” Hype busting

Section 6 – HTAP, OLAP vs OLTP SAP Application Throughput, Optimizations

In summary, for me, referencing the diagram below, Choice A is strongly preferred; however I fully recognize that some large Enterprises may essentially decide to go “all in” with a SAP S/4 HANA digital and extended core, essentially Choice B.

On the basis that a picture is worth a 1,000 words, let’s start here:

Business into IT Investment Choices, Strategies v2 300816

A consolidated version of the repeated question I was asked at SAPPHIRE NOW 2016:

“Today we run our core, often customized SAP Business Suite / ERP 6.0 NetWeaver systems on
our preferred choice of IT platform, including an AnyDB choice, that embraces our choice of SAP platform technology.

(For example a choice of IBM iSeries, System z, System p and/or Intel x86 / Linux or Windows with DB2, or indeed Oracle with SAP over System p or Intel / Linux etc)

However our local SAP sales and technical sales team are strongly advising us to start out all over again in SAP platform technology terms with either SAP BW + HANA, Suite on HANA (SoH) and/or SAP S/4 HANA Enterprise Management with Simple Finance v2 running only on HANA over Linux / Intel and/or IBM’s POWER and Linux with SAP HANA TDI storage.

What should we do and what’s your point of view and input?

PS: Please don’t start your answer with “naturally, it depends…”!

As a relatively conservative, risk-averse and experienced SAP / ERP technical solution architect and IT strategy advisor with 33 years of IT, ISV and ERP platform solution experience, I will provide a high level answer first, and then, after time for further research, a more detailed, structured and considered reply in the depth that I believe this topic requires and warrants.

The sources of information and research used whilst creating this response include various SAPPHIRE NOW 2016 and prior SAP TechEd keynotes, respective SAP S/4 HANA and/or IBM DB2 SAP NetWeaver product roadmaps, technical solution benefits and choices, plus references back to prior detailed SAP platform TCO / TCA and IT risk / benefit / strategy analysis.

It also includes a number of referenced independent sources that are less influenced by commercial gain and less inclined to automatically follow the prevailing “the answer is SAP HANA, now what was the question again?” viewpoint, and hence may be considered by some to be a little controversial.

The IT Executive Level summary answer is relatively simple as follows:

Business into IT Investment and Innovation Strategy Choice A or Choice B, which way are you going?

For me the “start now” Green Arrow choice of Path A vs Path B is a hugely strategic and critical question for many enterprises; indeed the Harvard Business Review recently published an IBM sponsored paper titled “The Ecosystem Equation: Collaboration in the Connected Economy”.

This paper and a webinar presentation of its summary can be found at the following location/s:

If we net this excellent HBR research out, it indicates that the next generation of industry leaders will be determined by the combination of consistent Executive “C Suite” sponsorship, investment, speed to market and the value of digital enablement in a highly connected, open and collaborative, strongly emerging “Digital Economy”.

In summary as follows:

HBR Connected Economy Summary

For me this research essentially indicates Choice A is likely a preferred path as it helps to focus typically finite Strategic IT investment resources more rapidly on delivered speed to value of Open Source, Analytics data / API and IoT driven platform innovation.

I’m not saying some Enterprises won’t choose Choice B; they will. However this choice will critically need to be made with an “opportunity time vs cost vs risk vs benefit” analysis of essentially pausing first to remediate existing customized SAP NetWeaver application templates towards S/4 HANA Enterprise Management templates, or as a minimum investing in significant SAP application template remediation in parallel with front office business aligned IT innovation investment strategies.

Naturally, as a consequence of this strongly emerging “Digital Economy” reality, I tend to start my answer with further exploratory background IT strategy and SAP / ERP / IT platform client solution strategy related questions as follows:

Are you planning to, and can you practically, implement a new “read optimized” simplified S/4 HANA Enterprise Management and/or Simple Finance v2 template with aligned, optimized, revised and simplified business processes – Yes, No, or maybe you are not sure yet?

Additionally, are you prepared to adapt the current business processes to match the capabilities of the S/4 HANA Digital Core / Enterprise Management package including Simple Finance v2 ?

This is, for example, essentially a practical ERP platform strategy for relatively young but fast growing companies like Asian Paints in India, whose CIO co-presented during one of the SAPPHIRE NOW 2016 keynotes; in summary he indicated they had adapted and aligned their business processes to the available SAP solution capabilities and phased deliverables, not the other way around, which is more normal in large scale, complex global enterprises.

Understanding that within the ~ 3,700 clients that SAP SE indicate have adopted SAP S/4 HANA, it was mentioned there are actually ~ 180 productive S/4 HANA deployments with a further ~ 300-350 pipeline projects. From recent Bloor Research and Nucleus Research analysis it also looks like a significant majority of these 180 clients tend towards “net new” SAP S/4 HANA deployments and/or early testing in smaller subsidiary operations of larger corporations, vs core prior SAP Business Suite deployments.

Where the ~ 180 productive deployments plus the pipeline of 300-350 further projects actually represents approximately 1% of SAP’s ~ 45,000-55,000 installed SAP Business Suite clients.
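As a quick sanity check on that arithmetic (using the approximate figures quoted above, with midpoints taken for the ranges):

```python
# Rough sanity check of the S/4 HANA adoption share quoted above.
# All figures are the approximate numbers cited in the text.
productive = 180            # ~180 productive S/4 HANA deployments
pipeline = 325              # midpoint of the ~300-350 pipeline projects
installed_base = 50_000     # midpoint of ~45,000-55,000 Business Suite clients

share = (productive + pipeline) / installed_base
print(f"{share:.1%}")       # prints 1.0%
```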

Relatively speaking, it was also recently mentioned that, due to the bias of large, regional or global Enterprises running SAP over IBM DB2, approaching 1/3rd of SAP’s existing Business Suite transactions are actually processed on an IBM DB2 database platform running over a choice of IBM’s System z, p, i and/or Linux / Windows and Intel.

This represents many 1,000’s of installed SAP DB2 SIDs (SAP System IDs) in both non production and often mission critical production use, and is consequently a low risk and proven SAP Business Suite / SAP NetWeaver platform capability.

Understanding also that the ~ 3,700 clients mentioned typically include a range of SAP BW on HANA (OLAP), Suite on HANA (SoH), HANA Side Car (CO-PA, ML Accelerator) and/or S/4 HANA license upgrades, in addition to things like SAP HEC (HANA Enterprise Cloud), SuccessFactors, Hybris etc SaaS (Software as a Service) deployments, it’s very difficult to get an accurate and precise view.

On the 28th of June 2016, shortly after SAPPHIRE NOW, Nucleus Research published a summary paper indicating that 9 out of 10 of the 40 SAP clients they interviewed (within a research pool of ~ 200+ ERP engagements) don’t plan to deploy S/4 HANA in the near future; a link to this item is referenced later in this section.

However …

For larger or more complex existing SAP Business Suite / ERP / ECC 6.0 / SAP NetWeaver Enterprise clients a functional analysis and/or re-mapping of the “existing” and “to be” business process into SAP S/4 HANA Enterprise application template is then required.

This itself is typically a non-trivial exercise which can take many weeks or even months of effort, even if the latest SAP S/4 HANA custom code compatibility inspection tools are employed.

This also assumes that conversion to this new “Read Optimized” SAP S/4 HANA application template is viable and practical in roll out terms and that the required functionality is both available and stable.

It also assumes that the required remediation is affordable (in time, IT opportunity cost and resources, roll out and SAP HANA platform terms, vs alternative strategic IT investment strategies), which then swings full circle back to Choice A vs Choice B in the first diagram.

In my experience, the client CIOs, CTOs and/or Chief Enterprise / ERP or Data Architects I’ve spoken to are mostly adopting Choice A over Choice B (or a wait and see strategy), with an objective to more effectively meet intense business pressure for more rapid returns from IT investments, value delivery and a faster ROI.

In my humble view, the prior days of monolithic SAP / ERP roll outs are simply drawing to a close.

Re-Integration of prior SAP IS (Industry Solutions) into the SAP S/4 HANA Enterprise digital core.

A number of the prior SAP IS (Industry Solutions), like the SAP IS Retail and/or AFS (Apparel and Footwear) solution are now being re-integrated in data structure, table and functional terms back into the simplified new S/4 HANA Enterprise Management “Digital Core”.

In the case of the Retail Industry solution, phased S/4 HANA Enterprise based functional deliverables beyond “Simple Finance” are planned for the SAP S/4 HANA 1611 release in Q4 2016, followed by further SAP S/4 HANA Enterprise Management hybrid retail / distribution functionality in Q4 2017 etc; in effect the industry aligned functional delivery has taken a ~ 24 month rain check to be re-engineered onto a SAP HANA “read optimized, columnar in-memory platform”.

Refer to the SAP S/4 HANA Retail Roadmap/s (SAP Service Market Place ID Required)

It also assumes that this is a key, strategic forward IT investment priority and focus area, we will come back to the strategic aspect of this particular question a little bit later.

Then naturally you will consider and review very carefully whether a new “read optimized” S/4 HANA Digital Core and revised application template, one that more naturally aligns to a SAP HANA columnar (only) data platform capability, is right for you, together with the revised and updated SAP Basis / database and IT platform skills, limited SAP HANA platform choices, change and release management, and/or Cloud (HEC) or S/4 HANA on / off premise hybrid cloud deployment requirements, prerequisites and options that this implies.

The recently updated SAP Nation 1.0 > 2.0 book by Vinnie Mirchandani mentions that the Director of IT, Andre Blumberg at CLP Group (a large Hong Kong headquartered Asia Pacific SAP Utility client), takes an engineering-like approach to the evaluation of new IT technologies like SAP HANA; they found the TCO (Total Cost of Ownership) would actually be significantly higher with SAP HANA, not lower as claimed in multiple SAP HANA sales and marketing presentations at SAPPHIRE and TechEd.

This is consistent with prior SAP platform TCO / TCA analysis I completed for a ~ $75Bn (50-55Bn EU) Global CP company that was already running a virtualized, tiered, consolidated and standardized SAP platform strategy (over DB2 and System p) in the form of an “at scale” IBM SAP Private Cloud. When modeled over four regions and four SAP workloads / landscapes (ECC, APO/SCM, BW and/or CRM) using a common tiered, virtualized storage strategy, a switch to a SAP HANA strategy resulted in a 1.5 to 1.6 times increase in SAP platform TCA (Total Cost of Acquisition).

It also added a further 12-14,000 commodity Intel cores, which in turn would have forced a significant and costly ~ 3 Megawatt power increase in each of their two global Data Centers, running directly contrary to their green IT and Data Center sustainability KPIs.
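A back-of-the-envelope model of this kind of platform comparison might look as follows. To be clear, every number below is an illustrative placeholder, not a figure from the actual client study; the structure (hardware + storage + licenses + power over the modelling period) is the point:

```python
# Illustrative (hypothetical) platform TCA comparison of the kind described
# above: a baseline virtualized DB2 / POWER landscape vs a HANA alternative.
# Every number below is a made-up placeholder for the sketch ($M).

def platform_tca(server_cost, storage_cost, license_cost, power_mw,
                 power_cost_per_mw_year, years=4):
    """Total cost of acquisition plus power over the modelling period."""
    return (server_cost + storage_cost + license_cost
            + power_mw * power_cost_per_mw_year * years)

baseline = platform_tca(server_cost=40.0, storage_cost=10.0,
                        license_cost=15.0, power_mw=1.0,
                        power_cost_per_mw_year=1.2)
candidate = platform_tca(server_cost=50.0, storage_cost=12.0,
                         license_cost=28.0, power_mw=3.0,
                         power_cost_per_mw_year=1.2)

print(f"TCA ratio: {candidate / baseline:.2f}x")  # prints TCA ratio: 1.50x
```

With placeholder inputs of that shape, the candidate platform comes out in the 1.5x range quoted above; the value of an engineering-like approach is that the client’s own numbers, not marketing claims, drive the answer.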

It’s also worth mentioning, to put things into perspective, that this single Enterprise has more strategic SAP DB2 instances (SIDs) in production and non-production use (at 600+) than there are productive S/4 HANA deployments globally.

Today, unfortunately, SAP are indicating you no longer have a SAP “AnyDB” choice for their new S/4 HANA Digital Core: it’s clear that SAP’s S/4 HANA platform strategy is to offer an rdbms choice of one (some SAP clients might say this equals none), versus the multiple SAP NetWeaver / ERP 6.0 platform PAM (Product Availability Matrix) defined AnyDB and supported OS/DB permutations and combinations offered before.

Please refer to my prior From Walldorf to West Coast ? S/4 HANA blog on the LinkedIn CIO forum:

The strategic SAP point of view expressed at SAPPHIRE NOW 2016 and/or prior SAP TechEds in Q4 2015 is that, by restricting the prior AnyDB SAP platform choice, SAP can deliver new “in-memory” SAP HANA enabled innovations and converged OLTP / OLAP functionality significantly faster.

It is also likely that this helps SAP SE to significantly reduce their prior SAP NetWeaver / ERP 6.0 platform application regression testing costs, largely at the expense of forcing a SAP S/4 Digital Core HANA based platform change for many mutual, IBM SAP large or medium Enterprise clients.

Some more experienced and possibly more cynical longer term IT folks might be forgiven for suggesting that we then have a case of the SAP HANA columnar rdbms platform “technology tail” wagging the SAP ERP “business application dog”.

Effectively the new SAP HANA rdbms technology choice requires and forces a new “revised, read optimized” columnar rdbms SAP S/4 HANA Digital Enterprise based business application model, including things like the new Universal Journal.

Personally I see limited solution or technology benefit in running an existing customized SAP read / write optimized SAP Business Suite template in a Suite on HANA (SoH) platform configuration.

For me, at best, this is a basic SAP application and platform technology mismatch, or rather a compromised “halfway house”.

Indeed at SAP SAPPHIRE NOW 2016, in one keynote it was very briefly mentioned that SAP HANA CDS “Compatibility Views” would only be supported for a limited period of time (I understand this is currently to 2020), vs for example the support of existing SAP NetWeaver 7.40 and/or 7.50 SAP CDS (Core Data Services) functionality over DB2 10.5 LUW until 2025.

The latest “Your path to S/4 HANA” brochure distributed at SAPPHIRE NOW 2016 mentions SAP SE’s very significant investment in, and commitment to, their invention of “the most disruptive pure in-memory technology”, SAP HANA, to bridge the gap between prior, often separate, transactional and analytical platforms.

As mentioned before, recently on June 28th 2016 Nucleus Research have published a report that summarizes the output from a survey of ~ 40 Enterprise IT / SAP Clients with respect to their SAP S/4 HANA adoption plans.

In this report, whilst recognizing it’s a relatively small sample size (40), supplemented by research data from a further recent 200 client ERP evaluations, a significant number of clients (9 out of 10) indicated that they had no near term plans to adopt S/4 HANA. In my view this is fairly profound input for SAP SE to consider.

BOSTON–(BUSINESS WIRE)–A 60 percent majority of SAP SE (NYSE: SAP) customers wouldn’t buy SAP solutions again according to a new analysis by Nucleus Research. And in SAP’s core market of ERP, nine out of 10 customers say they are not considering a future investment in SAP’s S/4HANA solution.
Previously, one of the key reasons why Enterprise clients selected SAP ERP / NetWeaver solutions was the ability to effectively integrate a more open application platform with existing IT platform choices, IT operational investments and skills, minimising change and risk at the platform level vs the business process into SAP / ERP application level (which is tough enough to successfully deliver in its own right for large scale ERP business process change and phased IT project deliverables).

Is HTAP actually achievable and desirable at scale? – The answer is that it really depends.

My personal view is that, whilst it may be practical and/or desirable for small and/or medium enterprises, the implied SAP HANA HTAP (Hybrid Transactional Analytical Processing) strategy being strongly promoted by SAP SE as a SAP HANA landscape complexity / TCO reduction strategy is likely not realizable, or not even desirable, in many large scale Enterprise client scenarios, in particular for clients with mixed ERP / “Systems of Record” platform portfolios.

In particular, for a large Enterprise IT client, their EDW (Enterprise Data Warehouse, either physical or indeed increasingly logical) typically consolidates multiple sources of “Systems of Record” / ERP data, in addition to increasingly absorbing consolidated information, often in “distilled SQL form”, from external Big Data sources (typically Hadoop / HDFS, MapReduce or Spark / HDFS based). Some folks are now calling these Hybrid Data Warehouses or Data Lakes.

The key challenge with Data Lakes and/or Data Reservoirs is that, if not very carefully managed and governed, they can quickly turn into “Data Swamps”, pooling untrusted information of uncertain heritage and accuracy into one “large bucket” of data.

Hence I would strongly suggest that the middle grey layer of information and data governance, movement, master and meta data management in the diagram below is a rate determining critical success factor for many Enterprises, in addition to the ability to virtualize or federate queries with appropriate throughput in the Hybrid or “Logical Data Warehouse”, vs building the prior, often monolithic, EDWs.

This has to be combined with a controlled and managed “insight into action” strategy of the kind typically expected and/or required by key line of business executives and/or IT users.
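The federation idea can be illustrated with a deliberately tiny sketch. The two “sources” below are hypothetical stand-ins for separate systems of record; a real logical data warehouse would push the aggregation down to each engine rather than pulling raw rows, but the shape of the result is the same:

```python
# Minimal sketch of a federated / "logical data warehouse" query:
# two hypothetical systems of record expose rows in a common shape,
# and a federation layer combines and aggregates them in one pass.
from collections import defaultdict

erp_sales = [        # rows from a hypothetical ERP system of record
    {"region": "EMEA", "revenue": 120.0},
    {"region": "APAC", "revenue": 80.0},
]
ecommerce_sales = [  # rows from a hypothetical front-office SaaS source
    {"region": "EMEA", "revenue": 30.0},
    {"region": "AMER", "revenue": 95.0},
]

def federated_revenue_by_region(*sources):
    """Aggregate revenue per region across all federated sources."""
    totals = defaultdict(float)
    for source in sources:
        for row in source:
            totals[row["region"]] += row["revenue"]
    return dict(totals)

print(federated_revenue_by_region(erp_sales, ecommerce_sales))
# prints {'EMEA': 150.0, 'APAC': 80.0, 'AMER': 95.0}
```

The governance point above is exactly why this only works when the sources agree on a common, trusted shape for the data; federation over a swamp just federates the swamp.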

Personally I prefer to use what I call the “Two Triangles” data architecture. It reminds me of the production of a fine Scottish single malt whisky: local filtered water, malted barley, mash tuns and successive distillation processes through copper and brass stills, followed by subsequent storage, selection and highly skilled blending, and consumption or aging in high quality oak barrels.

With the quality of the end whisky produced being totally dependent on an optimal combination of proven skills and capabilities and the quality of the source ingredients, combined in logical proportions with the appropriate distillation asset investments, retention and aging periods.

Translating this into a Big Data scenario, the quality of the end “Insight into Action” product likewise depends on the quality of the data input, the required capital investments and the accumulated Business Analytics skills involved (say the availability of experienced Business Intelligence and Analytics SMEs and/or Data Scientists), as summarized below:

Two Triangles Screen Shot

If a broader Data Lake or Reservoir strategy is of interest to the blog readers, please refer to the following excellent Paper –  Governing and Managing Big Data for Analytics and Decision Makers.

Anyway, I digress; back to the topic in hand. For me, I still believe in the logical separation of at scale Enterprise OLTP / transactional and at scale EDW / OLAP analytical workloads, as follows:

Does HTAP Make Sense for you? v2

Indeed SAP SE are also currently re-positioning SAP BW (Business Warehouse) from its prior typical role of a SAP ERP aligned operational and/or transactional reporting platform into a role as a SAP BW + HANA based EDW with HANA live / FIORI Analytics user interface used for transactional and/or operational reporting.

This assumes of course that the required business value content and reports have previously been defined and delivered, which is not automatically the case: a recent SAP HANA Live client indicated as much in comparison with SAP operational reporting tools from vendors like EveryAngle.

A Gartner point of view on this topic can be found in Gartner Paper G0027727, via your Gartner subscription: Hype Cycle for Information Infrastructure, 2015, published 13th August 2015.

“Almost all new infrastructure technologies emerge into market productivity as incremental solutions (and as such can persist as stand-alone solutions).”

Gartner source / copyright respected – please refer to original for full source details

Gartner Report Information Infrastructures 2015 heading

Gartner HTAP Point of View

What happens if you have just deployed an existing broad, deep, customized SAP NetWeaver ERP 6.0 platform?

Now we firmly get to the most complex and difficult scenario: you have an existing, often deeply customized, mission and business critical, broad and deep SAP ECC / NetWeaver / Business Suite deployment that has just been rolled out at a regional or global level (with a standardized but often deeply customized SAP Business Suite / ECC 6.0 template).

There is also a point of view that the considerable “marketing hype” associated with the SAP HANA “in-memory” columnar database will largely be forgotten in 3-5 years, as alternative, less disruptive, evolutionary “in-memory columnar” rdbms solutions have become available, enabled and optimized with SAP BW 7.0x and/or more recently SAP BW 7.3 > 7.4, including Flat InfoCubes.

Consequently, in Section 2 let us briefly review the benefits of DB2 BLU and/or DB2 10.5 for SAP workloads, before circling back around in Section 3 to strategic IT investment and innovation choices – back to Choice A or B as below, where I believe many clients will select Choice A to accelerate strategic Open Source, API / IoT enabled “connected economy” investments along the “Green Arrow” path:

Simplified Choice ?



IBM DB2 BLU and/or DB2 10.5 Optimization for SAP – Platform Evolution vs Revolution

Consequently, in blog section 2 let us briefly review the benefits of DB2 BLU and/or DB2 10.5 for typical SAP NetWeaver Business Suite workloads.

In blog section 1 we highlighted the complex choice IBM SAP Enterprise IT clients face if they are already happily running an often customized SAP Business Suite / SAP NetWeaver over DB2 (with z, p, i and/or LUW) on their preferred and/or virtualized IBM SAP platform choice (z, p, i Series, Linux, VMware ESX, Windows / Intel + DB2 LUW etc).

Very careful analysis of the TCO, functional and cost / benefits and risks associated with SAP BW on HANA and/or Suite on HANA (SoH), or indeed starting again with a new S/4 HANA Digital Core / S/4 HANA Enterprise Management (at the 1511 release vs the prior 1503 code path), is then required.

This helps to ensure the claimed or indicated benefits actually align to your business and IT priorities, over SAP SE’s natural desire to increase their share of your IT spend in terms of the required HANA rdbms license, support and/or HANA remediation consulting revenues vs prior AnyDB platform choices.

This has to be considered and balanced vs the continued deployment of viable, mature and proven SAP IBM DB2 NetWeaver optimised AnyDB alternatives that have been, and continue to be, progressively developed and optimized over many years jointly by the SAP DB2 development teams in Walldorf, IBM’s Boeblingen lab and/or IBM’s DB2 development labs in Toronto, Canada.

Unfortunately there are no joint development labs in the Scottish Highlands, never mind !

(Whilst also not forgetting that IBMers invented the relational database platform many, many years ago, following on from IBM’s IMS, which was used by NASA for the Apollo programme etc.)

In particular, client and/or more recent joint IBM DB2 / Systems Group lab testing indicates that, for more complex and concurrent SAP BW analytical (OLAP) workloads, IBM’s DB2 10.5 BLU (and/or DB2 11.1 with BLU + MPP, Massively Parallel Processing, which is now certified with SAP BW) often matches or significantly exceeds the throughput of SAP HANA with SAP BW for OLAP / complex SQL BW reporting workloads, with less than half of the configured system memory.

It also typically uses significantly fewer multi-threaded CPU cores, whilst providing rapid, incremental and non-disruptive speed to value without having to re-engineer or optimize the client’s SAP BW configuration and/or Business Objects (or Cognos etc) reporting tiers.

With SAP BW 7.0x (and above, up to BW 7.4) and DB2 10.5 BLU, this is normally combined with a relatively simple, quick and largely non-disruptive targeted row to columnar DB2 SAP BW table conversion, using the latest version of the DB6CONV tool, typically targeted at the SAP BW reporting tier (InfoProviders, InfoCubes).

DB2 10.5 BLU also includes enablement and optimization of SAP HANA derived “Flat InfoCube” support at SAP BW 7.40 (with SAP NetWeaver 7.40 or 7.50) with DB2 10.5 FP5S or above.

The diagram below indicates the relative speed up typically observed between DB2 10.5 LUW with SAP BW in row relational form, then in a columnar “in-memory” organization, and/or columnar “in-memory” with SAP BW “Flat InfoCubes” (at BW 7.40), for a representative sample set of BW / SQL queries and reports.

BLU Relative Throughput Flat Infocubes
Over a range of queries, excellent throughput improvements are observed with relatively modest increases in the allocated DB2 memory (GB RAM) and server CPU core capacity.
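The intuition behind the columnar speed-up for these OLAP-style queries can be sketched in a few lines of Python. This is a deliberately naive illustration, not DB2 BLU (or SAP HANA) internals: an aggregate over one attribute only has to touch a single contiguous column, instead of skipping through every field of every row.

```python
# Naive illustration of why a columnar layout favours analytical scans:
# an aggregate over one attribute touches a single contiguous column
# instead of every field of every row. (Not DB2 BLU internals.)
import random
import timeit

random.seed(42)
N = 200_000

# Row store: one record (a tuple of 8 fields) per row.
rows = [tuple(random.random() for _ in range(8)) for _ in range(N)]

# Column store: the same data, one list per attribute.
columns = [list(col) for col in zip(*rows)]

# Both layouts of course produce the same aggregate.
assert sum(columns[3]) == sum(r[3] for r in rows)

row_scan = timeit.timeit(lambda: sum(r[3] for r in rows), number=10)
col_scan = timeit.timeit(lambda: sum(columns[3]), number=10)
print(f"row scan {row_scan:.3f}s, column scan {col_scan:.3f}s")
```

Real columnar engines add dictionary encoding, compression and SIMD scans on top of this layout effect, which is why the observed gains vary so much by query shape.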

Personally I’m not a great fan of the prior x100 or x1000 SAP HANA speed up claims that seemed to be a feature of prior SAPPHIRE and/or SAP TechEd conferences.

Whilst these may be true for individual queries when comparing older row based rdbms systems (often on prior generations of hardware) with SAP HANA on the most current Intel hardware, from my PoV these are often “apples and pears” comparisons that make good marketing charts, but are likely not so representative of many clients’ real life mixed SAP BW OLAP reporting and/or batch / ETL (Extract Transform Load) workload scenarios.

The table above simply highlights the benefits of leveraging prior proven and mature DB2 LUW (Linux Unix Windows) rdbms technology, combined with proven query optimization and/or buffer pools, deployed to leverage a columnar, autonomic (automatic to me and you) modern tiered data platform.

With DB2 v11.1 we now also combine the prior proven DB2 Data Partitioning Feature (DPF), which effectively manages, distributes and optimises both queries and data placement over a scale out n+1 architecture for the very largest clients (10-100’s of TB of adaptively compressed DB2 SAP BW data), to enable DB2 BLU with MPP (Massively Parallel Processing). This also fully leverages prior DB2 BLU “in-memory” columnar and prior SAP BW 7.x optimizations, with an expected GA and/or SAP certification in Q3 / Q4 2016.
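The data-placement idea behind that scale out architecture can be sketched as hashing each row’s distribution key onto a partition, so both the data and the scans over it spread evenly across the nodes. This is a hypothetical toy, not DB2 DPF’s actual hashing scheme or partition map:

```python
# Toy sketch of hash distribution across an MPP scale-out cluster:
# each row is routed to a partition by hashing its distribution key,
# so data (and the scans over it) spread across all nodes.
# Illustrative only - not DB2 DPF's real hash map.
import hashlib

NUM_PARTITIONS = 4

def partition_for(key: str) -> int:
    """Stable hash of the distribution key onto a partition number."""
    digest = hashlib.sha256(key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % NUM_PARTITIONS

partitions = {p: [] for p in range(NUM_PARTITIONS)}
for doc_id in (f"SALESDOC-{i:06d}" for i in range(10_000)):
    partitions[partition_for(doc_id)].append(doc_id)

sizes = [len(rows) for rows in partitions.values()]
print("rows per partition:", sizes)  # roughly even, ~2,500 each
```

The practical consequence, as in DPF, is that choosing a high-cardinality, evenly distributed key matters: a skewed key would pile the data (and the query work) onto one node.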

For folks that are interested, a summary review of the DB2 11.1 LUW “Hybrid Cloud enabling” capabilities can be found in the following paper by Philip Howard of Bloor Research, and/or in the summary from a recent DB2 11.1 announcement web link.

Insert web link here – Re Bloor Research

A link to a news-wire item on the DB2 v11.1 announcements is enclosed, together with the formal IBM web site announcement below.

IBM Targets Developers with Powerful In-Memory Database in the Cloud

DB2 on Cloud makes hybrid cloud development easier

Now let’s consider OLTP / transactional workloads vs OLAP / analytical scenarios.

The next statement may sound relatively harsh, but in many cases it is true in the cold light of day: when the relative costs, risks and real benefits of migrating an existing “read / write” optimized, customized SAP NetWeaver ABAP / SAP ECC OLTP (and/or prior BW 7.x OLAP) template to “Suite on HANA” and/or a new S/4 HANA Digital Core are considered, the move may or may not stack up in cost / benefit terms exactly as previously suggested and marketed by SAP SE.

This is naturally back to my technical and solution architect “it depends” disposition. We also have to consider the relative strategic, competitive and business-into-IT benefits of the various IT and strategic platform investment choices (S/4 HANA Digital Core, IBM’s Watson / IoT, Bluemix, etc) for competitive advantage over prior COTS / packaged application deployment strategies.

The next key question becomes: is your business-aligned IT investment priority focussed on front office IT enablement and differentiation in the Systems of Engagement, Systems of Insight and Systems of Innovation / IoT areas for competitive advantage, or on remediation of existing customized SAP ERP NetWeaver “Back Office” / “System of Record” configurations to enable what might initially appear to be a commercially driven SoH OS/DB SAP HANA platform migration?

For me individually, I view SoH (Suite on HANA) as essentially a “zero sum” game (not an ideal combination of the two worlds) in real delivered IT benefit terms. IBM DB2 SAP clients, for example, can already fully leverage the throughput, scalability, TCO reduction, adaptive data compression, many years of SAP DB2 optimizations and the maturity of DB2 over SAP ERP 6.0 / SAP NetWeaver, including support through to 2025 with SAP NetWeaver 7.40 and/or 7.50.
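Of the benefits just listed, adaptive compression is perhaps the least intuitive, so here is a toy Python sketch (my own illustration, not DB2’s actual adaptive compression algorithm) of run-length encoding, one of the techniques that makes repetitive business data, such as a status column, compress so well:

```python
# Toy run-length encoding sketch: repetitive column data collapses
# into (value, run_length) pairs. Real adaptive compression combines
# several techniques (dictionary, prefix, run-length), chosen per page.

def rle_encode(values):
    """Collapse runs of equal adjacent values into (value, count) pairs."""
    out = []
    for v in values:
        if out and out[-1][0] == v:
            out[-1] = (v, out[-1][1] + 1)
        else:
            out.append((v, 1))
    return out

def rle_decode(pairs):
    """Expand (value, count) pairs back into the original sequence."""
    return [v for v, n in pairs for _ in range(n)]

status_column = ["OPEN"] * 5 + ["CLOSED"] * 3 + ["OPEN"] * 2
encoded = rle_encode(status_column)

print(encoded)  # [('OPEN', 5), ('CLOSED', 3), ('OPEN', 2)]
assert rle_decode(encoded) == status_column
```

Ten values become three pairs here; at the scale of 10s to 100s of TB of BW data, that kind of redundancy is where the headline compression ratios come from.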

For these clients there are also clear benefits from the SAP Core Data Services (CDS) HANA-aligned application-to-database functional push-down optimizations enabled over DB2 10.5 (and above) with SAP NetWeaver 7.40 and 7.50, in addition to the Fiori transactional application user interface enhancements (vs the Fiori analytical optimizations for BW + HANA and/or SAP S/4 HANA).

The following diagram describes examples of the detailed mapping of aligned IBM DB2 10.5 functionality to SAP Core Data Services (CDS) aligned RDBMS function calls and optimizations:

DB2 CDS 10.5 Optimisations
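The push-down idea itself is simple to illustrate. The toy Python sketch below (hypothetical data and callables, not the real CDS API) contrasts the naive pattern, where the application fetches every base row and aggregates itself, with the push-down pattern, where the data layer aggregates and only one result row crosses the wire:

```python
# Toy sketch of database functional push-down (illustrative only):
# the same total, but vastly different amounts of data transferred.

orders = [("DE", 100), ("DE", 250), ("GB", 80), ("GB", 40)]  # (country, value)

def app_side_total(fetch_all):
    # naive pattern: pull every row to the application tier, then aggregate
    rows = fetch_all()                    # all 4 rows cross the wire
    return sum(v for _, v in rows), len(rows)

def pushed_down_total(run_in_db):
    # push-down pattern: the data layer aggregates, one result row returns
    result = run_in_db("SELECT SUM(value) FROM orders")  # illustrative SQL
    return result, 1

naive_total, rows_moved = app_side_total(lambda: list(orders))
pushed_total, rows_back = pushed_down_total(lambda sql: sum(v for _, v in orders))

assert naive_total == pushed_total == 470
print(rows_moved, "vs", rows_back, "rows transferred")
```

Pushing function evaluation down to the database is exactly what the CDS-to-DB2 function-call mapping in the diagram above enables, without requiring a HANA migration.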

I believe a number of mutual IBM SAP enterprise clients will decide to sustain a “functionality stable” SAP NetWeaver ERP 6.0 “System of Record” core (ECC, BW, APO/SCM, PI, PLM, etc).

Essentially, they will adopt a “wait & see” strategy on future S/4 HANA Digital Core adoption.

A number of them may also decide to switch from a prior SAP-first back office (SoR) strategy to selecting and integrating “Best of Breed” front office alternative SaaS / Cloud / IoT solutions from vendors like (CRM), NetSuite (Financials / ERP), Workday (HR) and/or Anaplan (S&OP), or IBM’s own Watson IoT / Bluemix, etc.

This “Best of Breed / SaaS / Hybrid Cloud” strategy is then integrated back to the stable SAP ERP 6.0 / SAP NetWeaver core via SOA (service-oriented architecture) and API (application programming interface) standards, leveraging existing deployed Enterprise Application Integration (EAI) messaging / integration buses, or indeed various “Application Integration as a Service” (AIaaS) offerings (which may include, for example, SAP PI/PO, IBM’s WBI, IBM’s WebSphere CastIron * or IBM’s DataPower, etc).

* Indeed, IBM’s WebSphere CastIron integration appliance was often used to integrate SuccessFactors with SAP ERP NetWeaver solutions prior to SAP’s acquisition of SuccessFactors, and is also typically used to help integrate CRM solutions with SAP NetWeaver / ERP 6.0 solutions via a simplified “drag & drop”, template (TIPs) driven integration strategy.
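At its heart, what such an integration template does is field mapping: translating a SaaS payload into the shape the back-office core expects. A minimal Python sketch of that step follows; the field names and payload are hypothetical examples, not a real SaaS or SAP schema:

```python
# Toy sketch of the field-mapping step an integration bus / AIaaS
# template performs (illustrative field names, not a real schema):
# translate a front-office SaaS payload into the ERP core's format.

FIELD_MAP = {            # SaaS field -> ERP field (illustrative mapping)
    "accountId":   "KUNNR",
    "displayName": "NAME1",
    "countryCode": "LAND1",
}

def to_erp_message(saas_payload, field_map=FIELD_MAP):
    """Apply the mapping template; unmapped SaaS fields are dropped."""
    return {erp: saas_payload[saas]
            for saas, erp in field_map.items()
            if saas in saas_payload}

saas_event = {"accountId": "0012345", "displayName": "Acme Ltd",
              "countryCode": "GB", "saasOnlyField": "ignored"}

print(to_erp_message(saas_event))
```

Declarative mappings like this are what make the “drag & drop” template approach workable: the integration logic lives in configuration, not in custom code on either side of the interface.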

Some time ago, in February 2012 Forbes published the following item on Cloud computing:

6 Shining Examples of Cloud Computing in Action, Joe McKendrick.

Cloud computing means more than simply saving on IT implementation costs. Cloud offers enormous opportunity for new innovation, and even disruption of entire industries.

Which provides a natural segue into my Section 3 topic.

Section 3 – SAP NetWeaver Core + Best of Breed / SaaS Strategic IT Alternative Investment Choices