Are Technology Shakeups in Store for 2015?

Before I start with my predictions, let me explain what I mean by a prediction. I believe that predictions should not be about the end of a technology cycle but about the timing for when an issue begins to gain traction that will result in industry shifts. As I pointed out in my book, Smart or Lucky? How Technology Leaders Turn Change Into Success (Jossey-Bass, 2011), important industry initiatives and changes usually require decades of trial and error before they result in significant products and important trends. So, in my predictions, I am pointing out changes that are starting.

I know that the rule is that you need to come up with ten predictions when a new year is about to start. But I decided to break the rule and stick with seven. Call me a renegade. I think that we have a very interesting year taking shape. It will be a year where emerging technologies move out of strategy and planning into execution. So, I expect that 2015 will not be business as usual. There will be political shakeups in both IT and business leadership as technology takes on an increasingly strategic role. Companies need to know that the technology initiatives that are driving revenue are secure, scalable, predictable, and manageable. While there will always be new emerging technologies that take us all by surprise, here is what I expect to drive technology execution and buying plans in the coming year.

1. Hybrid cloud management will become the leading issue for businesses as they rely on hybrid cloud.

It is clear that companies are using a variety of deployment models for computing. Companies are using SaaS, which obviously is a public cloud-based service. They are using public cloud services to build and sometimes deploy new applications and for additional compute and storage capacity. However, they are also implementing private cloud services based on their requirements for security and governance. When cloud services become commercial offerings for partners and customers, economics favor a private cloud. While having a combination of public and private is pragmatic, this environment will only work with a strong hybrid cloud management service that balances workloads across these deployment models and manages how and when various services are used.
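To make the idea concrete, here is a minimal sketch of how a hybrid cloud management service might route a workload to public or private capacity. All names and policy rules here are hypothetical assumptions for illustration, not drawn from any vendor's product:

```python
# Hypothetical placement policy: regulated or confidential data stays
# on the private cloud; bursty, non-sensitive workloads go to public
# cloud capacity; everything else defaults to private (sunk cost).

def choose_deployment(workload):
    """Return 'private' or 'public' for a workload described as a dict."""
    if workload.get("data_classification") in {"regulated", "confidential"}:
        return "private"
    if workload.get("bursty", False):
        return "public"
    return "private"

workloads = [
    {"name": "payroll", "data_classification": "regulated"},
    {"name": "dev-test", "data_classification": "internal", "bursty": True},
]
placements = {w["name"]: choose_deployment(w) for w in workloads}
```

A real management service would layer cost models, capacity data and governance rules on top of a policy like this, but the core decision loop looks much the same.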

2. Internet of Things (IoT) will be dominated by security, performance, and analytics. New players will emerge in droves.

The Internet of Things will come on strong, since it is now possible to store and analyze data coming from sensors on everything from cars to manufacturing systems and health monitoring devices. Managing security, governance and overall performance of these environments will determine the success or failure of this market. Businesses will have to protect themselves against catastrophic failure – especially when IoT is used to manage real-time processes such as traffic management, sensors used in monitoring healthcare, and power management. There will be hundreds of startups. The most successful ones will focus on security, management, and data integration within IoT environments.

3. Digital marketing disillusionment sets in - it is not a substitute for good customer management.

Many marketing departments are heavily investing in digital marketing tools. Now corporate management wants to understand the return on investment. The results are mixed. First, companies are discovering that if they do not improve their customer care processes along with digital marketing software and processes, digital marketing is useless. In fact, it may actually make customer satisfaction worse since customers will be contacted through digital marketing services but will not get better results. This will result in a backlash. Unfortunately, it may be the messenger who is blamed rather than the culprit – poor customer care.

4. Cognitive computing will gain steam as the best way to capitalize on knowledge for competitive advantage.

The next frontier in competitive differentiation is how knowledge is managed. The new generation of cognitive solutions will help companies gain control of their unstructured data in order to create solutions that learn. Expect to see hundreds of startups emerging that combine unstructured data management with machine learning and statistical methods, advanced analytics, data visualization, and Natural Language Processing.

5. IT will gain control of brokering and managing cloud services to ensure security and governance.

For the past five years or more, business units have been buying their own public cloud compute and storage services, bypassing the IT organization. Many of these organizations were frustrated with the inability of IT to move fast enough to meet their demands for service. When these departments were experimenting with cloud services, expenses could easily be hidden in discretionary accounts. However, as these public cloud services move from pilot and experimentation to business applications and services, there are implications for cost, governance and management. As often happens when emerging technology becomes mainstream, IT is being asked to become the broker for hybrid cloud services.

6. Containerization and well-designed APIs are becoming the de facto method for creating cross-platform services in hybrid computing environments.

One of the benefits of a services architecture is that it is possible to truly begin to link computing elements together without regard to platform or operating system. The maturation of container technology and well-designed APIs is going to be a major game changer in 2015. Containers and APIs are linked because both are focused on abstracting services and complexity. These abstractions are an important step toward moving from building applications to linking services together.
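As a toy illustration of the abstraction argument, the sketch below shows application code written against a small service API, so the backing platform can change without touching the caller. The class and method names are invented for this example:

```python
# A minimal service contract: callers depend on the API, never on
# the platform behind it.

class ObjectStore:
    """Abstract storage API; concrete backends hide the platform."""
    def put(self, key, value):
        raise NotImplementedError
    def get(self, key):
        raise NotImplementedError

class InMemoryStore(ObjectStore):
    """Stand-in for an on-premises backend; a cloud backend would
    implement the same two methods against a remote service."""
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data.get(key)

def archive(report, store):
    # Application logic is written once; it runs unchanged whether
    # `store` lives on-prem or in a public cloud.
    store.put(report["id"], report)
    return store.get(report["id"])

result = archive({"id": "r1", "body": "q4 numbers"}, InMemoryStore())
```

Containers play the same role one level down: they package the backend so the service runs identically on any host that can run the container.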

7. Data connectivity combined with business process will emerge as the biggest headache and opportunity in hybrid cloud.

Data connectivity and business process issues are not a new problem for businesses. However, there is a subtle change with major ramifications. Because business units tend to control their own data, both on premises and in SaaS applications, it is increasingly difficult for business leadership to create a unified view of data across a variety of business units. The inability to bring data and process together across silos puts businesses at risk. This complexity will emerge as a major challenge for IT organizations in 2015.

(Judith Hurwitz) Vendor Strategy Tue, 16 Dec 2014 17:24:48 +0000
Oracle Makes a Business Case for Cloud Computing


Oracle is stepping up its move to the cloud, positioning the Oracle Cloud public cloud as an engine for growing its overall enterprise business.

As more companies move workloads into hybrid clouds, and competition intensifies, Oracle has decided to play to its strengths in security, availability and workload performance. It has taken its own approach by leveraging its engineered systems, running in Oracle Cloud data centers – or at customer sites – as a differentiator in competing with other cloud service providers (CSPs).

Oracle knows it is competing with other cloud services that got into the market much earlier than it did: Amazon Web Services and Microsoft Azure. It also knows that Google Cloud Platform and IBM SoftLayer are working to grow share in the enterprise-focused hybrid cloud space. However, Oracle believes that large-scale migration of enterprise workloads is still in its early stages, giving it a large opportunity among customers planning to move enterprise workloads and business applications to public cloud providers.

In this competitive environment, Oracle is going directly to big customers, worldwide, with its Oracle CloudWorld events, recently held in New York and Seoul. It is positioning its deep software portfolio of Oracle databases, middleware and enterprise applications software as cloud service differentiators. In 2016, other CloudWorld events were held in China, India and Mexico.

What’s New

Oracle phased in its move to the cloud – working first on SaaS, and PaaS, before introducing a set of IaaS services in 2016. Its cloud revenue is growing, as it reported in its quarterly financials. Now, Oracle is still adding to its cloud services portfolio: As announced in January, Oracle introduced bare-metal-as-a-service, so that customers can run workloads on the Oracle Cloud service, in place of on-premises hardware systems. It is also expanding Oracle Cloud capacity by adding three more data centers – in Virginia, London and Turkey. That will bring the total number of Oracle Cloud data centers worldwide to 25, covering all time zones and major geographies.

Oracle’s focus on enterprise feature-sets is positioned to pay off in hybrid cloud and public cloud, as it works to grow share in the rapidly expanding cloud services marketplace. Oracle’s cloud strategy is to maintain a consistent computing environment, with the same Oracle stack running on-premises, at customer sites, and inside the Oracle Cloud. Oracle calls this an “integrated cloud” stack engineered to work on-prem or off-prem.


Oracle CloudWorld in New York

At New York’s CloudWorld event on Jan. 17, Oracle CEO Mark Hurd made the business case for CXOs adopting Oracle Cloud. Beyond the technology, he noted, business managers want to learn more about the business value of adopting cloud services. Cloud adoption supports IT simplification and workload consolidation, as enterprise datacenters are combined, and workloads migrate, either on-premises or off-premises. Oracle intends to play in both spaces, using the same Oracle software stack – while leveraging Oracle engineered systems in the Oracle Cloud.

Customers speaking at the New York CloudWorld event included New York City’s CTO – and IT executives from MetLife, Thomson Reuters, ClubCorp and Grant Thornton, among others. They cited the flexibility they have with cloud services to test new applications, to deploy more instances quickly, and to pay for capacity as it is used.


Making the Business Case for the Oracle Cloud

The depth of Oracle’s commitment to cloud computing can be seen in Oracle’s investment levels in cloud technology, its re-write of Oracle Fusion applications for cloud-based workloads and its deep applications portfolio. All of that says that Oracle will be a long-run provider in the hybrid cloud market – and that it plans to replicate its earlier successes in enterprise database and enterprise applications with enterprise plays in the cloud computing marketplace.

The leading arguments in Oracle’s enterprise case for the cloud include:

  • Data-centric focus on cloud computing, starting with hybrid clouds linking on-prem and off-prem systems. Data migration can, and often does, end up with entire applications running on the Oracle Cloud. Oracle’s offerings include many analytics and data-management capabilities.
  • Leveraging Oracle engineered systems for on-prem or off-prem (with the Oracle Cloud public cloud) use. This allows Oracle to manage the systems in the same way, regardless of location within a hybrid cloud.
  • Leveraging Oracle enterprise applications. Oracle has a portfolio of hundreds of cloud-based and on-premises enterprise applications, such as ERP, HCM, SCM and others. Oracle acquired dozens of companies over the last 20 years – including PeopleSoft, J.D. Edwards and others, growing its inventory of Oracle Applications. It then spent nearly 10 years re-engineering them into the Oracle Fusion Applications, for both on-premises and SaaS-based deployment, based on Java-enabled code, to link apps together and support unified management of applications.


Business-Centric Services

In terms of Oracle’s software products and services, there is a multi-faceted strategy to position Oracle as a unifying element of hybrid cloud deployments. Oracle’s positioning includes the following:

  • Positioning a broad inventory of packaged business applications, including Oracle’s on-premises E-Business Suite, as well as applications acquired by Oracle from enterprise software companies (e.g. PeopleSoft, J.D. Edwards) – and now adapted for hybrid cloud.
  • Appealing to longtime Oracle Database customers who plan to move at least some of their mission-critical workloads and data to hybrid cloud.
  • Outreach to new customers, including cloud-native developers and enterprise developers, and working to gain competitive wins in IT organizations that have multi-vendor data centers that run Oracle.



Cloud adoption is accelerating in many organizations, across the board. For Oracle, it will be important to extend its reach to new audiences beyond the Oracle installed base, emphasizing its enterprise-centric delivery for business databases and business applications with Oracle Cloud. We expect Oracle to continue its outreach to new cloud-services customers, through Oracle CloudWorld events, Oracle OpenWorld and through direct and partner sales efforts, throughout 2017.




(Jean Bozman) Cloud Computing Tue, 07 Feb 2017 16:03:07 +0000
How Can You Drive #PositiveSecurity?

I’m a big car nut, so I wondered if there are any parallels between the evolution of cars and IT security. Not long after the first automobile was produced by Karl Benz in 1886, electronic security began when the wireless telegraph – the first network – was hacked.

While both technologies were born in similar eras, their paths quickly diverged. Automobiles evolved at a fast pace over the years because consumers and government demanded improvements. Performance increased, costs decreased, and safety got a lot better.

In contrast, security’s progression was largely motivated by risk: prevention of attacks, detection of current attacks, and remediation of past attacks. Security was always weighed against the perceived cost of the risk. In other words, you don’t want to spend more on security than the attacks cost you. Comfort was completely alien to security until recently.

Today, security has changed. Rather than simply thinking about detection of attacks, organizations are now beginning to focus on how users will interact with security measures. This is what I am calling Positive Security. Positive Security is the ability to build security into products so that it benefits users while protecting them from intrusion.

How did negative security models work?

How is Positive Security different from the old security models? The traditional model of security focused on the risk and the cost of security strategies and not on how this would affect users. In fact, it was a commonly-held belief that the worse the user experience, the better the security. This led to users hating and circumventing security protocols.

The current model of security is slowly beginning to understand the need to improve the user experience. However, security models are still designed to focus on mitigating risk and satisfying regulatory compliance. This change is being driven by business leaders who are demanding that security shouldn't hinder business goals. They expect that a secure user experience should become easier and faster than with the old security model. That change has meant that incident responses became more proactive. However, progress is slow. Even progressive companies aren’t quickly incorporating lessons learned from past incidents into future security strategies.

As employees expand their use of cloud applications consumed on their own devices, the focus on the user experience is expanding. There is a disconnect between the old security models and the new models required for the cloud-dominated era. As technologies change, a new model must evolve to support continuous change.

How do you prioritize secure business innovation?

Balance Risk and Costs for Positive Security

The most important implication of Positive Security is that it puts the user experience front and center. This means that secure business innovation must dynamically balance both risk and cost. Positive Security impacts organizations at all levels, including:

  • For senior business unit leaders, security must contribute to revenues, profits, customer satisfaction, employee productivity, compliance, and business innovation.
  • For IT leadership, security needs to be able to predict the seriousness and urgency of potential threats, prioritize mitigation strategies, analyze the negative impact of fixes and patches on production, and suggest courses of action.
  • To support end users and customers, security solutions need to monitor users’ behaviors, transactions, and interactions in near real-time to avoid accidental, mischievous, or malicious activities. Actions can be manually or automatically invoked so that security managers can disable access, step up authentication, or invoke detailed tracking for further analysis. In some cases, "normal behavior" over a long period of time might even cause systems to step down authentication and other security measures to improve the user experience.
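The step-up/step-down idea in the last bullet can be sketched as a simple policy function. The thresholds and action names below are illustrative assumptions, not taken from any real product:

```python
# Hypothetical mapping from a behavioral risk score (0 to 1, produced
# by monitoring user behavior) to an authentication action.

def auth_requirement(risk_score):
    """Return the authentication step for a given risk score."""
    if risk_score >= 0.8:
        return "block_and_alert"      # likely malicious activity
    if risk_score >= 0.5:
        return "step_up_mfa"          # require a second factor
    if risk_score <= 0.1:
        return "step_down_sso"        # long-running normal behavior
    return "standard_login"
```

The interesting feature for Positive Security is the last branch: sustained normal behavior can reduce friction for the user, rather than security only ever adding obstacles.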

What #PositiveSecurity solutions already exist?

As listed below, many current security technologies fall into the positive category. Therefore, evolving towards #PositiveSecurity does not require rip-and-replace upgrades of older solutions. Threat Intelligence and Analytics, however, are a new and key distinction, as explained in the next section.

  • Endpoint (IoT, Mobile, and systems)
  • APT (Advanced Persistent Threat) Defense
  • Security SaaS (Software as a Service)
  • MSSP (Managed Security Service Providers)
  • MFA (Multi-Factor Authentication) with biometrics
  • PAM (Privileged Access Management)
  • Threat Intelligence and Analytics
  • UBA (User Behavior Analysis)
  • NGFW (Next Generation Firewall)
  • Message Scanning
  • DLP (Data Loss Prevention)
  • Encryption and Tokenization
  • VM (Vulnerability Management)
  • White Listing
  • Single Sign-On (SSO)
  • DDOS (Distributed Denial of Services) and DNS (Domain Name Services) Defense
  • SIEM (Security Incident/Event Management)
  • Cloud Security
  • Web Security
  • Applications Security

What really separates #PositiveSecurity solutions from negative ones?

The predictive capabilities of threat intelligence and analytics enhance current security technologies’ ability to see early Indicators of Attack (IoA) around the world and take preemptive and prescriptive measures to reduce an enterprise's vulnerability. Using machine learning and artificial intelligence, #PositiveSecurity systems alert on Indicators of Compromise (IoC) based on anomalous deviations from normal patterns. Moving from reactive to proactive to predictive is one crucial driver towards #PositiveSecurity. The other driver requires not just IT, but all the stakeholders, to embrace #PositiveSecurity’s core collaboration between UX, risk, and cost.
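As a deliberately simplified stand-in for the machine-learning detection described above, the sketch below flags observations that deviate sharply from a learned baseline. Real systems model many behavioral features and use far richer models than a single z-score, so treat this as an illustration of the principle only:

```python
import statistics

def flag_iocs(baseline, observed, threshold=3.0):
    """Flag observations more than `threshold` standard deviations
    from the mean of the learned baseline - a toy anomaly detector."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return [x for x in observed if abs(x - mean) / stdev > threshold]

# Normal pattern: logins per hour learned from past behavior.
logins_per_hour = [10, 12, 11, 9, 13, 10, 11, 12]

# 95 logins in one hour is wildly outside the pattern and is flagged.
alerts = flag_iocs(logins_per_hour, [11, 10, 95])
```

The point is the workflow, not the math: learn what "normal" looks like, then surface deviations early enough to act before an attack completes.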

What would happen if cybersecurity were a positive, instead of a negative?

In an ideal world where security actually improves business operations, consumers would shop for the best deals without worrying about web site reputation, credit card theft, stolen passwords, and identity theft. Employees could access company applications and data from anywhere in the world, on any device, across any network quickly and securely. Businesses could efficiently and cost-effectively and selectively make services, applications, and data freely available to employees, customers, prospects, suppliers, resellers, contractors, and regulators without performance penalties, costly integration, or disruption of production processes. Consumers and businesses could continuously optimize IoT processes, systems, and analysis without fear of hijacked devices, theft of analytics, or management disruption.

Granted, this is an optimistic view, but a positive outlook certainly produces better outcomes than negative ones. Moreover, the precursors of positive security solutions already exist. There are innovative businesses that have adopted this view and are benefiting from it. We hope to bring you these stories in the future so your company can benefit from #PositiveSecurity.

(Chris Christiansen) Security Thu, 02 Feb 2017 19:13:25 +0000
Welcome to Digital Transformation: Dragging Retailers to Adopt Emerging Technologies

I had the opportunity to attend the National Retail Federation Conference this year. Attending sessions and meeting with a variety of retailers and technology vendors, I came away with some observations about what retailers will require to be successful in the future. My overriding observation is that the retailers that succeed will be those companies able to get creative in their use of technology. Retailers need to focus on the intersection of cloud, analytics, customer experience, and security. It is a difficult challenge. First, margins for retailers are quite low, so investing in technology can cause heartburn. Second, retailers are used to buying packaged software with packaged reporting and therefore aren’t ready for technology innovation and change.

Here are five observations about technology requirements for successful retailers:

One. Retailers need to focus on security. I was fascinated by the lack of discussion about security. In fact, I did not see a single vendor focused on security on the show floor. Without a well-planned security software strategy, it will be difficult for retailers to succeed in a world that is increasingly moving online.

Two. Retailers don’t understand how to build customer loyalty on and off line. The retail market is in turmoil with the advent of online giants such as Amazon. Retailers are struggling with how to provide a great customer experience and intimacy so that customers don’t buy only on price.

Three. Flexible cloud-based solutions will be the key for the future. Technology products for retailers tend to fall into two categories: packaged turnkey solutions that cannot be easily modified and updated as new modular technology advances; or completely customized offerings from systems integrators. Customers need a combination of both to be future proofed. Software as a Service offerings that are configurable are the answer.

Four. Advanced analytics is a requirement for retail success. I saw a lot of focus on packaged reports rather than analytics. Analytics is difficult to achieve but mandatory if retailers are going to understand how their customers’ buying patterns are changing. Leveraging cognitive computing and machine learning will be imperative and will separate the retailers that are prepared for the future from those that are left behind.

Five. Customization will be important to the new generation of retailers. Many CEOs of emerging retailers are focused on providing customized offerings to customers. This is especially true for clothing, where one size doesn’t fit all. One of the biggest problems for clothing retailers is that they can’t anticipate how well their clothing will fit the consumers walking in the door or ordering online. Technology solutions are the only answer, and it is still quite early. Marrying computer vision with excellent customer experience and advanced analytics may hold the key to success.

Next Steps

Vendors selling products and services into the retail market need to help management understand the value of emerging technologies. They need to be able to demonstrate that advanced analytics and cloud services can lead to disruption and transformation in a highly competitive market. There is a clear need to educate this market and explain the value of emerging technologies, with a step-by-step approach to gain buy-in. Retailers need to think outside the box and understand how they can keep ahead of the competition. It is no longer enough to simply buy a rigid software package. Retailers need to pay attention to the opportunities to beat the emerging retail players with technology weapons.

(Judith Hurwitz) Vendor Strategy Mon, 30 Jan 2017 16:32:44 +0000
Enterprise Clouds Demand Enhanced Security and Availability

Security and availability are top-of-mind IT concerns about enterprise clouds – and transferring enterprise workloads to hybrid clouds.

I’ve said it before, and I’ll say it again – security and availability are not going away just because we’re in the age of cloud computing. Rather, customers must evaluate their on-premises requirements for both security and availability -- and they must take steps to match those levels throughout the hybrid cloud, working together with their cloud service providers (CSPs).

  • Security must be assured for all workloads and data. That can be achieved by means of encryption and user authentication – and by 360-degree planning by IT and business managers that evaluates all aspects of end-user access to data.
  • Availability of data, via the avoidance of data breaches, must be assured, although the technology for achieving high availability (HA) will be different – and more distributed – in cloud computing.
  • All of the largest cloud service providers are including enterprise-level security/availability discussions in their presentations and announcements.
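As a toy illustration of layering access control on top of at-rest protection, consider the sketch below. The roles are invented for this example, and base64 is emphatically NOT encryption; it is a stand-in marking where a real cipher (e.g., AES-GCM via a vetted crypto library) would sit:

```python
import base64

# Hypothetical role grants: who may do what.
ROLE_GRANTS = {"analyst": {"read"}, "admin": {"read", "write"}}

def store(record):
    """Persist a record in protected form. base64 only marks where
    real encryption at rest would occur in a production system."""
    return base64.b64encode(record.encode())

def access(role, action, blob):
    """Enforce role-based access before decoding the stored data."""
    if action not in ROLE_GRANTS.get(role, set()):
        raise PermissionError(f"{role} may not {action}")
    return base64.b64decode(blob).decode()

blob = store("patient-123: lab results")
data = access("analyst", "read", blob)   # allowed
```

The 360-degree planning the bullet list describes is exactly the exercise of deciding what goes into tables like ROLE_GRANTS: enumerating every path from a user to the data, and gating each one.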

Layers of Security

First, let’s look at security. When cloud computing was introduced in the late 2000s, multiple “layers” of security had to be in place before data and applications could be transferred to cloud service providers (CSPs). That is true today -- and enhanced security is now the expectation of customers migrating workloads from the enterprise data center to the cloud.

Assurances about data locality (e.g., data staying within a geographic region) had to be made – to make sure that data could not be compromised or tampered with if it traveled beyond the bounds of geographic regulations. Today, all of the major cloud service providers (CSPs) have cloud data centers within each major region: North America, EMEA, Asia/Pacific and Japan.

Data access has spurred large amounts of redevelopment and re-writing of older application code. Access could be granted by the user “roles” within the organization, determined by job type – and user IDs required better means of access (e.g., better passwords, biometrics, encryption standards).

The need to replicate data across multiple “regions” of cloud computing posed another challenge for cloud service providers. But data has to be replicated – at least three times – in order to ensure availability of data, even if data fails or is corrupted within any one region or domain. Today, three-fold replication is the de facto standard -- and sometimes data is replicated more than that. So, data location and data-copying represent another sphere of attention for data security.
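A minimal sketch of the replica-placement idea follows. The hashing scheme and region names are assumptions for illustration; real CSPs also weigh the data-locality regulations discussed above when choosing regions:

```python
import hashlib

def place_replicas(key, regions, copies=3):
    """Deterministically choose `copies` distinct regions for a key,
    illustrating the three-fold replication rule."""
    def score(region):
        # Stable per-(key, region) ordering via a cryptographic hash.
        return hashlib.sha256(f"{key}:{region}".encode()).hexdigest()
    return sorted(regions, key=score)[:copies]

regions = ["us-east", "us-west", "eu-west", "ap-south", "ap-east"]
replicas = place_replicas("orders-2017", regions)
```

Because placement is a pure function of the key, any node can recompute where the copies live without consulting a central directory, which matters at cloud scale.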


SLAs for Availability

Data availability and application availability are the “companions” of achieving security in cloud computing. Security cannot be assured unless availability considerations have been attended to. Today, there are multiple ways to ensure availability, multiple ways to make sure data will be available, with little delay, when needed.

Traditionally, availability was achieved within a single, scalable system – or through failover within a cluster of servers. Those methods ensured that data was always available on alternate resources (server or storage), in the event of a system failure on any given section of the IT infrastructure. That approach worked well inside an enterprise data center, where specific workloads were assigned to named clusters.

Today, failover has been joined by other computing techniques – cloud-style storage, data replication to multiple “regions” of the cloud and “sharding” of large databases, housing segments of the data across many servers and data stores in the cloud. Sharding is being widely adopted within cloud service provider (CSP) infrastructure – where scale-out databases, running across many server “nodes,” became the norm in recent years. All of these approaches – inside a single system, within a cluster, and distributed across a hybrid cloud – ensure that a full dataset can be recovered, even if specific server or storage devices go offline.
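The sharding idea can be sketched in a few lines. This hash-based mapping is a simplification: production systems often use consistent hashing so that adding nodes moves less data, but the core idea of routing each record to one node by its shard key is the same:

```python
import hashlib

def shard_for(shard_key, num_shards=8):
    """Map a shard key (e.g., a customer ID) to one of `num_shards`
    database nodes. Hypothetical node count for illustration."""
    digest = hashlib.sha256(str(shard_key).encode()).hexdigest()
    return int(digest, 16) % num_shards

# Every lookup or write for "cust-42" goes to the same node, so no
# central directory is needed to find the data.
node = shard_for("cust-42")
```

Recovery works because any one record lives on a known shard and (per the replication discussion above) is copied to multiple regions, so losing a node or even a region leaves the full dataset reconstructable.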

The advent of cloud storage has allowed new types of data storage and availability. Key to effective and efficient cloud storage is advanced systems management software that farms out data to multiple resources. Importantly, advanced systems management software provides a unified view of data resources across a hybrid cloud. Software-defined storage (SDS) often uses these types of systems management software, running on large numbers of virtualized storage devices.


Where You Stand In the Network Determines Your View of the Data

Hybrid cloud computing, linking enterprise and cloud data centers, is being widely adopted by customers. It is a pragmatic step into the world of cloud computing, while retaining legacy and critical systems within the enterprise data center.

That’s why the largest cloud service providers (CSPs) are increasingly focused on optimizing both security and data availability for hybrid clouds.

Most large enterprises have “inherited infrastructure” and aging business applications – some as much as 20 to 30 years old, or more. Many enterprise customers are actively working to simplify that inherited IT, to consolidate existing workloads to fewer systems “footprints” in their data centers. At the same time, they are identifying the workloads that will move to public clouds – focusing on cloud hosting to contain IT costs.

Importantly, no matter where the data exists, it must be discoverable via unified management consoles for network and database administrators. Having a unified view is essential to efficient, highly secure and available hybrid clouds.


Key Takeaways

Today, it’s a given that much of your data is “living” offsite at trusted cloud sites (e.g., Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform, IBM SoftLayer and Oracle Cloud, among others). It may also be housed at CSP partner sites, like Equinix, which provide high-speed links between enterprise data centers and public cloud data centers. All of these providers are taking steps to ensure equivalent, or better, security and availability for hybrid cloud workloads.

Each year, the percentage of all enterprise applications living in the cloud will rise. Some believe that as much as 20%-30% of enterprise workloads are already in the cloud – and that the number will move past 60% over the next five years. These workloads include application development, dev/test, enterprise applications (ERP) and delivery via software-as-a-service (SaaS).

Customers have important decisions to make about security and availability -- and the ways they will achieve higher levels of both security and availability for hybrid clouds. As the migration continues, enterprise customers must think deeply about security and availability requirements. They must explore the security and availability requirements they already have – and then enforce those standards for security and availability by working with public cloud partners.

Awareness of ongoing security and availability requirements represents an industry-wide opportunity for systems integrators (SIs), managed service providers (MSPs) and others working closely with enterprise and SMB customers. They must work to better understand customers’ security and availability expectations – and to support them in new, and innovative, ways.

(Jean Bozman) Cloud Computing Fri, 27 Jan 2017 18:50:38 +0000
IBM Systems' Distributed Computing Initiatives for Hybrid Cloud

IBM Systems’ year-end report to analysts was remarkable in the way that it positioned distributed computing, supporting hybrid cloud and analytics, as an important key to the company’s overall systems strategy. It is also vital for growing installations of IBM hardware and software, and expanding into new accounts.

IBM has long been involved in scale-up and scale-out computing, but its brand is often associated with traditional scale-up systems and storage for data centers.

This year-end report showed how the portfolio, as a whole, enables distributed computing and scale-out computing, to support hybrid cloud, analytics and cognitive computing. With this approach, IBM’s scale-out systems – seen in clusters and cloud services – coordinate with scale-up systems, such as IBM z Systems and scalable IBM storage. Overall, this positioning gives IBM ways to approach new customer sets adopting cloud that might not have considered IBM hardware and software before.

As cited by Tom Rosamilia, IBM senior vice president, Systems, three top examples of this portfolio approach for distributed computing are:

  • Cognitive computing through acceleration of workloads, involving Nvidia GPUs and Power Systems built on POWER9 and OpenPOWER chips.
    • Google has discussed deploying its Zaius systems, based on OpenPOWER and OpenCAPI acceleration, for hyperscale analytics.
  • Blockchain applications delivered on multiple platforms, including IBM LinuxONE systems supporting secure cloud-enabled services.
    • Banks are using LinuxONE systems, blockchain and secure containers.
  • Software-defined infrastructure (SDI), involving all-flash storage in servers and storage devices, and IBM Spectrum storage-management software.
    • Enterprises are using virtualized infrastructure, running on x86 and Power servers in hybrid clouds, and storing data on flash storage.


Hybrid Cloud Deployments

IBM is working to grow its customer set by tapping into end-to-end solutions that tie enterprise data centers – long associated with IBM’s scale-up systems – to cloud-delivered services featuring scale-out systems. These hybrid cloud solutions combine IBM systems platforms with IBM cloud object storage products and IBM Spectrum storage-management software. IBM has provided systems for hybrid clouds for years now, but is emphasizing hybrid cloud solutions based on new and enhanced hardware and software products and services.

It’s worth noting that, many years ago, IBM would have emphasized sales of its own x86 servers as platforms for these hybrid cloud solutions. Given IBM’s 2014 sale of its x86 server group to Lenovo, these cloud-based solutions give IBM another avenue to tap into cloud services running on x86 servers – through IBM’s SoftLayer IaaS services (for x86 and Power), or through other cloud services provided by CSPs and MSPs. Importantly, customers can use IBM’s distributed cloud object storage software for hybrid cloud platforms, which stores and manages data on available resources, spanning multiple data centers.


Cognitive Computing and Analytics

Cognitive computing workloads, along with HPC (high-performance computing) and analytics programs, provide more ways for IBM’s Power Systems business to grow beyond its traditional AIX/Unix installed base.

IBM’s strong support for Linux, across IBM Systems platforms, will help drive this growth in cognitive computing workloads. Today, a substantial slice of IBM’s Power business is Linux-on-Power systems running HPC or SAP analytics. Likewise, LinuxONE systems support cognitive computing workloads. Rosamilia told analysts that Power deployments for SAP/HANA real-time analytics are spurring growth in new and existing IBM accounts.

Although the Systems call did not dwell on it, IBM’s market initiatives in Watson for cognitive computing and accelerated analytics can be other elements of an overall IBM cognitive solution for enterprise customers. Of course, IBM will not be alone in providing these types of solutions. We expect to see competition from Dell EMC, HPE and Oracle in these accelerated analytics workload opportunities.



There are many more technical details underlying the distributed computing strategies for cloud computing, hybrid cloud, analytics and cognitive that could not be described here, in a short blog item. There are also many distributed-computing partnerships with companies like Nvidia (for its GPU accelerators and high-speed NVLink); Xilinx (for co-processors) and Mellanox (for high-speed interconnects) – and the OpenCAPI interconnect consortium.

This blog takes the view that it is not the technology alone that gives momentum to this distributed computing initiative. Instead it is having an IBM Systems strategy centered on workloads, connectivity, and end-to-end solutions across infrastructure that is enabling a broader reach for IBM’s Systems point-products.

In short, IBM is no longer speaking of its systems platforms in isolation, pushing the technical specifications of each as the primary reasons for customers’ consideration of new Systems products. By supporting, and building on, distributed models for computing, IBM is showing the business value of its platforms in the broader world of end-to-end workloads spanning corporate and cloud data centers.


(Jean Bozman) Cloud Computing Fri, 06 Jan 2017 18:38:17 +0000
2017 Predictions from the Hurwitz Team

Last year we predicted that it would be a year of consolidation and a time for businesses to absorb emerging technologies. We think that this trend will continue.

This year we are digging in around seven different areas. Each area will have three predictions from our impressive analyst team. It is a lot to look at but we are hopeful that you will find these predictions interesting and insightful. We want your feedback. Engage in a dialog with us in the new year.

Getting the job done in 2017

The focus for 2017 is on getting the job done. Therefore, the key trends will be around pragmatic ways of implementing emerging and innovative technologies in a way that delivers fast, predictable, and well-managed solutions. It may not sound sexy but the pressure is on businesses to deliver differentiated services to customers faster and more effectively than ever.

We have divided our 2017 predictions into seven areas:

DevOps -- Making DevOps the way to continuously change and manage applications to support business change

Analytics & Cognitive -- Transforming data in new ways to improve the customer experience, and leveraging data to develop solutions to problems that have been unsolvable in the past

Hybrid Cloud -- Providing the appropriate delivery models to satisfy customers

IoT -- Leveraging IoT data to deliver game changing results for industrial businesses

Servers & Storage -- Making infrastructure scale to the needs of the business

Security -- Creating environments that are well protected with advanced security and governance

Social Business -- Bringing digital business to a new level of maturity will change how businesses interact with their best customers.


What our analysts have to say:

Dan Kirsch on DevOps

DevOps began as an experiment – fusing development organizations with operations. Now many organizations have adopted DevOps to help meet the increasing expectations of both internal business users and end-user customers. Below are our 2017 predictions for DevOps:

Prediction #1. As IT organizations move towards an IT-as-a-Service model, a marketplace of tools for developers supported by standardized APIs will change the way developers must collaborate with IT operations. IT organizations will need to support the requirements of the business, or business units will continue to go around IT and procure their own services.


Prediction #2. Containers and microservices are becoming a pervasive approach to developing and deploying more flexible and agile applications. We expect a microservices industry to emerge in 2017.


Prediction #3. Open source vendors will have to move from a support model to a solutions creation model if they are to survive. Too many open source vendors are unable to gain enough revenue to sustain themselves in a highly competitive market. We expect those that only provide support without a significant ecosystem of partners and solutions will not survive the year.


Dan Kirsch on Analytics & Cognitive

Just a few years ago, analytics experts and data analysts were trying to convince business users that predictive analytics could be trusted. The market has shifted, and now C-level executives are asking how analytics can be applied in nearly every part of the business. Below are our 2017 predictions for analytics and cognitive computing.

Prediction #1. The emergence of embedded analytics in a broad array of applications will significantly impact the growth of the standalone analytics tools market. Analytics are being built into more and more business and consumer applications. Embedded analytics allow users to make fast, data-driven decisions without the need to run complex queries, seek assistance from data analysts, or move to a separate BI/analytics application. Therefore, vendors of standalone tools will have to take an industry focus to differentiate themselves.
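To make the idea concrete, here is a minimal sketch (plain Python; the application, names, and numbers are all hypothetical) of analytics embedded directly in an application’s own workflow, so the user never has to leave for a separate BI tool:

```python
# A hypothetical order-entry application that surfaces a data-driven
# reorder suggestion inline, rather than sending the user to a BI tool.
from statistics import mean

def reorder_suggestion(weekly_sales, on_hand, lead_time_weeks=2):
    """Suggest a reorder quantity from recent weekly sales history."""
    avg = mean(weekly_sales)                  # average weekly demand
    projected = avg * lead_time_weeks         # demand during the lead time
    shortfall = projected - on_hand
    return max(0, round(shortfall))

# The app embeds the result directly in its own screen or workflow:
qty = reorder_suggestion(weekly_sales=[120, 95, 130, 110], on_hand=150)
print(f"Suggested reorder quantity: {qty}")
```

The analytics here are trivial by design; the point is the delivery model – the insight appears inside the business application itself, not in a standalone tool.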


Prediction #2. Artificial Intelligence and machine learning algorithms are becoming pervasive across both horizontal and vertical applications. The initial impact will be subtle (i.e., better targeting of customers). The significant impact will come from cognitive solutions that provide breakthrough solutions to difficult problems in vertical markets such as healthcare, manufacturing, and finance.  


Prediction #3. The role of the chief data officer is going to become increasingly strategic in companies. These professionals will be in charge of driving digital transformation as companies realize that data is the only way to take on emerging threats.



Judith Hurwitz on Hybrid Cloud

It is clear that the cloud market is maturing. Customers are beginning to understand the use cases for public versus private clouds in a much more pragmatic way than only a few months ago. We are seeing public cloud vendors moving to offer their cloud services on premises. We are also seeing private cloud vendors branching out to public clouds. We expect to see a lot of change in hybrid cloud in 2017.


Prediction #1. Businesses are considering private cloud services that are not connected to the internet because of security fears. 2017 will be a year of contradictions and fear. Companies with sensitive workloads will decide to disconnect from the internet through a private cloud appliance.


Prediction #2. While security has always been the top issue for companies moving to the cloud, a larger percentage of businesses will select vendors based on their security infrastructure. This could be the year of hybrid cloud security.


Prediction #3. Finally, we will see a large number of hybrid cloud management solutions hit the market. As more businesses increase their reliance on a variety of cloud-based infrastructure, platform, and application services, the need to understand which services are used, and how, will grow. Management will expand to cost considerations and capabilities based on the use case.


Judith Hurwitz on IoT

While sensors and their semi-structured data have been around for decades, we are finally getting to the point where IoT data is being coupled with high-powered clustering technologies such as Spark and scalable converged systems. Applying advanced analytics through maturing tools is beginning to turn a collection of data elements into a sophisticated data analytics platform that can be applied to real solutions for pragmatic problems. We expect to see a slew of solutions ranging from smarter traffic to smarter factories and smarter device management. Given this evolution, what do we expect to happen in 2017?
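The core computation behind such platforms – windowed aggregation of raw sensor readings – can be sketched in a few lines of plain Python (standing in for a Spark-style streaming job at scale; the device names and readings are hypothetical):

```python
# Aggregate raw (device, timestamp, value) sensor readings into
# per-device averages over fixed time windows -- the basic building
# block of real-time IoT analytics pipelines.
from collections import defaultdict

def window_averages(readings, window_secs=60):
    """Average each sensor's readings per time window.

    readings: iterable of (device_id, epoch_seconds, value) tuples.
    Returns {(device_id, window_start): average_value}.
    """
    sums = defaultdict(lambda: [0.0, 0])
    for device, ts, value in readings:
        key = (device, ts - ts % window_secs)  # bucket by window start
        sums[key][0] += value
        sums[key][1] += 1
    return {k: s / n for k, (s, n) in sums.items()}

stream = [("pump-1", 0, 20.0), ("pump-1", 30, 22.0), ("pump-1", 61, 30.0)]
print(window_averages(stream))  # two one-minute windows for pump-1
```

A production system would run this same logic continuously and in parallel across a cluster; the value comes from coupling these aggregates with the advanced analytics described above.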


Prediction #1. Our first prediction for IoT is that the growth will be in the industrial sector rather than the consumer sector.


Prediction #2. We expect to see a huge number of startups focused on real-time data for the IoT market. These startups will primarily focus their attention on the Spark framework as the foundation for their offerings.


Prediction #3. There will be more and more attention paid to the security side of IoT, given some of the opportunities for mischief – e.g., hacking into cameras and sensor-based devices. While there will be few startups in the IoT security market this year, we expect to see security offerings customized for IoT from established security vendors.


Jean Bozman on Hardware

Servers and storage vendors find themselves in a highly competitive marketplace, with many new entrants in the hardware marketplace selling systems at commodity prices. This dynamic is causing large systems vendors to offer built-in differentiated features, and to highlight their services for IT transformation. We are seeing more integrated systems that combine servers and storage into converged systems for quick installation, workload consolidation and IT simplification. We are seeing more systems with automation to speed deployment, improve IT flexibility and reduce operational costs. In 2017, the traditional categories of servers and storage are giving way to definitions based on system capabilities to support specific workloads.

Prediction #1. More servers, storage devices and converged systems (combining servers and storage) will go all-flash in 2017.

Flash-enabled systems – those using solid-state disks (SSDs) in place of hard-disks (HDDs) – will see rapid growth in 2017. Using flash storage in these systems will give them high performance and low latency, while reducing data-center space requirements. This impacts business workloads, databases and analytics, providing faster time to results than older hard drives using mechanical, spinning disks. In 2017, more systems will go “all-flash.” Drivers for the all-flash designs include: reduced price per storage device, increased durability and more converged or hyper-converged systems using flash storage.


Prediction #2. Management software will become a key differentiator for systems vendors.

Orchestration and management of workloads are the key software tools that make software-defined infrastructure (SDI) a practical option for enterprises. Offering customers an integrated platform, with virtualized hardware, advanced management software and automation, helps customers to transform their data centers and achieve greater IT flexibility and operational efficiency. In 2017, the prospect of gaining unified views of datacenter systems – and better IT staff productivity – will drive more vendor focus on easy-to-use management software.


Prediction #3. High Performance Computing (HPC) and analytics are converging – via a common infrastructure of Linux servers.

Inside enterprise data centers and cloud data centers, HPC and analytics are being deployed on the same infrastructure – usually running Linux on low-cost, scale-out servers. Many customers are finding that HPC data, such as weather data, can inform broader analytics workloads—improving the accuracy of business forecasts for retail store sales, oil/gas exploration, agriculture and manufacturing. By running HPC and analytics on the same hardware, IT personnel can manage both workloads, redirecting processing to available servers and storage. In 2017, more customers will find that leveraging HPC data in analytics brings better business outcomes.


Chris Christiansen on Security

For 2017, passengers on cybersecurity flights had better fasten their seat belts low and tight, because we expect severe turbulence. The 45th President of the United States (POTUS) is going to demand that technology firms cooperate with law enforcement and government intelligence agencies. This will set a precedent for governments worldwide, but their balkanized interpretations will be driven by unique social, political, and religious situations. IoT will explode as both a target and a vector for security threats. In a dramatic reversal, the public cloud will become a refuge for many enterprises and SMBs that are outgunned by their attackers and unlikely ever to achieve parity, let alone superiority.

Prediction #1. The 45th POTUS explodes the cybersecurity market. Data security becomes critical as law enforcement and government agencies get carte blanche for internal and external mass surveillance. Privacy laws are Balkanized according to religious, social, and political beliefs by the US and other countries. Governments increasingly regulate encryption and source code. Major countries use social network profiling to control immigration, business licenses, job prospects, and social services.


Prediction #2. IoT drives exponential growth in security threats. Consumer and surveillance cameras spawn record-breaking DDoS and DNS attacks. National security is threatened as cyber war against critical national infrastructure heats up old and cold wars.


Prediction #3. Cloud becomes the secure refuge for companies worldwide. Next-generation (N2G) security radically alters infrastructure as appliances are virtualized and hybridized across private and public cloud-based data centers. Except at the high-end enterprise and carrier level, hardware becomes a Kleenex-like disposable. Many small businesses find security too difficult and expensive, so they eliminate on-premises appliances and move entirely to cloud-based managed and SaaS security services for predictable costs, lower risk, and a better user experience.

Vanessa DiMauro, Predictions for Digital Transformation

The digital discipline is maturing, and practitioners are becoming more accountable and more tightly integrated into the lines of business. The age of experimentation with digital is ending. There is a clear imperative to reach customers earlier in their buyer journey through online strategies. Below is what I foresee for marketing leaders, digital officers and community builders everywhere.

Prediction #1: A big shift toward selective customer intimacy

Companies are beginning to understand that all customers are not created equal. We predict a huge movement towards targeted digital strategies, with campaigns designed to meet the needs and expectations of important customers. In 2017, we’ll see a dramatic rise in private customer communities, velvet-rope customer retention programs, and tiered social selling programs focused on “elite” customers.


Prediction #2: The end of acceptance of vanity metrics and unclear digital outcomes

Increasingly, stakeholders expect digital transformation leaders to provide a clear return on their investment. While 2016 marked the dawning of digital accountability, 2017 will bring formalized accounting standards and practices for accurately measuring digital business across industries (think GAAP for digital accounting).


Prediction #3: Digital impact will become strategically integrated into core operations. It is now clear that digital campaigns have to move beyond “Facebook likes”. The focus for 2017 is moving to a digital transformation strategy that supports the firm’s strategic directions.

(Judith Hurwitz) Uncategorized Thu, 15 Dec 2016 17:16:44 +0000
The Lie of Digital Transformation

I’m not going to mess around. If I hear one more person talk about a digital transformation strategy I think I might scream. But I digress. What do company executives really mean when they say that they want to move to a digital transformation strategy? I think that if you talk to 10 executives you will probably get 10 different answers. The answer to the question is a lot more complex than simply expecting your company to be the “uber” in your market.

So, let me get to my point. Digital transformation really means a forward-looking strategy. Many well-established companies – whether in transportation, manufacturing, retail, or electronics – have a difficult time changing. The answer isn’t that mysterious. How can you take your intellectual property, your customer base, and your established brand in the market and translate it into a new generation? It turns out to be more difficult than it looks. Think about the companies that come up first on the radar when discussing digital transformation. They are companies like Uber, Airbnb, and Amazon. They all have one factor in common: they came to the market with no legacy products and no revenue tied to those products. They didn’t have to invest in physical inventory or complex infrastructure to support their customers and their business. This is both a strength and a weakness. There are hundreds, maybe thousands, of digital transformation companies that fail, and fail hard. The ones that succeed do so because they have mastered the management of complex data.

Does that mean that traditional companies are lost in an era of digital transformation? Do these companies have to throw it all away and start from scratch? No. I maintain that to be successful traditional business leaders have to do two things. First, they need to take a hard look at what has made them successful in the past. What products and services do customers love? What is the underlying intellectual property that has taken the company this far? Who are the customers and what are they looking for next that they can’t get today? Second, businesses need to be willing to walk away from products or services that do not produce growth. Just because you have always offered a particular product doesn’t mean that it will sustain you into the future.

True transformation requires bravery and imagination to see beyond your traditional ways of doing business. It is not a quick answer to a complex problem. Transformation requires a roadmap that moves you forward in order to take your knowledge and codify it into data. Transformation means that you have to take risks that are often outside of your comfort zone. Don’t simply ask customers what changes they want to see in existing products. Help them to work with you and look beyond what exists to what is possible.

If you take the roadmap steps of moving forward with data and out-of-the-box thinking, then you can begin your journey towards transformation. Ironically, established companies with well-known brands, loyal customers, and an innovative set of ideas can leapfrog the competition if they follow the data. By following this path you will be prepared to future-proof your business and become a disruptor, not a victim.

(Judith Hurwitz) Vendor Strategy Tue, 13 Dec 2016 20:09:44 +0000