For those who may have noticed, I should explain my long absence from this blog. For the better part of this year my team and I have been “heads down” preparing for and executing the introduction of the new IBM PureData System. Not having much time to spare was only part of my excuse. The real reason was a lack of energy and inspiration to write even one more piece beyond what was needed for the launch and for the IOD 2012 Conference last week.
Now that both events are behind us, it is time for me to get back on track…
PureData System is the newest member of the IBM PureSystems family of expert integrated systems I wrote about in April. It is offered in 3 models that deliver optimized performance for transactional, analytic and reporting, and operational analytic workloads. As an expert integrated system, each PureData System model is integrated software, hardware and built-in expertise that simplify the entire system life cycle – from procurement through retirement.
PureData System provides an efficient, high-performance and highly scalable data platform – delivering the data services needed for different types of transactional and analytic application workloads. Delivering those services well requires software and hardware that are designed, integrated and tuned specifically for each workload type. Typically, organizations spend valuable time and resources designing systems from general-purpose components and then procuring, integrating, configuring, tuning, managing and maintaining each system for its specific use. PureData System dramatically reduces the time, cost and risk of deploying and maintaining these systems.
- PureData for Transactions: integrates DB2 pureScale to deliver highly available, high-throughput transactional database clusters that scale easily without the need to tune the application or database. This PureData System is available in 3 size configurations and can be used to consolidate more than 100 database servers.
- PureData for Analytics: is powered by Netezza technology and is the newly enhanced replacement for the Netezza 1000 (formerly known as TwinFin). It is optimized for simplicity and performance in analytics and reporting data warehouses. This new model delivers 20x the concurrency and throughput for tactical queries compared with the previous generation of Netezza technology, and offers the industry’s richest library of in-database analytics functions.
- PureData for Operational Analytics: integrates InfoSphere Warehouse software for operational data warehousing that can support continuous data ingest and more than 1,000 concurrent operational queries, while balancing resources for predictable analytics performance. It also delivers DB2’s adaptive compression, which clients have used to achieve up to 10x storage space savings. This PureData System model is a new generation that replaces the Smart Analytics System 7700.
And if that were not enough, we have also integrated the power and simplicity of Netezza technology with the reliability and security of System z to deliver cost-efficient, high-performance analytics and operational analytics on data managed by DB2 for z/OS. System z clients now have the opportunity to greatly simplify and reduce the cost of analyzing their most critical business data.
- DB2 Analytics Accelerator: The same Netezza technology that powers the PureData System for Analytics, also powers the newly enhanced DB2 Analytics Accelerator which integrates with DB2 for z/OS for high performance analytics – without modifying applications or the database. The new High-performance Storage Saver capability reduces demand on System z storage space without sacrificing performance.
- zEnterprise Analytics System: combines the new zEnterprise EC12 and DB2 Analytics Accelerator for a hybrid system that merges capabilities optimized for different workloads in a single, highly reliable, and secure system. The zEnterprise Analytics System 9700 and 9710 models have now replaced the Smart Analytics System 9700 and 9710.
That’s a good (re-)start… I will save my IOD 2012 recap for next week to make sure I get back to my weekly pace.
PS. My thoughts and prayers are with all those still suffering the effects of Sandy.
Here is another question where conventional wisdom about “the right answer” has been proven wrong: can IBM System z be the best solution for data warehousing and analytics? Through much of my early career in the database software and systems business, the debate raged over the performance and price-performance implications of using System z for analytics workloads. Recent client stories I’ve heard tell me that the advances delivered in DB2 10 for z/OS, and the Netezza-powered DB2 Analytics Accelerator, have firmly answered the question.
For those who have not heard of DB2 Analytics Accelerator, it is a Netezza data warehouse appliance that integrates directly with DB2 for z/OS such that deep analytics queries are routed to it without any need to alter the application. Transactional and operational queries are handled by DB2 as usual, and all data remains under the industry’s highest levels of security and availability.
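Conceptually, this routing works like a cost-based dispatcher sitting in front of two engines: short transactional lookups stay in DB2, while long-running, scan-heavy queries go to the accelerator. The sketch below is purely illustrative – the heuristic, threshold and table size are invented for the example, and the real DB2 for z/OS decision logic is far more sophisticated:

```python
# Hypothetical sketch of transparent query routing. The application issues
# SQL as usual; a dispatcher decides where each query runs using a crude
# cost heuristic. Names and numbers here are assumptions for illustration.

def estimate_scan_rows(query: str) -> int:
    """Toy cost estimate: a query without a WHERE clause is assumed to scan
    the whole (hypothetical) table; a filtered query touches far fewer rows."""
    TABLE_ROWS = 10_000_000  # assumed table size, for illustration only
    return TABLE_ROWS if "WHERE" not in query.upper() else 1_000

def route(query: str, threshold: int = 100_000) -> str:
    """Send scan-heavy analytic queries to the accelerator; keep short
    transactional queries on the primary engine."""
    return "accelerator" if estimate_scan_rows(query) > threshold else "db2"

print(route("SELECT SUM(amount) FROM sales"))               # scan-heavy aggregate
print(route("SELECT balance FROM accounts WHERE id = 42"))  # point lookup
```

The key point the sketch captures is that the application never changes: it submits the same SQL either way, and only the execution location differs.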
Also, you should know that the Smart Analytics System models 9700 and 9710 are integrated offerings that include Cognos BI, InfoSphere Warehouse and DB2 for z/OS software on a zEnterprise 196 or 114, respectively.
If you are finding it hard to believe this is a real change in the game, consider the following client examples from our Banking Industry team:
European Bank Group adds IBM DB2 Analytics Accelerator to System z over Exadata
This banking consortium has Oracle-friendly IT teams and had invested in an Exadata system last year. They were considering moving BI workload to the Exadata system, but the IBM team demonstrated the benefits of a BI infrastructure based on IBM System z with the DB2 Analytics Accelerator. The client chose the IBM solution.
Federal tax authority chooses IBM Smart Analytics System 9700 after DB2 10 for z/OS blows away Oracle in a performance benchmark
A benchmark between Oracle Database and DB2 for z/OS was the first step in this decision process: DB2 proved to have 10 times better performance in the benchmark. In addition to superior performance, other decision factors for choosing IBM Smart Analytics System over Oracle included:
- An end-to-end solution, including comprehensive data warehousing and business intelligence software
- Reliable hardware
- In-depth services that will support deployment and operation of the new platform
IBM System z selected over Teradata at one of the world’s oldest banks
This bank needed an integrated data warehousing solution for corporate, financial, and marketing information across the bank to reduce costs, improve revenue and drive better profitability. Factors in choosing IBM System z over Teradata included:
- Significant savings in hardware, software, operating and people costs
- Faster time to value with a reduction in the time required to deploy Business Intelligence solutions
- Industry leading scalability, reliability, availability and security
- Simplified and faster access to the transactional and operational data on System z
North American Bank moves off Teradata in Favor of IBM Smart Analytics System
Teradata was the warehousing standard at this bank and its team had a misconception that IBM System z was not leading-edge technology or the most cost effective solution. Fortunately, the team also had open minds and a desire to find the data warehousing and analytics solution that delivered the best value for their business. The result: a transition from Teradata to an IBM Smart Analytics System powered by System z.
Never say never
Now don’t get me wrong. I am not saying that System z is the best analytics system choice for all clients in all situations. I am saying that you should not assume it isn’t the best choice for you and your situation. Make business decisions based on the reality of today’s facts, not based on outdated misconceptions.
I am starting this post at 31,000 ft on my way home from Atlanta, where I spoke at another CIO Forum and Executive IT Summit. I also spoke at one in Seattle earlier this year. These are well-run events specifically for CIOs and senior IT executives. I really enjoy hearing the exchanges among these peers, who face many similar challenges regardless of what their companies do. The exchanges are often very productive, with cards traded for follow-on actions or continuation of the discussion.
Full disclosure: IBM sponsors this event series, which will be held in 16 cities across the US and Canada in 2012. The keynote discussion is about Big Data challenges and the new technologies for tackling them. We also do a follow-on session on either Evolving Data Warehouse Architectures or Information Integration and Governance. Today I was talking about the evolving architectures… many of the points I’ve shared in my earlier posts.
Feedback has generally been positive, but even more so today, I thought. I had discussions with IT leaders from higher education, marketing services, financial services, health care, and IT consulting services. They all were talking about the need to evolve their environments to gain more insights from more types of information, and deliver it to business users faster. It made me feel good about the focus we currently have with our clients.
But it also made me think that we are not too far from having to figure out the next chapter in the story. The only way to be among those leading the way forward, is to always be scouting ahead to see what is over the horizon. The great thing about the computing business – at least over my career so far – is that we are never done inventing the future.
Note: Another disclosure… I am actually finishing and posting this a week later. The holiday weekend with family and friends was too good; the need to connect and post this flew right out of my mind.
I was at a family gathering this weekend where I had the opportunity to talk to a friend who is also now in a leadership role at a large technology company. Our discussion about the growing availability and use of information reinforced my belief that we are experiencing an exciting evolution in the business world that is having a profound global impact. But most of our discussion was not about technology; it was about global skills availability and growth. I had a very similar conversation the day before with one of my neighbors, who is also in a technology leadership position.
Thinking about this topic on the drive in this morning, I recalled that I am overdue in publicly welcoming new members of our IBM Champions program. IBM Champions are IT professionals and educators who make significant contributions to their communities: evangelizing IBM solutions, sharing technical knowledge and expertise, and growing and nurturing independent communities.
We recently expanded the set of Information Management Professionals in the IBM Champion program to 158 Champions across 28 countries. My thanks to all of these folks who contribute their time and talent to strengthen and expand their professional communities.
Given the global need for these skills, and the need to develop individuals for well-paying jobs, there is a tremendous opportunity for educators and experienced mentors to help grow a new generation of Information Management professionals and data scientists.
P.S. For anyone noticing the drop off in my blogging pace lately, we can blame spring fever and my need to spend time at family events… and on my yard and golf game. (I have lots of work to do on the latter to keep up with my son!)
It’s been a couple of weeks since my last post. I have to admit that after all the material I wrote, reviewed, and edited for two recent product launches – DB2 & InfoSphere Warehouse 10 and IBM PureSystems – I needed a bit of a rest. And before I forget – kudos and thanks to the not-so-small army of folks from the organization I lead, and from across the many IBM organizations, who generated all that material and drove the launch activities around the world.
I already wrote about DB2 and InfoSphere Warehouse 10, and set up the PureSystems news with a post about Expert Integrated Systems. I wanted to save my post about IBM PureSystems to comment on the announcement experience and reaction. I was fortunate to be in London for discussions with press, analysts, partners and clients who attended our announcement event. For us product marketing types – nothing beats seeing and hearing the immediate reaction to the product story we have helped develop over months (or even years).
A new era of computing with Expert Integrated Systems
I have to say I am generally pleased that the value of this new class of systems was understood by those who attended our events around the world and online. I am also not at all surprised that the reaction of many is: “I will need to see this for myself; it sounds too good to be true.” This launch experience reminds me of the 2001 launch of eclipse.org and the WebSphere Studio portfolio built on the then-new Eclipse technology.
At the time, most were skeptical of the future we painted, in which development and operations tools would be built by an ecosystem of providers to “plug in” to a common platform – creating a new level of cost efficiency and team productivity. A couple of years later, no one was questioning that this new approach had changed the game and set a new level of expectations among clients.
I predict that in a year or two, no one will be questioning the new level of cost efficiency and productivity that results from using systems that come with, and easily plug in, expertise from IBM, a growing ecosystem of solution providers, and client IT teams themselves. These systems are designed to improve the experience and economics of IT – simplifying and speeding solution development, deployment and ongoing management – in both traditional and cloud computing environments. They go beyond the level of bundling and integration we have seen in the industry thus far.
Unlocking the resources needed to make the next leap forward in business computing
One question I heard a lot while introducing IBM PureSystems was: “So does this built-in expertise and automated deployment and management mean this is about helping clients cut IT staff?” For those of you with 18+ months of project backlog; or among the 23% reporting projects behind schedule, over budget or both; or unable to find enough expert skills to keep up with the demands of your business – you know this is about doing more with the resources you have. IBM PureSystems are designed to free your valuable resources – people and money – from mundane, repetitive and error-prone tasks, so they can focus on delivering the new value that business leaders are demanding.
Discussing this topic with folks reminded me of a Curt Monash post about the future of enterprise application software. Curt expands on a Sarah Lacy post about the change due in enterprise applications. Several of the points Curt and Sarah make line up with topics I have already covered about the implications of the new era of data management we have entered. Curt cites several factors driving updates to applications:
- better integration with communications technology
- better integration with analytics technology
- better use of different kinds of data – e.g., machine generated
- integrating social software
- taking advantage of software as a service
IBM PureSystems offer a set of capabilities that can help our clients accelerate this move (if not leap) forward – both by freeing resources from today’s burdens and by helping them accelerate new innovations and realize the benefits of cloud computing.
Over 100 years of innovations and still going strong
IBM started over 100 years ago when business machines were scales and time clocks. Since then IBMers have given the world the general purpose business computing system, the Automated Teller Machine (ATM), the Universal Product Code (UPC “bar code”), and many other innovations that have helped transform businesses in all industries. I believe that some future writer will include Expert Integrated Systems on their list of transformative innovations from IBM.
I have been writing a lot about the different data management systems organizations can use today to get the most value from available information. But there are other capabilities that complement these systems and are also important for delivering top performance, reliability, security and cost efficiency.
For example in the areas of:
- Speeding time to market and enhancing compliance by modeling data assets and automating database design processes
- Empowering developers to write high-quality code more rapidly using optimized frameworks
- Centralizing database health monitoring and job management
- Tuning queries for best performance
- Using reliable change management to alter objects, permissions and dependencies
- Automating upgrades and data migrations
- Improving cross-team collaboration with processes that span database, application and data access requirements
- Increasing data quality and consistency with shared policies, models and metadata
- Simplifying team integration with a common, integrated environment for multiple roles
To help organizations balance these increasing information-driven demands, IBM provides a set of solutions to optimize performance, improve availability, increase productivity and maximize the efficiency of their data environments – for DB2, IMS and Informix across all system platforms. Given this week’s announcement, I should call out…
These solutions help organizations better manage their daily challenges and simplify:
- Database Development and Administration
- Performance Management and Optimization
- Problem Recovery and Resolution
Many of these capabilities are included in the advanced editions of DB2 and InfoSphere Warehouse at a very attractive price. If you are interested in how this value stacks up, consider these reported averages:
- Average 41% lower license and support costs
- Average 20% lower infrastructure costs, including server and storage systems
- Average 26% personnel investment freed to work projects that deliver new business value
- Net result: an average 37% lower combined costs
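How category-level averages like these roll up into a single combined figure depends on how a budget splits across the categories. As a purely hypothetical sketch (the weights below are assumptions for illustration, not data behind the figures above), the blending works like this:

```python
# Hypothetical illustration of blending per-category savings into one
# combined figure. The budget weights are invented for this sketch.

savings = {"license_support": 0.41, "infrastructure": 0.20, "personnel": 0.26}
weights = {"license_support": 0.45, "infrastructure": 0.25, "personnel": 0.30}  # assumed budget split

# Weighted average: each category's savings scaled by its share of spend.
combined = sum(savings[k] * weights[k] for k in savings)
print(f"Combined savings: {combined:.0%}")
```

With this assumed split the blend comes to roughly 31%; a different cost mix yields a different blended number, which is why any combined figure reflects the spending profile of the deployments actually measured.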
Another good read is “Managing the Data Lifecycle” by Holly Hayes.
So basically I am saying that you should use all the best tools available to you to:
- Get the most value from available information
- Optimize your environment for performance, reliability, security and cost efficiency
- Empower your team to deliver greater value to your organization
For a product segment that was supposed to be a boring commodity, there sure is a lot of excitement about new data systems this year. OK, to be fair, the excitement about data systems is among those of us in the information technology business who appreciate what these systems do for an organization. And what they do for our clients and partners is…
Save valuable time and money
Pick your favorite demand(s) – e.g., handle the explosive growth of data, analyze big data, do more with less – it all boils down to doing more, faster and for less money.
As I have been writing, picking the right type of system for each of your data management challenges has a big impact on your ability to achieve those goals. The same holds for picking the right specific system once a relational database or warehouse is the best choice for your needs.
The idea that these are “commodity” systems may have some merit for applications with modest requirements. But when an organization’s operations and competitive advantage depend on them, the performance, simplicity, and cost differences among your choices can have a tremendous business impact.
I invite you to learn more at our virtual conference event:
“The Future of Database and Data Warehouse Software”
- An industry perspective from analyst Colin White, President of BI Research
- A tour of new features from Tim Vincent, CTO, IBM Information Management
- The business impact of doing more, faster, for less from an early access program client (video and case study)
- Several sessions about new feature details from the product teams
- An opportunity for live Q&A with experts at the virtual expo hall
A quick sample of the new capabilities that have caused excitement among early access clients, partners and even industry analysts:
- Performance boost for software that was already a market leader – an early access partner measured approx. 10x faster warehouse queries than the previous version
- New Multi-Temperature Data Management and Adaptive Compression, which advances leading data compression capabilities to free space for growing data – overall compression of up to 90% seen among early access program clients and partners
- Enhanced availability, simplified scalability and geographically dispersed clusters with enhanced DB2 pureScale
- Easier integration of information and insights from non-relational (NoSQL) data – simplified integration with Hadoop-based systems and new built-in RDF graph data management, alongside the native XML data management, pureXML, introduced in DB2 9
- New Time Travel Query eliminates the costly application code currently required to get answers as of any point in time – past, present or future. (Those of you already on DB2 10 for z/OS should be familiar with this feature.)
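To make concrete the kind of application code a time-travel feature eliminates, here is a hypothetical sketch of hand-rolled point-in-time logic (illustrative Python, not DB2 SQL; the table and helper are invented for the example). Each record version carries its own validity period, and every “as of” question means filtering on it by hand:

```python
# Hypothetical app-side versioning logic that temporal (time travel) query
# support makes unnecessary. Each version of a record carries a
# [valid_from, valid_to) period maintained by the application.
from datetime import date

price_history = [
    {"sku": "A100", "price": 9.99,  "valid_from": date(2011, 1, 1), "valid_to": date(2012, 1, 1)},
    {"sku": "A100", "price": 11.49, "valid_from": date(2012, 1, 1), "valid_to": date(9999, 12, 31)},
]

def price_as_of(history, sku, as_of):
    """Return the price in effect for `sku` on date `as_of`, or None."""
    for row in history:
        if row["sku"] == sku and row["valid_from"] <= as_of < row["valid_to"]:
            return row["price"]
    return None

print(price_as_of(price_history, "A100", date(2011, 6, 15)))  # 9.99
print(price_as_of(price_history, "A100", date(2012, 6, 15)))  # 11.49
```

With temporal support in the database, the engine maintains the period columns and performs the period filtering itself – a query simply asks for data as of a date – so this bookkeeping disappears from the application.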
And for those of you hungry for more product news excitement – and for ways to do even more, faster and for less – you should attend another important IBM event on April 11 to learn about new Expert Integrated Systems from IBM.