Storage Archives | eWEEK
https://www.eweek.com/storage/

Pure Storage's Justin Emerson on Analytics Performance and Flash Storage
https://www.eweek.com/big-data-and-analytics/pure-storages-analytics-performance-and-flash-storage/ (October 11, 2023)

I spoke with Justin Emerson, Principal Product Manager and Technical Evangelist for Pure Storage, about how high-speed flash storage enables better data analytics performance.

Among the topics we discussed:

  • What’s a major challenge that companies face with their data analytics practice? For all the effort and expense, what’s still holding companies back?
  • How can companies solve this issue? Is there industry momentum toward any solutions?
  • How does Pure Storage support the analytics practices of its clients?
  • What about ESG and the issue of data storage, with the shift from hard disk to flash? What are the implications for issues like power use and sustainability?

Listen to the podcast:

Also available on Apple Podcasts

Watch the video:

5 Mistakes to Avoid in a Data Storage Refresh
https://www.eweek.com/big-data-and-analytics/data-storage-refresh/ (August 9, 2023)

As data storage technology has evolved to offer more choices for different use cases—the flavor of the day is AI-ready storage—determining the right path for a data storage refresh requires a data-driven approach.

Decisions about new data storage must also factor in user and business needs across performance, availability and security. Forrester found that 83 percent of decision-makers are hampered in their ability to leverage data effectively by challenges like outdated infrastructure, overwhelmed teams drowning in data, and a lack of effective data management across on-premises and cloud storage silos. Leveraging cloud storage and cloud computing, where AI and ML technologies are maturing fastest, is another prime consideration.

Given the unprecedented growth in unstructured data and the growing demand to harness this data for analytical insight and AI, the need to get it right has never been more essential. This article provides guidance on that topic by highlighting what not to do when performing a data storage refresh.

Also see: Top Cloud Companies

Mistake 1: Making Decisions without Holistic Data Visibility

When IT managers discover that they need more storage, it’s easy to simply buy more than they need. But this may lead to waste and/or the wrong storage technology later.

A majority of data—typically around 80 percent—goes cold within months of creation, no longer actively used yet still consuming expensive storage and backup resources. And given that you can now purchase additional storage instantly and on demand in the cloud, and with storage-as-a-service on-premises, there is no reason to overprovision.

To avoid this common conundrum, get insights on all your data across all storage environments. Understand data volumes, data growth rates, storage costs and how quickly data ages and becomes suitable for archives or a data lake for future data analytics.

These basic metrics can help guide more accurate decisions, especially when combined with a FinOps tool for cost modeling different options. The need to manage increasing volumes of unstructured data across multiple technologies and environments, for many different purposes, is leading to data-centric rather than storage-centric decision-making across IT infrastructure.
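
As a rough illustration of the kind of metrics involved, the short script below walks a file tree and estimates the share of capacity that has gone cold. It is a minimal sketch under stated assumptions, not a product feature: the 90-day threshold and the mount point are arbitrary choices for the example, and it relies on access times, which many filesystems disable (the noatime mount option), in which case modification time is the only usable signal.

```python
import os
import time

COLD_AFTER_DAYS = 90  # assumption: files untouched this long count as cold

def cold_data_share(root):
    """Walk a file tree and return the fraction of bytes that is cold."""
    now = time.time()
    total_bytes = cold_bytes = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            try:
                st = os.stat(os.path.join(dirpath, name))
            except OSError:
                continue  # skip files that vanish or deny access mid-scan
            total_bytes += st.st_size
            # Days since last access; fall back to st_mtime on noatime mounts.
            age_days = (now - st.st_atime) / 86400
            if age_days > COLD_AFTER_DAYS:
                cold_bytes += st.st_size
    return cold_bytes / total_bytes if total_bytes else 0.0

if __name__ == "__main__":
    print(f"Cold data share: {cold_data_share('/mnt/shared'):.0%}")  # hypothetical path
```

At petabyte scale you would lean on the storage system's own analytics or a data management product rather than a crawl like this, but the metrics being gathered are the same.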

Mistake 2: Choosing One-Size-Fits-All Storage

Storage solutions come in many shapes and forms – from cloud object storage to all-flash NAS, scale-out on-prem systems, SAN arrays and beyond. Each type offers different tradeoffs in cost, performance and security.

As a result, different workloads are best supported by different types of storage. An on-prem app that processes sensitive data might be easier to secure using on-prem storage, for instance, while an app with highly unpredictable storage requirements might be better suited by cloud-based storage that can scale quickly.

This again points to the need to analyze, segment and understand your data. The ability to search across data assets for file types or metadata tags can identify data and better inform its management. Avoid the one-size-fits-all approach by provisioning multiple types of storage solutions that reflect your different needs.

Also, less than 25% of data costs are in storage: the bulk of the costs are in the ongoing backup, disaster recovery and protection of the data. So, consider the right storage type and tier as well as the appropriate data protection mechanisms through the lifecycle of data.
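
A back-of-the-envelope model shows why protection dominates the bill. All of the prices and copy counts below are illustrative assumptions, not vendor quotes: three backup copies plus one DR replica of active data, versus a single cheap archived copy of cold data.

```python
# Illustrative monthly cost model; every figure here is an assumption.
HOT_PER_TB = 30.0      # $/TB/month, primary tier (assumed)
ARCHIVE_PER_TB = 4.0   # $/TB/month, archive tier (assumed)
PROTECTION_COPIES = 4  # assumed: 3 backup copies + 1 DR replica of hot data

def monthly_cost(total_tb, cold_share):
    hot_tb = total_tb * (1 - cold_share)
    cold_tb = total_tb * cold_share
    return {
        "primary": hot_tb * HOT_PER_TB,
        "protection": hot_tb * PROTECTION_COPIES * HOT_PER_TB,
        "archive": cold_tb * ARCHIVE_PER_TB,
    }

costs = monthly_cost(total_tb=1000, cold_share=0.8)  # 1 PB, 80% cold
total = sum(costs.values())
for tier, dollars in costs.items():
    print(f"{tier:>10}: ${dollars:>9,.0f} ({dollars / total:.0%})")
```

With these assumptions, primary storage is roughly 18 percent of monthly spend, in line with the sub-25% figure above; archiving cold data rather than backing it up is what keeps the protection line under control.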

For more information, also see: Best Data Analytics Tools

Mistake 3: Becoming Locked into One Vendor

Acquiring all your storage from one vendor may be the simplest approach, but it’s almost never the most cost-effective or flexible.

You can likely build more cost-effective storage infrastructure if you select from the offerings of multiple vendors. Doing so also helps protect you against risks like a vendor’s decision to raise its prices substantially or to discontinue a storage product you depend on.

If you have other vendors in the mix, you can pivot more easily when unexpected changes occur. Using a data management solution that is independent of any storage technology is another way to prevent vendor lock-in, by ensuring that you can move data from platform to platform without the need to rehydrate it first.

Mistake 4: Moving Too Fast

A sense of urgency tends to accompany any major IT migration or update, storage refreshes included. Yet, while it’s good to move as efficiently as you can, it’s a mistake to move so fast that you don’t fully prepare for the major changes that accompany a storage refresh.

Instead, take time to collect the data you need to identify the greatest pain points in your current storage strategy and determine which changes to your storage solutions will deliver the greatest business benefits. Be sure, too, to collect the metrics you need to make informed decisions about how to improve your data management capabilities.

Mistake 5: Ignoring Future Storage Needs

You can't predict the future, but you can prepare for it by anticipating which new requirements your storage solutions may need to support. Trends like AI, sustainability and the growing adoption of data services mean that the storage needs of the typical business are likely to change in the coming year.

To train AI models, for example, you may need storage that can stream data more quickly than traditional solutions. Likewise, implementing data services in order to support FinOps goals might mean finding ways to consolidate and share storage solutions more efficiently across different business units.

Conclusion: The Importance of a Storage Refresh

As organizations move from storage-centric to data-centric management, IT and storage architects will need to change the way they evaluate and procure new storage technologies.

The ability to analyze data to make nuanced versus one-size-fits-all storage decisions will help IT organizations navigate many changes ahead – be they cloud, edge, AI or something else still on the horizon.

Read next: What is Data Visualization

About the author:

Krishna Subramanian is COO, President & Cofounder of Komprise.


IBM's Diamondback Tape Library Focuses on Security for Hyperscalers
https://www.eweek.com/storage/ibms-diamondback-tape-library-focuses-on-security-for-hyperscalers/ (October 28, 2022)

Data storage innovation often gets short shrift in digital transformation discussions where it is simpler to focus on the advancements of silicon, chipset and system solutions. But the fact is that improvements in storage capabilities like capacity and data read/write speeds are comparable to – or even greater than – what compute performance has achieved.

Those and other issues make IBM’s recent introduction of its new Diamondback Tape Library both timely and intriguing.

Also see: Best Data Analytics Tools 

IBM’s Diamondback Tape Library

How does IBM’s new tape offering address these points? The company describes the Diamondback Tape Library as “a high-density archival storage solution that is physically air-gapped to help protect against ransomware and other cyber threats in hybrid cloud environments.”

The new solution was designed in consultation with more than 100 hyperscalers, including "new wave" organizations and the Big Five hyperscalers.

IBM notes that Diamondback is designed to provide hyperscalers the means to securely store hundreds of petabytes of data, including long-term archival storage with a significantly smaller carbon footprint and lower total cost of ownership than disk and flash solutions. According to the company, IBM Tape solutions are approximately one quarter the total cost of both spinning disk storage and public cloud archival services.

Individual IBM Diamondback Tape Libraries fit in the same 8 square feet of floor space as an Open Compute rack (a 42U, 19-inch rack). Systems can be ordered fully loaded with LTO-9 tape cartridges and are fully compatible with IBM Ultrium 9 tape drives, which can increase total capacity by up to 50 percent compared to IBM Ultrium 8 technology.

Systems can be deployed in less than 30 minutes and individual libraries can support up to 27 PB of raw data or 69.5 PB compressed data. Customers can also store exabytes of data across multiple Diamondback tape libraries using erasure code software available from IBM and as open source.
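
Those capacity figures can be sanity-checked against public LTO-9 specifications of 18 TB native and 45 TB compressed per cartridge (the standard 2.5:1 compression assumption). The cartridge count below is inferred from the quoted totals, not an IBM-published number.

```python
LTO9_NATIVE_TB = 18      # LTO-9 native capacity per cartridge
LTO9_COMPRESSED_TB = 45  # assumes the standard 2.5:1 compression ratio

# Infer the cartridge count from each quoted library total:
from_raw = 27_000 / LTO9_NATIVE_TB              # 27 PB raw -> ~1,500 cartridges
from_compressed = 69_500 / LTO9_COMPRESSED_TB   # 69.5 PB compressed -> ~1,544
print(round(from_raw), round(from_compressed))
```

Both estimates converge on roughly 1,500 LTO-9 cartridges per library; the small gap comes from rounding in the quoted petabyte totals.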

Like all IBM storage solutions, Diamondback Tape Libraries support IBM Spectrum storage applications, including IBM Spectrum Archive, and can also be equipped with data encryption and write-once-read-many (WORM) media for advanced security and regulatory compliance. IBM Services are available for deployment, management and support. IBM Diamondback Tape Libraries are generally available for purchase now.

Also see: IBM Storage: Squeezing Enterprise Value into Smaller Form Factors

Final Analysis

Storage media solutions from punch cards to solid state drives (SSDs) all have had their time in the sun, often simultaneously. Outside of specialized use cases, most earlier storage media technologies like punch cards, floppy drives and optical storage have largely fallen out of favor for business storage.

However, enterprise tape solutions, including tape drives, libraries and media have remained a steady and profitable business for well over half a century.

Why is that the case? Primarily because of continuing development and innovations by tape vendors, including IBM, FujiFilm and Sony. But it can also be argued that the flexibility and adaptability of tape storage systems and media have enabled vendors to craft highly effective tape solutions for emerging businesses and use cases.

IBM’s new Diamondback Tape Library is an excellent example of that process. The company has a long history of storage innovations, and robust, massively scalable tape storage has played a central role in IBM’s mainframe business for decades. IBM also has deep expertise in a wide range of enterprise computing processes and understands the business and technological needs of enterprise clients in ways that few vendors can match.

In other words, designing and building a tape storage solution powerful and capacious enough for organizations that regularly store, manage and access data in petabyte and exabyte volumes is hardly a stretch for IBM given its data storage experience and continuing R&D.

It is worth noting that the Diamondback Tape Library will also complement and benefit from the company’s other storage solutions and initiatives, from the IBM TS7700 Virtual Tape Library to the recent announcement that Red Hat’s storage portfolio and teams will transition to IBM Storage.

Overall, IBM’s Diamondback Tape Library qualifies as an example of what the company does best—create and supply new offerings that meet the often-daunting needs of traditional and emerging enterprises, including traditional and “new wave” hyperscalers.

Also see: IBM Storage Announces Cyber Vault, FlashSystem and SVC Solutions

Cloudian CMO Jon Toor on Trends in Object Storage
https://www.eweek.com/storage/object-storage-trends/ (August 25, 2022)

I spoke with Jon Toor, CMO of Cloudian, about the key factors driving object storage, and why this storage medium is ideal for cloud deployments.

  • As you survey the object storage market, what trends are driving the market this year? I know the market is growing about 15-20 percent a year.
  • How does object storage contrast with the other leading storage types? When should companies not use object? For what use cases should they use it?
  • How is Cloudian addressing the needs of its clients? What's the Cloudian advantage in terms of data management?
  • The future of object storage? Do you expect it to keep pace with the growth of other storage methods?

Listen to the podcast:

Also available on Apple Podcasts

Watch the video:

IBM Storage Announces Cyber Vault, FlashSystem and SVC Solutions
https://www.eweek.com/storage/ibm-storage-announces-cyber-vault-flashsystem-and-svc-solutions/ (February 25, 2022)

Modern IT is undergoing a massive transformation, particularly in the realm of data storage. Adding more cybersecurity features and upgrading performance are all important moves for enterprise storage vendors like IBM, especially for their current customer base.

The recent additions IBM announced to its storage portfolio should address top-of-mind issues for many in IT. Let's take a look.

Also see: How Database Virtualization Helps Migrate a Data Warehouse to the Cloud

IBM Cyber Vault for FlashSystem

IBM Cyber Vault is a new offering that uses IBM FlashSystem Safeguarded Copies to provide validation and verification of copy data, so IT can confirm that the data is clean and usable. Safeguarded copies are logically air-gapped snapshots of FlashSystem primary storage, providing immutable, incorruptible data copies.

IBM has a number of offerings in the cyber resilience market, including its Cyber Resilience Assessment professional service and the QRadar and Guardium software solutions for monitoring data threats from systems and humans. Cyber Vault rounds out the portfolio with validation and verification of data.

Cyber Vault is a blueprinted solution from IBM Labs that takes FlashSystem Safeguarded copies and uses them in a secure VM to provide analysis, scanning and test/validation, as well as potentially forensic and diagnostic services for Safeguarded data.

FlashSystem Safeguarded copies are first copied to a secure Cyber Vault virtual machine environment. There, IT can verify and validate the data with whatever tests seem pertinent. IT then knows whether its primary storage (as of the Safeguarded copy) is safe to use for recovery from a cyberattack.
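
What "verify and validate" means in practice depends on the workload, but conceptually it is a scripted gauntlet run against the mounted copy. The sketch below is purely illustrative and is not IBM Cyber Vault tooling: the mount path is hypothetical, and it assumes ClamAV and a previously generated checksum manifest are available inside the vault VM.

```python
import subprocess

SNAPSHOT_MOUNT = "/cybervault/safeguarded-copy"  # hypothetical mount point

def check(name, cmd):
    """Run one validation command against the mounted copy; report pass/fail."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(("PASS" if result.returncode == 0 else "FAIL") + f"  {name}")
    return result.returncode == 0

CHECKS = [
    # Recursive malware scan; clamscan exits non-zero if anything is infected.
    ("malware scan", ["clamscan", "-r", "--infected", SNAPSHOT_MOUNT]),
    # Verify file contents against a checksum manifest captured at copy time.
    ("checksum manifest", ["sha256sum", "-c", f"{SNAPSHOT_MOUNT}/manifest.sha256"]),
]

results = [check(name, cmd) for name, cmd in CHECKS]  # run every check
if all(results):
    print("Snapshot validated: safe candidate for recovery")
else:
    print("Snapshot failed validation: fall back to an earlier Safeguarded copy")
```

Real-world gauntlets would add application-level tests, such as mounting the database and running consistency checks, before declaring a copy recoverable.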

Cyber Vault could also be used at a remote disaster recovery site with replicated FlashSystem storage. And because IBM supports Spectrum Virtualize targets on Azure, the whole process could be run on the Microsoft Azure cloud.

Cyber Vault was already offered on mainframe systems; now the service is available for open environments using FlashSystem Safeguarded copies.

Also see: What is Data Visualization

IBM FlashSystem Storage Upgrades

IBM has also released new FlashSystem 9500 and 7300 storage systems. These include:

  • Faster processors – four 24-core Intel Ice Lake CPUs for the 9500 and four 10-core Cascade Lake CPUs for the 7300 system.
  • New PCIe support – Gen 4 for the 9500 and Gen 3 for the 7300 system.
  • Larger capacities – 4.5PBe (PBe is effective capacity after data reduction) in 4U for the 9500 and 2.2PBe in 4U for the 7300 system.
  • New Gen3 FlashCore Module (FCM) – from 4.8TB to 38.4TB in a single module and roughly 70-microsecond latency.

All this means lower-latency storage access, more storage bandwidth and, overall, 25-50% faster storage performance than the prior generation. The FlashSystem 9500 also offers up to 48 32Gb Fibre Channel ports and is 64Gb-FC ready with new cards. The new FlashSystems deliver up to 2X faster read throughput for AI and in-memory database workloads, up to 50% more transactions per second for Oracle processing, and 4X better performance on VMware Horizon activity.

IBM also updated the SAN Volume Controller (SVC) appliance with two 24-core Intel Ice Lake CPUs to add more storage virtualization performance to SVC clusters.

Also see: IBM Extends “Tailored Fit” Pricing to Z Hardware

A Boost for Cybersecurity

One can see how IBM's announcements incrementally improve and build upon past success, at least for cybersecurity. And performance is a major competitive arena among all storage vendors, one no business can afford to ignore for long. Again, FlashSystem 7300 and 9500 take all this to the next level.

Despite recent quarterly progress, IBM's storage business has struggled over the past few years. FlashSystem and SVC are not the only solutions in IBM's storage portfolio, and all have a role to play in altering that trajectory. And the recent news is just the first of four quarterly announcements for IBM's storage business.

We'd very much like to see IBM do more to address some of the other enterprise concerns. For example, the multi-cloud and how to get there. To many, this means Kubernetes, containerization and apps that run anywhere it makes the most sense: in the cloud, on-prem, or on the other side of the world.

Furthermore, on the horizon are all the new AI and applied data solutions moving into the enterprise. How to become the major storage supplier for these new applications needs to be on every storage vendor’s mind.

We look forward to Q2 and beyond to see what IBM will announce to raise the bar on these and the other major issues facing IT today.

Also see: Tech Predictions for 2022: Cloud, Data, Cybersecurity, AI and More

About the Author: 

Ray Lucchesi, President, Silverton Consulting

Seagate's Jeff Fochtman on Data Storage Strategies to Maximize Data Value
https://www.eweek.com/news/seagates-jeff-fochtman-data-storage-maximize-data-value/ (November 2, 2021)

I spoke with Jeff Fochtman, Senior VP, Business and Marketing, at Seagate, about the many challenges of handling data in a distributed world – and how data can offer great value despite this challenge.

Among the topics we discussed: 

  • We live in a world where moving data is paramount. With the dramatic growth in both edge and multicloud deployments, data must move between them quickly. What are some of the challenges here?
  • Despite the challenges, how can companies get more value from data in distributed environments?
  • What kinds of storage strategies can reduce impediments to the movement of large data sets?
  • What do you see as the future of storage in a distributed world? Certainly IT will only get more distributed in the years ahead.

Listen to the podcast:

Watch the video:

How Database Virtualization Helps Migrate a Data Warehouse to the Cloud
https://www.eweek.com/database/how-database-virtualization-helps-migrate-a-data-warehouse-to-the-cloud/ (October 14, 2021)

Database migrations are some of the most dreaded initiatives in IT. Bring up the subject of migrations with any IT executive and one gets a strong visceral reaction. Too many careers have been wrecked by failed migrations. Clearly, conventional techniques using code conversion just don’t work.

Especially when it comes to migrating an enterprise data warehouse, horror stories abound. Failed migrations that collapsed after three years of hard work are quite common. Migration projects that cost over $20 million before they fell apart are the norm. But ask any IT leader off-the-record and you might learn about much costlier disasters.

As enterprises move to the cloud, modernizing on-prem data warehouses to cloud-native technology is a top priority for every IT executive. So, what is an IT leader to do? How can enterprises avoid migration disasters when moving legacy data warehouses to the cloud?

Over the past year, several vendors have brought to market the concept of database virtualization. The principle is quite simple: a virtualization platform lets existing applications run natively on the cloud data warehouse, with little or no change to their SQL. So, how does database virtualization take the sting out of database migrations?

What is database virtualization?

Think of database virtualization as hypervisor technology for database queries. The database virtualization platform sits between applications and the new destination data warehouse. Like any virtualization technology, it decouples two otherwise tightly bonded components. In this case, database systems and applications are abstracted from each other.

The database virtualization platform translates queries and other database statements in real-time. In effect, database virtualization makes a cloud data warehouse like Azure Synapse behave exactly like a Teradata or Oracle system. This is quite different from data virtualization. Data virtualization implements a new SQL dialect and requires all applications to be rewritten to this dialect first.

Compared to static code conversion, database virtualization is significantly more powerful. It can emulate complex constructs or data types. Even elements for which there is no equivalent in the new destination database can be emulated in real-time.

Applications originally written for one specific database can now run on a different database without having to change SQL. Instead of static code conversion with all its risks and challenges, database virtualization preserves existing applications. Instead of months of rewriting application logic, database virtualization makes the new database instantly interoperable with existing applications.
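
A toy example makes the idea concrete. A real virtualization platform intercepts the database wire protocol and translates parsed SQL with full semantic emulation in real time; the fragment below merely rewrites two Teradata-isms into an ANSI/Synapse-style form with regular expressions, purely to show where such a layer sits between application and warehouse.

```python
import re

# Two toy Teradata rewrites; a real platform parses and emulates SQL,
# it does not pattern-match strings. Illustration only.
REWRITES = [
    # Teradata's SEL shorthand for SELECT.
    (re.compile(r"^\s*SEL\b", re.IGNORECASE), "SELECT"),
    # ADD_MONTHS(date, n) -> DATEADD(month, n, date).
    (re.compile(r"\bADD_MONTHS\(\s*([^,]+?)\s*,\s*([^)]+?)\s*\)", re.IGNORECASE),
     r"DATEADD(month, \2, \1)"),
]

def translate(query):
    """Rewrite a legacy-dialect statement before forwarding it on."""
    for pattern, replacement in REWRITES:
        query = pattern.sub(replacement, query)
    return query

legacy = "SEL customer_id, ADD_MONTHS(signup_date, 6) FROM accounts"
print(translate(legacy))
# -> SELECT customer_id, DATEADD(month, 6, signup_date) FROM accounts
```

The hard part, as the article notes, is emulating constructs with no direct equivalent in the destination system, which is where real-time emulation beats static code conversion.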

Virtualization separates migrating to cloud from application modernization

Database migrations typically fail because of an explosive increase in scope while the project is afoot. It starts as a simple mission where making existing applications work with a new database is the goal. However, once it becomes clear how extensive the rewrites will be, the scope of the project often changes.

Stakeholders may view the operation as a unique opportunity to modernize their application. If the application needs a significant rewrite, they argue, why not make other important changes too? What started as a supposedly minimally invasive operation now turns into full-on open-heart surgery.

In contrast, database virtualization lets applications move as-is. All changes are kept to the absolute minimum. In practice, the extent of changes to applications is around 1%. With cloud data warehouse technology evolving rapidly, we expect the need for even those changes will be further reduced in the future.

Database virtualization changes the above dynamics quite significantly: move first, modernize applications afterward—and only if needed. Once the enterprise is cloud-native, a few select applications may be candidates for modernization. Separating move and modernization is critical to controlling the risk.

Virtualization overcomes the dreaded 80/20-nature of migration

No other IT problem is so often underestimated. We attribute the error in judgment primarily to the fact that it is an incredibly rare operation. Most IT leaders have never planned, let alone executed, a major database migration. If they could help it, they made database migrations their successor's problem.

Once a rewrite project is underway, the initial progress can be exhilarating. Within just a few weeks, the "easy" SQL snippets are converted rapidly. Many need just a few keyword substitutions and similarly trivial changes. In true 80/20 fashion, the first 80% takes very little time and almost no budget. Then comes the last 20%. This is where the hard problems are—and disaster strikes.

In contrast, database virtualization does not distinguish levels of perceived difficulty. Instead, progress is made uniformly. This is not to say there are no challenges to tackle. However, compared to code conversion, the effort needed to overcome those is typically an order of magnitude smaller.

Virtualization mitigates risks

Conventional migration is a high-risk undertaking. As we’ve seen, it starts with underestimating the effort needed. The limitations of rewrite-based approaches are impossible to know up front, despite best efforts. Yet, IT leaders are often asked to put their careers on the line in these projects.

Importantly, with rewrite-based approaches, IT is shouldering the responsibility mostly alone. They are tasked to complete the migration, and then the business gets to judge the outcome.

Compare this to database virtualization. From the get-go, applications can be tested side by side. IT can sign up business units early on to test-drive their entire function using their existing tools and processes. Database virtualization promises to relieve IT of the risk of implementing something the business cannot use once complete.

On top of that, database virtualization comes with one rather obvious mechanism of risk mitigation: until the old system is decommissioned, the organization can always move back to the old stack. Reverting requires no special effort, since all applications have been preserved in their original functionality.

Replatform IT to the public cloud

Major enterprises are about to replatform their IT to public cloud. However, so far only a fraction of on-prem systems and processes have been moved. The specter of database migrations is holding enterprises back as all critical workloads are tightly connected to database systems.

Database virtualization is therefore a powerful paradigm for IT leaders who are considering a database migration. While still a young discipline, database virtualization has proven its mettle with notable Global 2000 clients already. So far, its proof points are limited to enterprise data warehousing. However, little imagination is required to see how this technology could apply to operational databases as well.

Database virtualization should be viewed as a critical arrow in the IT leader's quiver for attacking migration challenges whenever an efficient way to migrate data to the cloud is called for.

About the Author:

Mike Waas, Founder and CEO, Datometry

#eWEEKchat October 12: DataOps and the Future of Data Management
https://www.eweek.com/big-data-and-analytics/eweekchat-october-12-dataops-and-the-future-of-data-management/ (October 1, 2021)

On Tuesday, October 12, at 11 AM PT, @eWEEKNews will host its monthly #eWEEKChat. The topic will be “DataOps and the Future of Data Management,” and it will be moderated by James Maguire, eWEEK’s Editor-in-Chief.

We’ll discuss – using Twitter – important trends in DataOps, including market trends, key advantages, best practices, overcoming challenges, and the ongoing evolution of data management in today’s IT sector. DataOps is a “new-ish” idea, yet it’s an important emerging technology.

How to Participate: On Twitter, use the hashtag #eWEEKChat to follow/participate in the discussion. But it’s easier and more efficient to use the real-time chat room link at CrowdChat.

Instructions are on the DataOps Crowdchat page: log in at the top right, use your Twitter handle to register. The chat begins promptly at 11 AM PT. The page will come alive at that time with the real-time discussion. You can join in or simply watch the discussion as it is created.

Special Guests, DataOps and the Future of Data Management

The list of data storage experts in this month’s Tweetchat currently includes the following – please check back for additional expert guests:

Chat room real-time link: Go to the Crowdchat page. Sign in with your Twitter handle and use #eweekchat for the identifier.

The questions we'll tweet about will include the following – check back for more/revised questions:

  1. DataOps is still a new-ish term — how do you briefly define it?
  2. Do you think that DataOps is a mainstream approach in today’s enterprise?
  3. Why is DataOps important in today’s data-intensive world?
  4. What’s DataOps’s greatest challenge: Cohesion between the teams? Process efficiency?  Diversity of technologies?
  5. Apart from the challenges listed above, is DataOps’s greatest challenge human or technological?
  6. Can a company "buy" DataOps or is it simply a process to implement? Will it adopt a SaaS model?
  7. So many vendors claim to do DataOps – and they approach it differently. Is the concept losing clarity?
  8. What industries do you see benefitting the most from DataOps?
  9. What do you see as a core best practice for DataOps?
  10. Any predictions for the future of DataOps?

Go here for CrowdChat information.

#eWEEKchat Tentative Schedule for 2021*

Jan. 12: What’s Up in Next-Gen Data Security
Feb. 9: Why Data Orchestration is Fast Replacing Batch Processing
March 9: What's Next-Gen in Health-Care IT?
April 13: The Home as Enterprise Branch
May 11: Next-Gen Networking Products & Services
June 8: Challenges in AI
July 15: VDI and Enabling Hybrid Work
Aug. 17: DevOps & Agile Development
Sept. 14: Trends in Data Storage, Protection and Privacy
Oct. 12: DataOps and the Future of Data Management
Nov. 9: New Tech to Expect for 2022
Dec. 14: Predixions and Wild Guesses for IT in 2022

*all topics subject to change

Top Cloud Service Providers & Companies
https://www.eweek.com/cloud/cloud-service-providers/ (September 23, 2021)

Cloud computing providers play a foundational role for businesses. Virtually every enterprise uses cloud computing in some manner, whether it’s to deliver key infrastructure and services, host applications or a content delivery network (CDN), or handle machine learning and software development.

The convenience and economics of cloud providers make them increasingly appealing. Cloud deployments:

  • Are typically fast and easy to provision.
  • Deliver enormous flexibility and are always on.
  • Boost speed and performance.
  • Move organizations away from a cash-intensive CAPEX model toward a more budget-friendly OPEX framework.

Cloud frameworks also support an array of emerging digital technologies: mobility, artificial intelligence (AI) and machine learning (ML), and the Internet of Things (IoT). As networks expand, the edge becomes more important and the need to manage data in an agile way grows; here, clouds deliver evolutionary and sometimes revolutionary gains.

How Big is the Cloud Services Market?

According to Gartner, the worldwide public cloud services market grew 40.7% to $64.3 billion in 2020. “Hyperscale providers are continuing to build distributed cloud and edge solutions that extend the public cloud’s reach into private and on-premises locations, addressing the needs of organizations relating to data sovereignty, workload portability and network latency,” noted Sid Nag, research vice president at Gartner.

How Do You Select the Best Cloud Provider?

With so many vendors and choices, selecting a cloud computing provider can prove daunting. There are numerous factors to consider, including the type of services to use; how to integrate Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and/or Software-as-a-Service (SaaS); and whether to use public or hybrid clouds – or both.

Understanding cloud features and options, performance, availability, pricing, security and compatibility with other software and systems is also vital. Your organization may need to use a more complex multi-cloud framework.

Here are some key things to focus on if you’re in the market for cloud infrastructure and platform software:

  • What does your organization require? Different cloud platforms are better suited to different organizational requirements—and geographic footprints vary greatly. Similarly, different platforms focus on different services. A few of the factors to consider include your dependency on legacy systems currently in place, the type of cloud services needed, the applications and services they support, security and compliance requirements, and overall scalability and flexibility.
  • What does the pricing model look like? Not surprisingly, pricing varies greatly among vendors. A cloud provider may be a more economical fit, based on the specific needs of your organization. However, it’s critical to look beyond the basic price tag and understand the total cost of integrating the cloud into existing services and applications, and what overall value it delivers.
  • What is the vendor's commitment to performance and availability? All cloud providers promise high availability, but not all availability is the same. You may need four nines (99.99 percent) or up to five nines (99.999 percent), depending on the use case and the level of resiliency required (see the downtime sketch after this list). No less important: make sure you understand the service level agreement (SLA) before signing on the dotted line. How does it ensure availability after a natural disaster or system failure? What will the vendor do if it fails to meet promised standards?
  • What is the vendor’s commitment to security? This includes several critical areas: physical protections for data centers, cybersecurity standards and protections the vendor has in place, accreditations, compliance, SLAs, legal protections and compensation if something goes astray. In addition, it’s essential to know how and where data is stored.
  • What is the vendor's roadmap and what is its commitment to support? Although a vendor may offer the products and services you need today, there's no sure bet that this will be the case next year. You can reduce the turbulence of any future changes by understanding the vendor's roadmap and its commitment to support, as well as how easy it is to change cloud providers if that becomes necessary.
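
To put those availability tiers in concrete terms, translate each one into allowed downtime per year. Nothing is assumed here beyond arithmetic:

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

for nines in (2, 3, 4, 5):
    availability = 1 - 10 ** -nines  # e.g., 4 nines -> 0.9999
    downtime_min = (1 - availability) * MINUTES_PER_YEAR
    print(f"{nines} nines ({100 * availability:.3f}%): "
          f"{downtime_min:,.1f} minutes of downtime per year")
```

Four nines allows under an hour of downtime a year and five nines only about five minutes, which is why the SLA fine print on measurement windows and remedies matters.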

Leading Cloud Providers

Amazon Web Services (AWS)

Amazon Web Services is generally viewed as the number one global vendor for cloud services. It offers more than 200 IaaS, PaaS and SaaS cloud services, including public and hybrid offerings. These include high performance computing, edge compute, e-commerce, containers, Internet of Things, machine learning, virtual reality/augmented reality, and serverless compute. AWS has a presence in 245 countries, with 81 availability zones.

Pros

  • Extensive global infrastructure.
  • Simple sign-up process with fast deployments.
  • Offers an extensive collection of APIs, tools and resources.
  • Broad partner network.
  • Essentially limitless capacity with high scalability and flexibility.
  • Provides centralized and flexible billing for numerous cloud services.

Cons

  • Can present challenges for organizations looking to keep some services on premises.
  • Can become expensive as services accumulate.
  • No trial to test AWS and various components prior to signing up.
  • Some users complain that customer support and documentation are sometimes lacking.
  • Some users find the interface difficult.

Microsoft Azure

A top contender in the cloud market, the Microsoft Azure cloud platform delivers more than 200 services and features, including compute, storage, containers, blockchain, IoT, and AI/ML in public, hybrid and multi-cloud environments. It supports Kubernetes, virtual desktops and numerous open-source resources. Microsoft operates datacenters in more than 200 locations around the world, including 10 in the US.

Pros

  • Broad global footprint of datacenters.
  • Offers an extensive array of cloud services and applications.
  • Powerful management portal.
  • Highly scalable and flexible.
  • Users give the user interface high marks.

Cons

  • Some users complain that they are unable to customize apps and features as much as they would like.
  • Certain features are complicated and difficult to set up and use compared to other cloud providers.
  • Some user complaints about slow and sometimes inadequate customer support.
  • Pricing and contract flexibility can be difficult.

Google Cloud

Considered the third vendor in the "top three" (along with AWS and Azure), Google Cloud is available globally. It delivers cloud CDN, storage, Kubernetes, streaming analytics, AI/ML, IoT, application modernization, infrastructure modernization, analytics, security and much more. AI and analytics are particularly strong offerings. Google Cloud delivers industry-specific solutions for retail, healthcare, media and entertainment, financial services and others. Google operates in more than 200 countries across 27 core regions.

Pros

  • Offers an extensive global infrastructure.
  • Vast portfolio of cloud solutions and products.
  • Excellent interface.
  • Offers an extensive set of APIs and developer tools.
  • Broad partner network.
  • Offers low-carbon infrastructure options.

Cons

  • Migrations to and from Google Cloud can at times be difficult.
  • The vendor’s extensive catalog of service offerings can overlap and be confusing.
  • Some users find the pricing model inflexible and complain that Google Cloud is too expensive.
  • Users report that customer support and documentation are sometimes subpar.
  • Can be difficult to integrate with other cloud services and some on-premises systems.

IBM Cloud

With an extensive array of offerings and features, IBM is among the leaders in the cloud space. It offers public, multi-cloud and hybrid clouds designed to tackle an array of functions, including storage, networking, AI/ML, analytics, automation, blockchain, compute, containers, security, the IoT and even quantum computing. IBM has 60 data centers operating on five continents.

Pros

  • Offers more than 170 products that run in the cloud.
  • Extensive partner network.
  • Highly refined industry-specific applications and services.
  • Extensive set of APIs.
  • Powerful security and compliance features.

Cons

  • Some users give IBM low marks for the user interface and overall usability.
  • Reports of stability and compatibility problems.
  • Users report some confusion over licensing arrangements.
  • Can be pricey.
  • Some complaints about slow and inadequate customer support.

Oracle Cloud Infrastructure

Oracle Cloud offers a broad array of integrated public cloud services and applications, including IaaS, PaaS and SaaS, along with on-premises cloud capabilities. This includes compute, storage, networking, analytics, application development, content management and security. In addition, Oracle focuses on services designed specifically for tasks such as ERP, EPM, SCM, marketing and sales. Oracle supports Kubernetes, AI/ML, IoT and other digital technologies. It operates data centers on six continents.

Pros

  • Large global footprint, with a dominant legacy in the database market.
  • Highly scalable and flexible. Supports diverse workloads.
  • Operates separate commercial and government datacenters globally.
  • Strong focus on hybrid architectures.
  • Strong support for advanced features such as AI and ML.
  • Offers free training and certification programs.

Cons

  • The focus is heavily on large enterprise and government agencies.
  • Some users report that the platform is challenging to set up and use.
  • Can be expensive. Users report that pricing and contract flexibility can be rigid.
  • Some users object to vendor lock-in issues and the lack of integration with third-party services, tools and reporting.

Dell Technologies

Dell, leveraging a brand name known for both consumer and B2B solutions, offers more than a half dozen major cloud solutions, including public, private and multi-cloud data service options. These services integrate compute, storage, networking and virtualization resources. Dell has over 400 cloud partners and offers solutions specifically designed for Google Cloud and VMware.

Pros

  • Robust ecosystem of cloud solutions.
  • Strong support for multi-cloud environments.
  • Flexible framework, including the ability to manage public cloud features directly from an enterprise datacenter with its APEX Cloud Services.
  • Large portfolio of services available through APEX.
  • Strong support for advanced capabilities, including containers, AI and ML.

Cons

  • Some complaints about difficult-to-use interfaces.
  • Best suited to companies using Dell products and solutions.
  • Potential vendor lock-in issues.
  • Users say that Dell’s support portal can be difficult to use.

Hewlett Packard Enterprise (HPE)

HPE’s GreenLake edge-to-cloud platform transforms IT into a service consumed on demand. It’s designed around a flexible pay-per-use managed model that’s highly scalable. The cloud platform is designed to support application modernization and data transformation. It supports a broad ecosystem of tools and technologies, including private clouds, virtual machines, containers, AI, machine learning and IoT.

Pros

  • Completely preconfigured environment with customizations available.
  • Highly flexible framework that supports a wide array of compute, storage and networking configurations, including AWS, Azure and Google Cloud.
  • Delivers leading-edge security features, including Silicon Root of Trust, which builds protection in at the firmware/boot level.
  • Consumption-based pricing model.

Cons

  • Reliance on a single vendor for IT services and potential vendor lock-in issues.
  • API framework can be challenging to integrate.
  • Users rate the quality of end-user training lower than competitors.
  • Some find the pricing model too inflexible.

VMware Cloud on AWS

The pioneer in virtualization technology offers VMware Cloud on AWS. The partnership lets organizations deploy hybrid vSphere cloud workloads through the Amazon Web Services platform. This enterprise-class Software-Defined Data Center software framework powers virtual desktop infrastructure (VDI) and allows businesses to run secure cloud services at scale on premises or in the cloud.

Pros

  • Allows organizations to integrate public cloud infrastructure with on-premises systems.
  • Integrates with more than 165 AWS services.
  • Supports bi-directional application migration with no downtime.
  • High ratings from users for a powerful but intuitive interface and features, including common APIs and tools.
  • Highly scalable and flexible.

Cons

  • Can be expensive.
  • Availability of third-party resources is sometimes limited.
  • Can’t virtualize all types of workloads.
  • In some cases, a direct migration to a public cloud may prove simpler and less expensive.

Tencent

A major player in the gaming industry, China-based Tencent delivers compute, storage, database, CDN, IoT, networking and many other services and applications. These are aimed primarily at video, gaming, online education, websites and other computing frameworks. It supports cloud virtual machines, batch compute, Kubernetes and GPU cloud computing. Tencent has datacenters located across 27 regions on six continents, including 1,100 cache nodes distributed in China.

Pros

  • High user ratings for reliability, scalability and flexibility.
  • Offers a high level of security.
  • Ease of deployment rated high among users.
  • Highly rated speed and performance.
  • Excellent customer service and support.

Cons

  • Services and resources are heavily tilted toward Asia. This may present challenges for some organizations.
  • Can be challenging to learn and use the interface.
  • Price structure is better suited to larger organizations.
  • Availability of third-party resources is sometimes limited.

Alibaba Cloud

A leader in the China market, this fast-growing cloud provider operates in more than 70 countries across North America, Europe, Asia, Australia and the Middle East, with over 500 international nodes (plus 2,300 in China). Alibaba delivers a wide array of cloud services, including elastic compute, ECS bare metal, elastic GPU, dedicated hosting, container service (Kubernetes) and resource orchestration.

Pros

  • Offers a broad array of cloud services and features.
  • High user ratings for reliability.
  • Delivers strong security and compliance features.
  • Powerful enterprise integration capabilities.
  • Users report that the services are easy to manage.
  • May be less expensive than other providers. 

Cons

  • Lacks flexibility for running some services and applications, particularly for edge configurations.
  • Users say that the initial configuration process can be complex and confusing.
  • Some users report that the company’s customer service and documentation are subpar.

Huawei

Huawei Cloud International offers a global cloud footprint with more than 180 cloud services in functional areas such as financial services, e-commerce, CRM, project management, IoT and business intelligence. These cloud services include compute, storage, database, networking, bare metal, virtual private clouds and CDNs. Huawei has a presence in more than 170 countries and over 6,000 partners.

Pros

  • Large global footprint.
  • Highly rated user interface.
  • Has more than 70 global security certifications.
  • Robust support for AI and ML technologies.
  • Strong compliance standards.

Cons

  • Customization is limited with certain services.
  • Some complaints about customer support.
  • Can be expensive.
  • User complaints about limited flexibility in certain services and modules.

Hitachi Vantara's Radhika Krishnan on Data Fabric and Data Management
https://www.eweek.com/big-data-and-analytics/hitachi-vantaras-radhika-krishnan-data-fabric-data-management/ (September 10, 2021)

I spoke with Radhika Krishnan, Chief Product Officer for Hitachi Vantara, about the role of data fabrics, and how data storage and data analytics are merging.

Listen to the podcast:

Watch the video:

  • James Maguire on Twitter: https://twitter.com/JamesMaguire
  • eWEEK on Twitter: https://twitter.com/eWEEKNews
  • eWEEK on Facebook: https://www.facebook.com/eWeekNews/
  • eWEEK on LinkedIn: https://www.linkedin.com/company/eweek-washington-bureau
