eWEEK Editors – Technology News, Tech Product Reviews, Research and Enterprise Analysis
https://www.eweek.com/author/eweek-editors/

Why Cloud Will Remain Dominant: 4 Reasons
https://www.eweek.com/cloud/why-cloud-will-remain-dominant/ | September 20, 2023

Cloud computing is having a moment. No longer an emerging trend, it is now mainstream: Gartner predicts that end-user spending on public cloud services will grow nearly 22% to more than $597 billion in 2023, up from $491 billion in 2022. Gartner also reports that 75% of organizations will utilize a cloud-driven digital transformation model by 2026.

If the massive $2 billion commitment by KPMG to invest in Microsoft Cloud and AI services is any indication, the AI arms race will fuel further investment in the cloud platforms needed to store and process the large data sets that AI applications require.

As demand for cloud services continues to rise, so does concern about “cloudflation” – storage costs driven up by higher energy prices, a pain being felt around the world. There are also valid concerns about how cloud data centers contribute to higher carbon emissions, which is especially relevant as climate change wreaks havoc across the U.S. and the globe.

Why Cloud Will Remain Dominant

There are four key reasons why the cloud will remain a dominant force and continue to be the centerpiece of enterprise and business computing for years to come:

The generative AI-fueled arms race

Economic uncertainty may factor into IT budgets and buying decisions, but it doesn’t seem to be slowing investment in the cloud. Going forward, generative AI will be a major factor driving many organizations’ tech stack purchasing decisions. Whether in the public or private cloud, generative AI is powered by large language models (LLMs), which process data in real time and require powerful, highly scalable compute. Combine that with the fact that the best AI solutions are now offered primarily as cloud services, and organizations are more motivated than ever to accelerate their transition to the cloud.

Also see: Top Generative AI Apps and Tools

Affordability

Even with rising energy costs, cloud computing is the far more affordable option – and this is true for everyone from the largest Fortune 500 enterprises down to SMBs.

Committing to the on-prem model means spending more on IT systems, applications, and hardware infrastructure, because that infrastructure has to keep up with performance objectives that change over time. This can be especially challenging for retail brands and their supply chain partners during holiday sales peaks, for example.

IT must support that increase in performance needs even if it is only temporary or seasonal. A usage-based model is always going to be less expensive than buying an on-prem system or software, which also leads to contract lock-in. And while cloud data centers do require energy and water to operate, companies are in a better position to meet their ESG and sustainability goals when they’re not running on-site servers to power their IT infrastructure.

Application updates, security, and economies of scale

The cloud is better at regular application updates, security, and economies of scale. Despite a recent high-profile cloud data breach, cloud platforms overall offer better security because their reputations as trusted partners depend on it.

For example, the big cloud platforms have hundreds of top-notch engineers, cybersecurity professionals, and IT staff united in the goal of keeping their customers’ data safe. Microsoft, AWS, and Google also have what most small to mid-sized companies do not – economies of scale. They provide seamless application updates, which include patching applications and detecting and quashing vulnerabilities before breaches occur.

Companies with strapped or overworked IT staff benefit from the 24/7/365 support that global cloud platform providers offer. From a security, scale, and even a sustainability/ESG standpoint, the cloud is the clear winner.

Also see: Top Cloud Service Providers and Companies

Issues with On-Prem

On-prem isn’t a good fit for the way we work today. While some CEOs seem to be souring on the work-from-home trend, some form of hybrid work ecosystem will likely continue well into the foreseeable future. The changed 9-to-5 landscape demands a level of accessibility that on-prem software can’t offer for teams spread across multiple regions, time zones, and countries.

The growth of software-as-a-service (SaaS) over the past five years, boosted heavily by the pandemic, provides the widespread application and network accessibility that companies require to stay efficient, productive, and competitive in today’s business landscape.

Bottom Line: The Cloud and Sustainability

A growing number of companies are making commitments to being better “corporate citizens” and achieving carbon neutrality, which is good news. And even better news is that carbon footprint reduction is top of mind for the Big Three: Microsoft, AWS and Google. All three have confirmed their commitment to sustainability, energy efficiency and the reduction of their carbon footprint. Both Microsoft and AWS plan to power all of their respective data centers with 100% renewable energy by 2025.

Companies don’t have to rely entirely on their providers for a “greener” cloud; they can also make a conscious effort by being more selective about how they consume cloud services, how much data they store unnecessarily, and where their cloud providers’ data centers are located.

Despite some downsides to the cloud, it’s still the best strategy for everyone from SMBs to the global Fortune 500 from a productivity, sustainability, and affordability standpoint.

For more information, also see: Digital Transformation Guide

About the Author:

Scott Francis, Technology Evangelist at PFU America, brings more than 30 years of document imaging expertise to his position, where he’s responsible for evangelizing Ricoh’s industry-leading scanner technology.

Reshoring Alleviates Supply Chain Issues – But It Needs Tech to Control Costs
https://www.eweek.com/it-management/reshoring-alleviates-supply-chain-issues/ | August 10, 2023

In the post-pandemic world of skills shortages, supply chain disruptions, and geopolitical issues, manufacturers are struggling to operate at full capacity. In a bid to tackle these issues, manufacturers and logistics providers have sought solutions nearer to home – they have “reshored” operations.

Reshoring’s primary goal is to regain control over the entire end-to-end supply chain—it’s about manufacturing products on local soil, and it’s a process that’s been gaining traction with companies worldwide.

From a North American perspective, the picture is no different. Many U.S. companies have begun the shift away from globalization as the default, with research suggesting that nearly 350,000 jobs were reshored to the U.S. in 2022—a notable increase over the 2021 figure of 260,000.

The movement has also seen companies become less reliant on China. Now, many economies, including the U.S., India, and the European Union, are looking to establish a roadmap that will balance supply chains and increase resiliency. The China Plus One strategy is an approach adopted by a number of businesses looking to diversify their sourcing beyond China to other destinations. Already, numerous companies have turned to Vietnam and India as alternatives, with both countries reporting an uptick in investment from U.S. companies that have built plants there.

According to the Reshoring Initiative 1H 2022 Data Report, supply chain gaps, the need for greater self-sufficiency, and a volatile geopolitical climate are major factors driving reshoring. The report found that 69% of companies cited supply chain disruptions as the primary reason for reshoring.

There is now movement on a national level to strengthen supply chains and promote domestic manufacturing with the introduction of the bipartisan National Development Strategy and Coordination Bill in December 2022. This bill highlights the importance of manufacturing reshoring to national economic development going forward into 2023.

Sustainability and Tech in Reshoring

Recent research commissioned by IFS, polling senior decision-makers at large enterprises globally, found that 72% have increased their use of domestic suppliers relative to international ones.

From a sustainability perspective, there are huge benefits to be gained. In fact, reshoring is giving manufacturers a golden opportunity to look hard at their manufacturing processes and how they can make them more sustainable.

For example, it can reduce CO2 emissions as transport is cut and spur a reduction in wasteful overproduction as supply chains are brought closer together. As the whole world strives to act more sustainably in the race to net-zero, environmental benefits will play a huge role in driving new sourcing strategies.

However, the raw materials, components, and products that manufacturers source from suppliers are likely to become more expensive, especially as inflation continues to gather pace globally. As a result, 53% of those surveyed have considered increasing the proportion of materials and components they produce in-house. But again, these measures and others like them that organizations are taking to mitigate risk are likely to add cost, complexity, and waste to the supply chain.

Therefore, reshoring is not a silver bullet that eliminates supply chain disruption entirely. Often, companies underestimate the sheer level of effort, costs, and logistical planning required to make reshoring a success.

But for many U.S. companies, the extra cost of manufacturing within the country is outweighed by savings on customs and shipping and by the sustainability benefits of avoiding offshore operations.

It’s here that organizations need the helping hand of technology—in fact, it can be a key facilitator in solving the supply chain, labor, and production challenges associated with reshoring.

For 94% of respondents in a recent McKinsey study, Industry 4.0 helped keep operations running during the COVID-19 pandemic, and 56% said Industry 4.0 technologies had been critical to responding efficiently.

A new IDC InfoBrief, sponsored by IFS and entitled Shaping the Future of Manufacturing, shows a clear correlation between digital maturity and profit. According to the research, manufacturers reporting an optimized level of digital transformation saw profits increase 40%, while those with less advanced digital transformation maturity suffered bigger reductions in profit in the last fiscal year.

Tech has been quick to respond to the call to deliver the agility and fast “Time to Insight” (TTI) that manufacturers need to better forecast demand and provide a more detailed view of sustainability across product supply chains. Exceptional supply chain management will be a vital part of the move to reshoring. The IFS study showed that supply chain management is now seen by 37% of respondents as one of the top three challenges their organization is trying to solve through technology investment.

Reshoring in Action: Will the Benefits Be Worth It?

In a recent Kearney index on manufacturing reshoring, 92% of executives expressed positive sentiments toward reshoring. And that’s no surprise when you consider the additional benefits on offer. As well as a more protected supply chain ecosystem, there are also positive societal benefits from the move to reshoring.

According to the U.S. Reshoring Initiative, in 2021 the private and federal push for domestic U.S. supply of essential goods propelled reshoring and foreign direct investment (FDI) job announcements to a record high.

From a broader perspective, there are many profitability and supply chain benefits at stake for manufacturers. For example, research found that 83% of consumers in the U.S. are willing to pay 20% more for American-made products, and 57% say that the origin of a product would sway their purchasing decision.

From a management standpoint, reshoring significantly increases control over operations: bringing operations to one centralized location gives businesses tighter control over processes. Manufacturers will also benefit from shorter supply chains, as much of today’s manufacturing is driven by IoT, AI, and machine learning capable of performing monotonous tasks around the clock.

On a day-to-day level, on-site teams will experience increased collaboration as reshoring drastically reduces the time difference between headquarters and the manufacturing plant.

Tech Needs to Drive Reshoring

It’s easy to see why the appeal of reshoring is prompting a move toward U.S.-based manufacturing initiatives. By addressing reshoring now with the right technology, efficiently and cost-effectively, manufacturers will put themselves in a great position to not only survive but also thrive long into the future.

Of course, as with any major transformation, there are hurdles to overcome. But the long-term results of reshoring, from increased employment to tighter manufacturing control, suggest it’s a journey worth embarking on. As more and more companies around the world look to reshore operations on home soil, manufacturers will need the guiding hand of a flexible and agile software platform to make reshoring a reality at scale.

About the Author:

Maggie Slowik is the Global Industry Director for Manufacturing at IFS.

5 Mistakes to Avoid In a Data Storage Refresh
https://www.eweek.com/big-data-and-analytics/data-storage-refresh/ | August 9, 2023

As data storage technology has evolved to offer more choices for different use cases—the flavor of the day is AI-ready storage—determining the right path for a data storage refresh requires a data-driven approach.

Decisions for new data storage must also factor in user and business needs across performance, availability and security. Forrester found that 83 percent of decision-makers are hampered in their ability to leverage data effectively due to challenges like outdated infrastructure, teams overwhelmed and drowning in data, and lack of effective data management across on-premises and cloud storage silos. Leveraging cloud storage and cloud computing, where AI and ML technologies are maturing fastest, is another prime consideration.

Given the unprecedented growth in unstructured data and the growing demand to harness this data for analytical insight and AI, the need to get it right has never been more essential. This article provides guidance on that topic by highlighting what not to do when performing a data storage refresh.

Also see: Top Cloud Companies

Mistake 1: Making Decisions without Holistic Data Visibility

When IT managers discover that they need more storage, it’s easy to simply buy more than they need – but this can lead to waste and to being stuck with the wrong storage technology later.

A majority of data (typically around 80%) goes cold and is no longer actively used within months of creation, yet it continues to consume expensive storage and backup resources. And given that you can now purchase additional storage instantly and on demand in the cloud, or via storage-as-a-service on-premises, there’s no reason to overprovision.

To avoid this common conundrum, get insights on all your data across all storage environments. Understand data volumes, data growth rates, storage costs and how quickly data ages and becomes suitable for archives or a data lake for future data analytics.

These basic metrics can help guide more accurate decisions, especially when combined with a FinOps tool for cost modeling different options. The need to manage increasing volumes of unstructured data across multiple technologies and environments, for many different purposes, is leading to data-centric rather than storage-centric decision-making across IT infrastructure.
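
As a rough illustration of the kind of visibility involved, the short Python sketch below scans a file tree and reports what share of capacity has gone untouched for six-plus months – the “cold” data that is a candidate for an archive tier or a data lake. The threshold and the path are assumptions for illustration only, not features of any particular storage or FinOps product.

import os, time

COLD_AGE_DAYS = 180  # illustrative threshold: data untouched for 6+ months counts as "cold"

def cold_data_share(root):
    """Walk a directory tree and total overall vs. cold capacity, judged by last access time."""
    now = time.time()
    total_bytes = cold_bytes = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # skip files we can't read
            total_bytes += st.st_size
            if (now - st.st_atime) > COLD_AGE_DAYS * 86400:
                cold_bytes += st.st_size
    return total_bytes, cold_bytes

total, cold = cold_data_share("/mnt/shared")   # the path is a placeholder
if total:
    print(f"{cold / total:.0%} of {total / 1e12:.2f} TB is cold and a candidate for archiving")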

Mistake 2: Choosing One-Size-Fits-All Storage

Storage solutions come in many shapes and forms – from cloud object storage to all-flash NAS, scale-out on-prem systems, SAN arrays, and beyond. Each type of storage offers different tradeoffs when it comes to cost, performance, and security.

As a result, different workloads are best supported by different types of storage. An on-prem app that processes sensitive data might be easier to secure using on-prem storage, for instance, while an app with highly unpredictable storage requirements might be better served by cloud-based storage that can scale quickly.

This again points to the need to analyze, segment, and understand your data. The ability to search across data assets for file types or metadata tags can help identify what you have and better inform how it should be managed. Avoid the one-size-fits-all approach by provisioning multiple types of storage solutions that reflect your different needs.

Also, less than 25% of data costs are in storage: the bulk of the costs are in the ongoing backup, disaster recovery, and protection of the data. So, consider the right storage type and tier as well as the appropriate data protection mechanisms throughout the lifecycle of the data.
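
A back-of-the-envelope model makes the point; all the rates below are illustrative assumptions, not vendor prices:

# Illustrative monthly cost model for 100 TB of primary data (all rates are assumptions)
primary_tb = 100
storage_rate = 20      # $/TB-month for primary capacity
backup_copies = 3      # retained backup copies
backup_rate = 10       # $/TB-month for backup capacity
dr_replica_rate = 20   # $/TB-month for a DR replica
ops_overhead = 1500    # $/month for backup software, egress, admin time

storage_cost = primary_tb * storage_rate
protection_cost = (primary_tb * backup_copies * backup_rate
                   + primary_tb * dr_replica_rate
                   + ops_overhead)
total_cost = storage_cost + protection_cost
print(f"Storage is {storage_cost / total_cost:.0%} of the ${total_cost:,} monthly bill")
# -> storage is roughly 24% in this example; the rest is backup, DR, and protection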

For more information, also see: Best Data Analytics Tools

Mistake 3: Becoming Locked into One Vendor

Acquiring all your storage from one vendor may be the simplest approach, but it’s almost never the most cost-effective or flexible.

You can likely build more cost-effective storage infrastructure if you select from the offerings of multiple vendors. Doing so also helps protect you against risks like a vendor’s decision to raise its prices substantially or to discontinue a storage product you depend on.

If you have other vendors in the mix, you can pivot more easily when unexpected changes occur. Using a data management solution that is independent of any storage technology is also a way to prevent vendor lock-in, by ensuring that you can move data from platform to platform without the need to rehydrate it first.

Mistake 4: Moving Too Fast

A sense of urgency tends to accompany any major IT migration or update, storage refreshes included. Yet, while it’s good to move as efficiently as you can, it’s a mistake to move so fast that you don’t fully prepare for the major changes that accompany a storage refresh.

Instead, take time to collect the data you need to identify the greatest pain points in your current storage strategy and determine which changes to your storage solutions will deliver the greatest business benefits. Be sure, too, to collect the metrics you need to make informed decisions about how to improve your data management capabilities.

Mistake 5: Ignoring Future Storage Needs

You can’t predict the future, but you can prepare for it by anticipating which new requirements your storage solutions may need to support. Trends like AI, sustainability, and the growing adoption of data services mean that the storage needs of the typical business are likely to change in the coming year.

To train AI models, for example, you may need storage that can stream data more quickly than traditional solutions. Likewise, implementing data services in order to support FinOps goals might mean finding ways to consolidate and share storage solutions more efficiently across different business units.

Conclusion: The Importance of a Storage Refresh

As organizations move from storage-centric to data-centric management, IT and storage architects will need to change the way they evaluate and procure new storage technologies.

The ability to analyze data to make nuanced versus one-size-fits-all storage decisions will help IT organizations navigate many changes ahead – be they cloud, edge, AI or something else still on the horizon.

Read next: What Is Data Visualization?

About the author:

Krishna Subramanian is COO, President & Cofounder of Komprise.

How Cloud Cost Optimization Can Help Solve the Tech Hiring Crisis
https://www.eweek.com/cloud/how-cloud-cost-optimization-can-help-solve-the-tech-hiring-crisis/ | July 22, 2023

For workers outside the tech sector, the Great Resignation is over. Voluntary employee resignations have receded nearly back to pre-pandemic levels, and employers no longer report the deep hiring and retention challenges for most roles that they faced a year or two ago.

For technology jobs, however, the story is different. As Gartner notes, “the tech talent crunch is far from over.” Demand for technology talent remains much higher than the supply, a trend that Gartner doesn’t expect to change until at least 2026.

This means that if you help manage a business that depends on workers with technology expertise, developing strategies for hiring and retaining tech workers remains critical. Without sufficient tech workers on staff, you risk falling behind because your business can’t build and maintain the systems it needs to remain competitive.

I can’t claim to have all of the answers on how to hire and retain skilled tech workers, but I do have one key suggestion: optimize your cloud spending.

Although reducing cloud spending may not seem like it would have a major direct impact on companies’ ability to find and retain skilled technology workers, the relationship between cloud cost optimization and hiring success runs deeper than you might think.

Allow me to explain by discussing the factors behind ongoing tech hiring challenges, as well as how cloud cost optimization can help reduce them.

Also see: Top Cloud Companies

What’s Causing the Tech Hiring Crisis?

The main reason why it remains so hard to find and retain effective technology workers is simple, at least at a high level: There is a shortage of individuals with the right skills across the market, and companies have more roles in areas like software development and IT to fill than candidates to fill them.

But if you dive deeper into the issue, you realize that the problem is about more than just misalignment between worker supply and demand. It also involves changing expectations among tech employees with regard to the technology they work with. Today’s engineers want to work with the latest, greatest technologies to build cool things, and companies that aren’t in a position to allow them to do that will struggle with hiring.

Salary, too, is part of the story. Salaries have been on the rise across the board, making it harder for businesses with tighter budgets to compete for tech talent.

The fact that the economy has been turbulent over the past year has also complicated matters. Higher borrowing rates and economic uncertainty mean that the typical organization faces much more pressure today than it did about a year ago to rein in costs. And because hiring freezes and layoffs are among the easiest ways to cut costs quickly, the current economy has increased employees’ anxieties about losing their jobs or having their teams’ headcount reduced.

This creates a recipe for employees with in-demand technology skills to be on the lookout for new job opportunities, making it harder for companies who do manage to hire effective employees to retain them.

Also see: Cloud Native Winners and Losers

Cloud Cost Optimization as a Tool for Tech Hiring and Retention

Now, let’s talk about how optimizing your cloud spending can help you conquer the challenges I’ve just listed.

When you cut cloud spending, you gain two critical advantages that translate to an enhanced ability to hire and retain skilled tech workers:

  • You free up financial resources that you can redirect toward personnel costs. If you spend less in the cloud, you have more to spend on employee salary, which means you’re likely to have more success competing for the limited supply of workers.
  • Cloud cost optimization initiatives tend to drive technology innovation because companies adopt newer technologies to help reduce their cloud spending while improving performance. As a result, businesses get the latest, greatest tech – which is exactly what skilled employees want to be working with.

In short, cloud cost optimization makes it easier to pay good employees more for the work they do, while also giving them more interesting work to do because they get to build and run more modern systems.

Let me emphasize, too, that you don’t need to slash your cloud spending by 20 or 30 percent to achieve these benefits. Even if you cut just a few hundred thousand dollars per year from your budget, that could easily be enough to increase engineer salaries by perhaps 25 percent – enough to give you a major edge against competitors when it comes to recruiting talent.
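
As a quick sanity check on that arithmetic – the team size and average salary below are assumptions chosen purely for illustration:

# Illustrative only: how a modest cloud saving translates into raises
annual_cloud_savings = 300_000   # "a few hundred thousand dollars per year"
engineers = 10                   # assumed size of the engineering team
avg_salary = 120_000             # assumed average engineer salary

raise_per_engineer = annual_cloud_savings / engineers
print(f"{raise_per_engineer / avg_salary:.0%} raise per engineer")   # -> 25%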

Plus, having a culture of cost optimization and cloud platform modernization helps to reduce anxieties among current employees about the direction of the company. When employees know that business leadership is making smart decisions about the technology it uses and how it uses it, they’re less likely to go exploring opportunities elsewhere.

Bottom Line

I’m not saying, of course, that optimizing cloud spending will solve all of your hiring and retention woes all on its own. But the more effectively you use your budget in the cloud, and the more modern the cloud technologies and platforms you adopt, the better positioned you are to move the needle when it comes to recruiting and keeping skilled workers.

Also see: Top Digital Transformation Companies

About the author:

Willy Sennott is the EVP of FinOps at Vega Cloud. He has 25+ years of experience in financial, marketing, and business analytics, helping clients and companies drive revenue growth, improve cost efficiency, and allocate capital effectively. At Vega Cloud, Sennott leads the FinOps practice and helps drive overall company strategy and product roadmap.

Generative AI in PR and Marketing: 3 Core Issues
https://www.eweek.com/artificial-intelligence/generative-ai-in-pr-and-marketing-3-issues/ | June 21, 2023

ChatGPT, the AI-powered chatbot launched by San Francisco-based OpenAI, has taken the world of B2B marketing and PR by storm. Amassing over 100 million users, the platform has outpaced the likes of TikTok and Instagram to claim the top spot as the fastest-growing web platform ever.

So, where do B2B communications sit amid all the hype and glory of ChatGPT? Is it a wake-up call, or are the doomsayers right? PR and marketing face three issues that will shape the generative AI game in the coming years.

The generative AI race is well and truly on – and it’s certainly got tongues wagging. Research into how professionals will look to use ChatGPT revealed mixed opinions – some wish to avoid the platform, while others have already put it to good use. Let’s take a look at the three key issues.

Also see: Top Generative AI Apps and Tools

1) So, Who’s Copying Who?

The media is all over ChatGPT, but as B2B PR and marketing professionals, our attention should be on human-generated copy and how marketing-speak has grown into something that already sounds like computer-generated discourse.

The issue isn’t that machines write like humans – it’s that humans are beginning to write like machines. ChatGPT should serve as a wake-up call for PR and marketing professionals to stop writing in marketing lingo and start using words to convey ideas and thoughts.

From an infinite number of monkeys writing Shakespeare, to the WSJ’s first Buzz Word Generator, to ChatGPT, artificial intelligence has gotten better but has not fundamentally changed in its basic capabilities.

ChatGPT is the ultimate wordsmith. We’ve all read press releases and articles that spew out words that sound compelling but say nothing – “I see the words, but what do they mean?” is a phrase often used in my company.

Many writers, bloggers, and content creators are producing copy with no interest in the subject of the copy. A machine can do that, and do it rather well!

Also see: Generative AI Companies: Top 12 Leaders

2) There Are Still Some Things ChatGPT Can’t Do – and Won’t Ever

PR and marketing agencies must be more than just wordsmiths. Quality writing is fueled by intention – we are trying to deliver subliminal corporate messaging in our press releases that gets across more than just a product launch. Coherence isn’t enough; communication is more complex and precise than that.

B2B professionals bring unique skills, perspectives, and relationships that cannot be replaced by AI. Often a single piece of content needs to support a number of different precisely targeted audiences – an editor, a buyer, and a C-level ratifier. Try telling that to ChatGPT!

The tool can assist with many tasks, but there are three essential components of effective PR and marketing that ChatGPT lacks: creativity, critical thinking, and emotional intelligence.

We humans are born to think outside the box, to come up with completely new and original ideas. Thinking outside the box is impossible for ChatGPT – it is the box.

These limitations highlight the complementary nature of AI and human B2B professionals. AI can perform certain tasks faster and more efficiently, but the human brain brings a unique skillset that is critical to effective Marketing and PR practices.

Critical thinking is fundamental to distinguishing causes from correlations, spotting bias and removing it, and telling a primary source from someone’s personal opinion. We know the distinction between our truth and their truth gets muddier by the day, but the human brain can figure it out!

Selling new and original developments and solutions requires targeting copy at different audiences with different needs. That takes critical thinking – something robots can’t do. AI chatbots can’t “read into a situation.” Our human emotional intelligence lets us understand and handle an interaction or debate that calls for more emotional communication.

But emotional intelligence isn’t all ChatGPT is lacking.

3) Staying Legal and Ethical – Complications Ahead

Trust is critical: users need to be able to believe that the text ChatGPT generates is factually correct. With people and machines creating tens of millions of new web pages daily, will using machine-generated content help your organization stand out from the rest, or will it cause copyright trouble?

Google’s position on AI-produced content is clear – companies that use AI-generated content to manipulate search rankings are violating its spam policies. So, where should B2B PR and marketers stand?

Clearly, trust is ChatGPT’s biggest weakness. Unlike with Google, you don’t know the source of the information, and you can’t judge it based on the type of site or the experience of the author. Google’s system of basing quality on the number of citations an article receives isn’t in place. Further research will be needed and that will take time, so people will keep returning to trusted sources and expertise.

The U.S. Copyright Office has now launched a new initiative to examine the copyright law and policy issues raised by artificial intelligence (AI), including the scope of copyright in works generated using AI tools and the use of copyrighted materials in AI training. The Copyright Office says this initiative has been launched “in direct response to the recent striking advances in generative AI technologies and their rapidly growing use by individuals and businesses.” And this begins right at the input stage, not the output.

ChatGPT and similar software use existing text, images, and code to create “new” work. The technology must get its ideas from somewhere, which means trawling the web to “train” on and “learn” from pre-existing content. OpenAI and similar providers have already been subject to multiple lawsuits arguing that AI tools illegally use other people’s work to build their platforms.

With the PR Council also weighing in on this issue, all we can do is wait for official guidance and standards on the use of AI in PR. For now, communications pros are urged to apply caution to any external-facing use of output from ChatGPT.

On a related topic: The AI Market: An Overview 

It’s Time to Work Together – Not Against!

B2B PR and Marketing professionals can undoubtedly reap many benefits from the phenomenon that is OpenAI’s ChatGPT technology – from providing valuable data-driven insights to streamlining repetitive tasks – and it certainly presents a prime opportunity to keep innovating as the technology matures.

B2B pros should treat AI as a complementary tool to achieve a higher level of consumer engagement.

Good PR and Marketing Pros must foster imagination and creativity, strategic and critical thinking, and emotional intelligence to ensure their strategies and their content stay ahead of the competition.

About the Author: 

Judith Ingleton-Beer, CEO of IBA International

5 Ways Event-Driven Architecture (EDA) Unlocks the Potential of ChatGPT
https://www.eweek.com/artificial-intelligence/5-ways-event-driven-architecture-unlocks-the-potential-of-chatgpt/ | June 15, 2023

From instant translations and idea generation to composing emails and essays from scratch, ChatGPT is beginning to filter into our everyday lives. According to a UBS study, the chatbot reached 100 million monthly active users in January, just two months after launch, making it the fastest-growing consumer application in history.

There are, however, drawbacks and limitations keeping it – and AI in general – from achieving its full potential. This is where event-driven architecture (EDA) can help, by facilitating the flow of information between the systems that “publish” events and the other systems that indicate interest in that kind of information by “subscribing” to topics.

Building applications with EDA is a perfect way to tie internal features together and make them more responsive. When ChatGPT is invoked, EDA absorbs requests and services them as capacity allows, helping to improve response times, cut down on unnecessary energy consumption, and even open new e-commerce opportunities for B2B and B2C businesses. Here’s how.

Also see: Top Generative AI Apps and Tools

5 Ways EDA Unlocks the Potential of ChatGPT

1) No Questions Asked! Enable Automatic Answers by Streamlining the Request and Response Cycle

Today, ChatGPT operates in what we techies call a “request/reply” fashion. Ask and ye shall receive, you might say. Now imagine if ChatGPT could proactively send you something it knows you’d be interested in!

For example, say you use ChatGPT to summarize and note action items from a Zoom meeting with a dozen participants. Instead of each participant raising a query, EDA would allow ChatGPT to send the notes to all attendees at the same time, including those who missed the meeting.

Everyone would be automatically and instantly up-to-date on meeting outcomes, requiring significantly less load on ChatGPT since it proactively sends one message to a dozen recipients instead of satisfying a bunch of request/reply interactions over time, thereby improving service levels for users.

Any group activity that needs the same ChatGPT-facilitated suggestions can benefit from this capability – for instance, teams working jointly on a codebase. Rather than ChatGPT separately suggesting changes and improvements to every developer in their IDE, each IDE would “subscribe” to suggestions, and the underlying EDA technology would push them out to all subscribed developers when they open the codebase.
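
As a minimal sketch of the pattern – using a toy in-memory broker rather than any particular EDA product – the publisher (here, a ChatGPT-backed service) emits the meeting summary from the earlier example once, and the broker fans it out to every subscribed attendee instead of each attendee making its own request:

from collections import defaultdict

class Broker:
    """Toy in-memory event broker: one publish on a topic reaches every subscriber."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self.subscribers[topic]:
            handler(event)

broker = Broker()

# Each meeting attendee subscribes once to the summary topic
for attendee in ["ana", "bo", "chen"]:
    broker.subscribe("meeting/123/summary",
                     lambda event, who=attendee: print(f"{who} received: {event}"))

# The ChatGPT side publishes the summary a single time; the broker handles the fan-out
summary = "Action items: ship v2 docs by Friday; Bo to follow up with legal."
broker.publish("meeting/123/summary", summary)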

On a related topic: What is Generative AI?

2) Reduce ChatGPT’s Energy Consumption with Intelligent Resource Utilization

ChatGPT is very resource-intensive – and therefore expensive – from a processing perspective, and it requires special chips called graphics processing units (GPUs). It uses quite a lot of them: the extensive GPU fleet (now estimated at upwards of 28,936 GPUs) required to train the ChatGPT model and process user queries incurs significant costs, estimated at between $0.11 and $0.36 per query.

And let’s not overlook the environmental costs of the model. The high power consumption of GPUs contributes to energy waste, with reports from data scientists estimating ChatGPT’s daily carbon footprint at 23.04 kgCO2e, which matches that of other large language models such as BLOOM.

However, the report explains “the estimate of ChatGPT’s daily carbon footprint could be too high if OpenAI’s engineers have found some smart ways to handle all the requests more efficiently.” So, there is clearly room for improvement on that carbon output.

By implementing EDA, ChatGPT can make better use of its resources by only processing requests when they are received, instead of running continuously.

Also see: 100+ Top AI Companies 2023

3) Eliminate ChatGPT Unavailability When at Capacity

ChatGPT needs to handle a high volume of incoming requests from users. The popularity, rapid growth, and unpredictability of ChatGPT mean it is frequently overwhelmed as it struggles to keep up with demand that can be extremely volatile – what we call “bursty.”

Today this leads to “sorry, can’t help you” error messages for both premium and free ChatGPT users. These recent ChatGPT outages indicate how saturated the system is becoming as it struggles to rapidly scale up to meet ever-increasing traffic and compete with new rivals such as Google Bard.

So where does EDA come in?

In the event of a ChatGPT overload, implementing EDA can buffer requests and service them asynchronously across multiple event-driven microservices as the ChatGPT service becomes available. With decoupled services, if one service fails, it does not cause the others to fail.

The event broker, a key component of event-driven architecture, is a stateful intermediary that acts as a buffer, storing events and delivering them when the service comes back online. Because of this, service instances can be added quickly to scale out without downtime for the whole system — thus, availability and scalability are improved.

With EDA assistance, users of ChatGPT services across the globe can ask for what they need at any time, and ChatGPT can send them the results as soon as they are ready. This will ensure that users don’t have to re-enter their query to get a generative response, improving overall scalability and reducing response time.
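
A rough, stdlib-only sketch of that buffering behavior follows; a real deployment would use a durable event broker rather than an in-process queue, and the flaky chatgpt_service function below is only a stand-in for “at capacity” errors:

import queue, random, threading, time

requests = queue.Queue()   # stands in for the event broker's buffered topic

def chatgpt_service(prompt):
    """Stand-in for the ChatGPT API; fails randomly to simulate 'at capacity' errors."""
    if random.random() < 0.5:
        raise RuntimeError("at capacity")
    return f"answer to: {prompt}"

def worker():
    while True:
        prompt = requests.get()
        while True:   # retry until the service can take the request
            try:
                print(chatgpt_service(prompt))
                break
            except RuntimeError:
                time.sleep(1)   # back off; the request stays buffered, the user isn't rejected
        requests.task_done()

threading.Thread(target=worker, daemon=True).start()

for p in ["summarize Q3 report", "draft release notes", "translate greeting to French"]:
    requests.put(p)     # producers never see an error; requests are absorbed by the buffer

requests.join()         # wait until everything has been serviced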

Also see: ChatGPT vs. GitHub Copilot 

4) Integrate ChatGPT into Business Operations to Disrupt the AI E-Commerce Marketplace

AI plays a critical role in the e-commerce marketplace – in fact, it is projected that the e-commerce AI market will reach $45.72 billion by 2032. So, it’s no surprise that leading e-commerce players are trying to figure out how to integrate ChatGPT into their business operations. Shopify, for instance, has developed a shopping assistant with ChatGPT that is capable of recommending products to users by analyzing their search engine queries.

EDA has the potential to enhance the shopping experience even further and help B2C and B2B businesses learn more about their customers. It does this by tracking key events at high volume from e-commerce platforms to help businesses understand patterns in customer behavior, such as what items are the most profitable in certain regions and what factors influence purchasing decisions.

This information can then be sent to a datastore for a ChatGPT-based machine learning model to use in predicting customer behavior and making personalized product recommendations. This is only the beginning of the development of these sorts of models.
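
Concretely, that flow is just a subscriber writing selected event fields to a datastore that a model later trains on. In the sketch below, the field names and the append-only JSON-lines file are assumptions for illustration, not a reference to any specific platform:

import json

def on_purchase_event(event, datastore_path="purchase_events.jsonl"):
    """Subscriber callback: keep only the fields useful for later model training."""
    record = {
        "sku": event["sku"],
        "region": event["region"],
        "price": event["price"],
        "ts": event["timestamp"],
    }
    with open(datastore_path, "a") as f:   # append-only JSON-lines "datastore"
        f.write(json.dumps(record) + "\n")

# Example event as it might arrive from an e-commerce platform's checkout topic
on_purchase_event({"sku": "HDST-42", "region": "EMEA", "price": 129.0,
                   "timestamp": "2023-06-15T10:42:00Z", "cart_id": "abc123"})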

5) Improve Responsiveness for Your Global User Base

Since ChatGPT and ChatGPT-based apps have a global user base, you want to distribute the data from your GPT queries efficiently. An event mesh is the perfect architecture to satisfy this demand.

An event mesh is an architecture layer composed of a network of event brokers that allows events from one application to be routed to and received by any other application, regardless of where they are deployed. Through this, you can dynamically route data on demand to interested subscribers rather than sending your ChatGPT results to every application and relying on application logic to filter them out. This results in a better user experience and saves compute and network resources.
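
The routing idea is easy to picture: each result is published to a descriptive topic, and only subscriptions whose pattern matches that topic receive it. The toy sketch below uses simple prefix matching; real event brokers offer richer wildcard syntax, and the topic names here are made up for illustration:

subscriptions = {
    "emea-app": ["gpt/results/emea/"],   # only wants EMEA query results
    "analytics": ["gpt/results/"],       # wants every result, any region
}

def route(topic, payload):
    """Deliver a published event only to subscribers whose prefix matches the topic."""
    for subscriber, prefixes in subscriptions.items():
        if any(topic.startswith(p) for p in prefixes):
            print(f"deliver to {subscriber}: {payload}")

route("gpt/results/emea/order-summary", {"answer": "42 open orders in EMEA"})
# -> delivered to emea-app and analytics; an APAC-only app would receive nothing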

Also see: ChatGPT vs. Google Bard: Generative AI Comparison 

Unleash the Full Potential of ChatGPT with EDA

ChatGPT may still be in its infancy, but with its rapid user adoption and regular new feature announcements, it seems the story is far from over. Whether it is used to address service outages and excessive energy consumption; enable greater scalability, resilience, and flexibility; or bring new business use cases to B2B and B2C organizations, EDA has the capacity to help this new generative AI tool build on its newfound success.

About the Author: 

Thomas Kunnumpurath is Vice President of Systems Engineering for Americas at Solace

Interoperability: Thriving in Uncertain Times
https://www.eweek.com/cloud/interoperability-thriving-in-uncertain-times/ | June 14, 2023

From the pandemic to supply chain volatility, economic uncertainty, and inflation—companies have faced an unprecedented number of black swan events over the past few years. To be successful, they need the ability to quickly integrate new technologies, people, and processes so they can pivot their business on a dime and navigate changing conditions.

Of course, this quick integration is no easy task – especially considering that over the last two years, one in two companies rapidly adopted new technologies and transformed their business in record time, according to new research from Accenture. In fact, the average company now has well over 500 software applications, from almost as many vendors, with 81 percent planning to add more over the next two years.

At a time when budgets are tightening, enterprises now need to focus on untangling their applications to ensure they work cohesively to enable agility and provide ongoing business value.

Also see: Top Cloud Service Providers and Companies

Cracking the Code

We’ve found that one in three companies have cracked the code and have managed to make their enterprise technologies work together.

Last year, these companies with high interoperability grew revenue six times faster than their peers with low interoperability, and they are poised to unlock an additional five percentage points in annual revenue growth. This is a huge financial advantage.

To put it in perspective, if two organizations start with $10 billion in revenue today, the organization with high interoperability stands to make $8 billion more than its peer with low interoperability over the next five years.

So, how does high interoperability create such a powerful impact? By integrating enterprise applications, businesses can move from siloed technologies to connected solutions that enable better data sharing, enhanced employee productivity and improved customer experiences.

GN Group, a global audio solution manufacturer, is a prime example of how high interoperability can enable organizations to take advantage of opportunity. Following a 42 percent rise in headset sales in 2020, the company braced for further demand surges due to remote school and work. When sales jumped 82 percent in the first quarter of 2021, company leaders knew they needed to unite employees and technology under a common strategy to meet the increased demand—and fast.

They turned to Microsoft’s cloud-based enterprise solutions to connect functional applications – like supply chain operations and finance – so that employees across the organization could make decisions based on a single source of trusted data.

In this way, the sales team could check if procurement had the available components for a large incoming order. Similarly, vendors and suppliers, who were previously late to learn of new demand, could also make informed inventory decisions.

By removing data silos and creating a common language across critical applications and systems, GN enabled parallel, rapid transformation in multiple business areas.

Also see: Top Digital Transformation Companies

Long-Term Value Without a Big Price Tag

The concept of interoperability isn’t new, but the ability to manifest it is. As companies move to the cloud and have access to improved and inexpensive applications, interoperability not only becomes a source of long-term value—it becomes low-cost too.

Leading companies are achieving high interoperability with just 2-4 percent higher IT and functional budgets directed at applications. The investment is helping them outperform their peers across industries and economic cycles.

Consider life sciences, an industry that grew rapidly with the global demand for vaccinations. Companies with high interoperability grew revenue by almost 10 percent on average, while those with low or no interoperability managed only a five percent gain.

The travel industry, which was hit particularly hard by the pandemic, saw revenue decline by four percent, on average, at low-interoperability companies. In contrast, high-interoperability companies were able to quickly pivot their business models and grow revenue by two percent.

For more information, also see: Digital Transformation Guide

The Three C’s

Across industries, interoperability is a common denominator for success. There are three best practices for getting to high interoperability in an era of compressed transformation.

  1. Leverage the Cloud: By moving existing applications to the cloud and adopting cloud-based enterprise applications, companies can connect data and experiences, which helps to standardize processes and drive change across the organization in parallel. Seventy-two percent of companies with high/medium interoperability have adopted public cloud and have already migrated 30% of their data and workloads.
  2. Use Composable Tech: Moving away from a technology architecture of static, monolithic, standalone parts to one composed of composable pieces helps to boost agility. Repeatable solutions that can be configured and reconfigured to rapidly develop new capabilities enable companies to build flexibility into the core of their business. These solutions can be curated for specific industries and functions and act as a form of future-proofing—giving organizations the dexterity to quickly adopt the technologies of tomorrow. With data flowing between connected applications, companies can easily share information with the entire organization so everyone is on the same page.
  3. Meaningful Collaboration: Interoperable applications are only one part of the equation. Interoperability supports meaningful collaboration by allowing functions and people to work together seamlessly toward a common goal. Real-time data, analytics, and AI, together with new ways of working, can unlock the value of technology and empower people to achieve better outcomes. Companies with high interoperability have an unwavering focus on improving human connections.

It’s clear today, more than ever, that companies must anticipate uncertainty in all its forms. By leveraging the cloud, using composable tech, and focusing on collaboration, companies can improve interoperability to overcome obstacles and outpace competitors in growth, efficiency and resiliency.

For more information, also see: Cloud and AI Combined: Revolutionizing Tech 

About the Author: 

Brian McKillips is Senior Managing Director and Growth and Strategy Lead for Enterprise & Industry Technologies, Accenture

The post Interoperability: Thriving in Uncertain Times appeared first on eWEEK.

]]>
3 Ways CIOs Play Both Tech Offense and Defense https://www.eweek.com/cloud/cios-play-both-tech-offense-defense/ Tue, 30 May 2023 22:12:13 +0000 https://www.eweek.com/?p=222260 In today’s uncertain economic environment, business leaders are looking to CIOs to accelerate key priorities while hitting aggressive financial targets. This often means charging ahead with digital initiatives that set us up for future success, while simultaneously lowering costs and optimizing for efficiency. If it sounds like these priorities are often in conflict, you’re not […]

The post 3 Ways CIOs Play Both Tech Offense and Defense appeared first on eWEEK.

]]>
In today’s uncertain economic environment, business leaders are looking to CIOs to accelerate key priorities while hitting aggressive financial targets. This often means charging ahead with digital initiatives that set us up for future success, while simultaneously lowering costs and optimizing for efficiency.

If it sounds like these priorities are often in conflict, you’re not wrong.

Over the past few years, multi-cloud has emerged as one of the most effective ways for organizations around the globe to sharpen their competitive edge. Recent research from Vanson Bourne, commissioned by VMware, found that nearly all (95%) organizations believe multi-cloud architectures are critical for business success.

However, several years into the multi-cloud journey, organizations are starting to see the financial and operational costs of moving fast across multiple clouds. According to the same study, 76% of multi-cloud organizations reported a need to improve control over their cloud expenses.

It’s not enough to just enable the business – it’s incumbent on the CIO to also mitigate spiraling costs. Simply put, as macroeconomic concerns rise, CIOs must play both offense and defense.

Here are three ways we can all be playing both sides of the field.

Also see: Top Cloud Service Providers and Companies

1. Invest In Cloud Mobility

The value of a multi-cloud strategy is in the flexibility and freedom to access your data from anywhere and at any time. The primary risk in realizing this potential is in commercial and technical lock-in.

If your cloud provider changes their pricing model, are you commercially locked in, or can you seamlessly migrate workloads and applications?

If the next global news headline has you concerned about data privacy and security, does your team have the skills to adopt a new cloud architecture?

By designing and refactoring with mobility in mind, you can easily move applications between cloud providers without requiring significant changes to the underlying architecture, thus reducing the risk of becoming dependent on a particular provider.

Maintaining mobility also offers benefits such as empowering developers to choose the right cloud for the right app, leveraging best-in-class AI and ML capabilities for cross-cloud analysis and mitigation, and enabling “anywhere work” for employees all over the world while complying with data sovereignty regulations.
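One common way to preserve that mobility is to keep a thin abstraction layer between application code and provider-specific services. The sketch below is a minimal, hypothetical illustration of the idea for object storage: the application depends only on an ObjectStore interface, and switching providers means writing another adapter rather than rewriting the application. The class and method names are invented for the example and are not taken from any particular cloud SDK.

```python
from abc import ABC, abstractmethod
from pathlib import Path


class ObjectStore(ABC):
    """Provider-neutral interface the application codes against."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class LocalDiskStore(ObjectStore):
    """Stand-in adapter for the example; a real deployment would add
    S3- or GCS-backed adapters implementing the same two methods."""

    def __init__(self, root: str) -> None:
        self.root = Path(root)

    def put(self, key: str, data: bytes) -> None:
        path = self.root / key
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_bytes(data)

    def get(self, key: str) -> bytes:
        return (self.root / key).read_bytes()


def archive_report(store: ObjectStore, report_id: str, contents: bytes) -> None:
    # Application logic never mentions a specific cloud provider.
    store.put(f"reports/{report_id}.pdf", contents)


if __name__ == "__main__":
    archive_report(LocalDiskStore("/tmp/demo-store"), "q3-summary", b"%PDF-1.4 ...")
```

The storage logic itself is beside the point; what matters is that the cost of switching providers is contained in one adapter instead of being scattered throughout the application.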

Also see: Top Digital Transformation Companies

2. Improve Visibility To Control Cloud Spend

Rising cloud costs can be the Achilles heel of app innovation. While developers drive demand for multi-cloud technology, CIOs are often left to clean up the financial aftermath of successful app launches.

Providing visibility to your application and service owners is the first step to reining in spend and protecting against ballooning costs. With visibility comes accountability: where application owners could previously optimize only for security, scale, and availability, today they can also optimize for cost.

Think about an apartment complex that includes utilities as part of the monthly rate. How closely would you monitor your electricity or water usage if you never received a bill?

With cloud management solutions, application owners can share in the responsibility of managing spend by looking for patterns and identifying opportunities to reduce use. Just last year, VMware filed a patent for a cloud “dimmer switch” that reduces use the moment it’s no longer needed. This innovation has not only reduced spend but energy consumption, as well.
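As a minimal, hypothetical illustration of what that visibility can look like, the sketch below takes a simplified billing export and rolls cost up by an owner tag, so each application team sees its own share of the bill. The field names and figures are invented for the example; real cloud providers expose similar data through their billing and cost-management exports.

```python
from collections import defaultdict

# Simplified stand-in for a cloud provider's billing/usage export.
billing_records = [
    {"service": "compute", "owner": "checkout-app", "cost_usd": 1240.50},
    {"service": "storage", "owner": "checkout-app", "cost_usd": 310.00},
    {"service": "compute", "owner": "analytics", "cost_usd": 2890.75},
    {"service": "egress",  "owner": "analytics", "cost_usd": 415.20},
]

def cost_by_owner(records):
    """Aggregate spend per owning team so each team sees its own bill."""
    totals = defaultdict(float)
    for record in records:
        totals[record["owner"]] += record["cost_usd"]
    return dict(totals)

if __name__ == "__main__":
    for owner, total in sorted(cost_by_owner(billing_records).items()):
        print(f"{owner:15s} ${total:,.2f}")
```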

For more information, also see: Digital Transformation Guide

3. Simplify Your Landscape

In response to the rapid pace of change in the industry, businesses are quick to add powerful technology and services to their portfolios, but much slower to fully retire the legacy systems those new tools were meant to replace. This creates a cumbersome tech web that can frustrate your business and slow down innovation.

When companies take the opportunity to reduce the number of legacy applications and systems, they can not only save money and increase productivity, but also reduce their attack surface and risk profile. Agility is key here: to be agile and give developers the environment they need, we have to simplify the landscape.

Additionally, why go to great lengths to attract and hire top talent only to tether them to legacy apps that are no longer fit for purpose? With a simpler landscape, engineers who manage liability and risk are no longer saddled with patching and monitoring a grab-bag of underutilized systems and applications. Shed the weight and invest in the future.

Also see: Cloud Native Winners and Losers 

Bottom Line: Empowering the Team on Both Sides

Leaders need to leverage their employees’ skills and empower them to help the business grow, while enabling customers to prosper. Fostering this growth mindset requires up-skilling and cross-skilling employees so they can succeed on both offense and defense: innovating fast, while controlling costs and complexity.

We need to enable teams to get rid of dead weight and embrace new models to drive efficiency and better secure our businesses. At the same time, we need to reward them for thinking about both the short-term gains and long-term ROI of embracing a cloud-smart operating model.

About the Author: 

Jason Conyard is Senior Vice President and Chief Information Officer for VMware.

The post 3 Ways CIOs Play Both Tech Offense and Defense appeared first on eWEEK.

]]>
Real-Time Data Management Trends https://www.eweek.com/big-data-and-analytics/real-time-data-management-trends/ Thu, 23 Mar 2023 14:27:20 +0000 https://www.eweek.com/?p=220394 Real-time data management is the application of intelligence to data as soon as it’s created or acquired, rather than being stored for later analysis. Data is processed and forwarded to users as soon as it’s collected – immediately without any lag. This ultra-rapid data management is considered crucial for supporting real time, in-the-moment decision making. […]

The post Real-Time Data Management Trends appeared first on eWEEK.

]]>
Real-time data management is the application of intelligence to data as soon as it’s created or acquired, rather than storing it for later analysis. Data is processed and forwarded to users as soon as it’s collected, with no lag. This ultra-rapid data management is considered crucial for supporting real-time, in-the-moment decision making.

Real-time data is especially valuable for businesses, for a multitude of reasons. It can provide immediate insight into sales trends, and it can just as quickly surface security vulnerabilities or degradation of the corporate IT infrastructure.

With digital transformation initiatives well underway, companies are investing in strategies to ingest large volumes of data that enable them to make the right decisions in the moments that matter. Handling the sheer volume and complexity of this data store is exceptionally challenging.

As enterprises meet these data-intensive digital demands, here are five real-time data management trends we anticipate over the next year.

Also see: 7 Digital Transformation Trends Shaping 2022

Data Visualization to Identify Patterns and Trends

Whether it’s real time or sitting at rest in a database, data is nothing more than numbers without visualization. To bring real-time data to life, you need real-time data visualization. Visualization, such as charts, graphs, maps, or other colorful displays, can give you an edge over the competition by mapping out not only the data itself, but also where activity is trending.

Data visualization works best at alerting companies when something is abnormal or out of the ordinary. For example, a spike in outbound network traffic would show up on the meter, giving anyone watching a visual cue that there is a sudden flow of traffic that should be investigated.

It has positive uses as well. If a company notices an uptick in sales during certain parts of the day, that shows up on a chart and commands immediate attention. Real-time visualization enables people to act on emerging opportunities as they appear, or to head off a potential negative outcome.

This enables decision-makers to get ahead of the curve and respond quickly and early on, rather than after the fact when an opportunity is missed or damage is done. It also encourages more data interaction, because no one wants to sit and pore through numbers. In contrast, a graph or chart that summarizes 24 hours of activity in one picture is far more accessible.

With real-time data analysis, companies can identify trends and monitor how well they are achieving goals. They can access data remotely, monitor purchases, manage resources, and help secure their network.
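As a rough sketch of the alerting side of this, assume a stream of per-minute outbound-traffic readings; the code below flags any reading that sits several standard deviations above the recent baseline, which is the same signal a spike on a dashboard gives a human observer. The window size and threshold are arbitrary choices for the example, and a real system would feed such flags into its visual layer.

```python
from statistics import mean, stdev

def find_spikes(readings, window=30, threshold=3.0):
    """Flag readings more than `threshold` standard deviations above
    the mean of the preceding `window` readings."""
    spikes = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and readings[i] > mu + threshold * sigma:
            spikes.append((i, readings[i]))
    return spikes

if __name__ == "__main__":
    # Mostly steady outbound traffic (MB/min) with one abnormal burst.
    traffic = [100 + (i % 7) for i in range(60)]
    traffic[45] = 900
    for minute, value in find_spikes(traffic):
        print(f"minute {minute}: {value} MB/min looks abnormal")
```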

Also see: Top Data Visualization Tools

Data Security is More Important Than Ever

The federal government has issued a Federal Zero Trust Strategy that requires, among other things, the adoption of zero trust security measures across federal agencies, with knock-on effects for the private-sector vendors that serve them.

The zero trust strategy will enable agencies to more rapidly detect, isolate, and respond to cybersecurity threats and intrusions. The Office of Management and Budget (OMB) has issued a series of specific security goals for agencies, aligned with existing zero trust models.

For the unfamiliar, zero trust is a network security design that takes its name literally. One of the knocks on cybersecurity is that once the bad guys have breached your firewalls, they can move around within your network with impunity. A zero trust network requires validation and credentials to move anywhere within the network. It is a much stricter approach, designed to bottle up anyone who breaches the outer wall.

Because of this design, the federal strategy puts a great deal of emphasis on enterprise identity and access controls, including multi-factor authentication, along with encryption and comprehensive auditing.
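A minimal sketch of that “validate every request” idea is shown below, using entirely hypothetical policy fields: instead of trusting a request because it originates inside the network, every access decision re-checks identity, multi-factor authentication, device posture, and the caller’s entitlement to the specific resource.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    mfa_verified: bool
    device_compliant: bool   # e.g., patched, disk-encrypted, managed
    resource: str

# Hypothetical entitlement table; a real system would query an
# identity provider and policy engine rather than a dict.
ENTITLEMENTS = {
    "alice": {"payroll-db", "hr-portal"},
    "bob": {"build-server"},
}

def authorize(req: AccessRequest) -> bool:
    """Evaluate every request on its own merits -- network location
    is never part of the decision."""
    if not req.mfa_verified or not req.device_compliant:
        return False
    return req.resource in ENTITLEMENTS.get(req.user, set())

if __name__ == "__main__":
    print(authorize(AccessRequest("alice", True, True, "payroll-db")))    # True
    print(authorize(AccessRequest("alice", True, True, "build-server")))  # False
```

In practice these checks are delegated to an identity provider and a policy engine rather than hard-coded, but the shape of the decision is the same.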

Also see: Best Data Analytics Tools

Invest in Digital Tools That Improve the Customer Experience 

Research from Boston Digital found 83% of customers are willing to switch to a brand with a better digital experience, and 70% of customers are more likely to trust brands that provide a great digital experience.

That means customers are increasingly loyal to brands that offer a direct relationship and know their wants and needs. It also means they won’t hesitate to leave if they’re not getting the experience they want, and it’s extremely easy to switch consumer loyalties these days.

Therefore, it is important to invest in the right tools to continuously and actively optimize your digital presence to meet customer needs, and to stay on top of trends as they shift.

Also see: Guide to Data Pipelines

Businesses will Reinvent Customer Profiles with Real-Time Data

For years, companies have easily fetched customer identity and other information from cookies. With third-party cookies on the way out, and with more than 70 percent of the world’s population protected by privacy regulations, businesses will have to adopt new targeting strategies to quickly recommend a product or decide whether a transaction is fraudulent.

As identity becomes less of a fixed or known data point, enterprises need to immediately analyze a massive swath of data, look for patterns, and extrapolate a likely persona for targeting. They will need to find patterns in real time that target individuals based on attributes or behaviors other than a cookied identity.

As data volumes constantly grow and demand for real-time transactions increases, these trends will cut across all industries where scaling is instrumental to survival. For example, ad tech is experiencing a renaissance, garnering significant investment, innovation, and attention as platforms rapidly seek to serve ads to targeted audiences at petabyte scale.

Likewise, with massive amounts of data streaming from mobile, 5G, and IoT sensor applications, telecom companies need to quickly ingest data and then process it at petabyte scale with virtually no latency.

As we move forward, enterprises need to embrace the opportunities and challenges ahead of them and manage real-time data in new ways to drive successful business outcomes.

Bottom Line: Real-Time Data Management in 2023

Real-time data management is the new normal in business. The days of running a batch job on a mainframe overnight and analyzing the results the next day are largely gone. The good news is that there are plenty of tools to enable real-time processing, ranging from Apache open-source software to commercial offerings from leaders like IBM and SAP.

Data is also coming from many sources that weren’t around 10 or 20 years ago, including social media, the edge/IoT, and mobile users. So real-time data analytics does not just cover your databases and network security; it covers your entire enterprise.

At the same time, data is more regulated than ever. There have been significant data breaches in recent years, and companies have paid massive fines for their sloppiness. So regulatory compliance and data protection are as important as extracting value from the data.

About the Author: 

Lenley Hensarling, Chief Strategy Officer, Aerospike

Additionally, tech journalist Andy Patrizio updated this article in 2023. 

The post Real-Time Data Management Trends appeared first on eWEEK.

]]>
Data Sharing Made Easy: The New Era of Data Monetization https://www.eweek.com/cloud/data-sharing-data-monetization/ Fri, 23 Sep 2022 18:01:06 +0000 https://www.eweek.com/?p=221432 As enterprises embrace accelerated digital transformation, new ways of sharing data across organizations without compromising privacy have emerged. As enterprises continue to embrace accelerated technology transformation in response to the disruption presented by COVID-19, some have emerged as digital vanguards, finding new and creative ways to put emerging tech to work. At the center of […]

The post Data Sharing Made Easy: The New Era of Data Monetization appeared first on eWEEK.

]]>
As enterprises embrace accelerated digital transformation, new ways of sharing data across organizations without compromising privacy have emerged.

As enterprises continue to embrace accelerated technology transformation in response to the disruption presented by COVID-19, some have emerged as digital vanguards, finding new and creative ways to put emerging tech to work. At the center of their pioneering success are innovative ways of sharing data across and between organizations without compromising privacy.

As COVID-19 turns toward the endemic phase, continuing challenges echo throughout the global economy. Persistent labor and resource shortages, hybrid work models, an explosion of new devices, and supply chain disruptions, to name a few, require new solutions.

The 13th annual Deloitte Tech Trends Report, Data Sharing Made Easy, shows that pioneering organizations are navigating this volatility by automating, abstracting, and outsourcing many of their historically in-house capabilities to the cloud. Nowhere is this shift more pronounced than in the ascendance of cloud-based data platforms.

Effective data management is at the core of success in this disrupted marketplace. One trend in Deloitte’s Tech Trends Report explains how new technologies are giving rise to advanced business models by simplifying the mechanics of data sharing across and between organizations – without compromising privacy.

Also see: Best Data Analytics Tools 

Unlocking the Possibilities of Shared Data

Making innovative use of shared data isn’t a pipe dream. For advanced organizations, this is reality.

For example, when COVID-19 vaccines became widely available in the spring of 2021, CVS Health (CVSH) leveraged external data from vaccine suppliers and the Centers for Disease Control and Prevention (CDC) to forecast supply and demand.

The team established governance immediately to prioritize data protection and compliance with privacy and data security laws and then fed this information into internal systems that enabled patients to schedule appointments, partners to set up clinics, and analysts to measure campaign effectiveness.

The team also shared data externally with research agencies and universities to help gauge vaccination rates in the population. As the vaccine rollout continued, CVSH used demographic and demand data to identify underserved areas to facilitate access to vaccines where they were most needed.

Connecting more easily with existing partners is only part of the story. Cloud-native data platforms also encourage organizations to seek out and leverage external data that has been traditionally out-of-scope or otherwise off-limits to open a new arena of data-driven opportunities.

For example, industry data marketplaces can allow otherwise fierce competitors to resolve common challenges through collaboration. Consider banks in developing regions: Together, they could pool anonymized credit data to build an interbank credit risk model, unlocking new insights and opportunities to the shared benefit of all.

In the same vein, many manufacturers and retailers already purchase consumer data from third-party data brokers, but that data is often low-quality or too limited to make a significant business impact. Shared high-quality data between willing parties can allow every partner in the value chain, from suppliers to manufacturers to marketers, to pool customer data and create a higher-resolution picture of demand.

Also see: Top Business Intelligence Software 

Acquiring External Data Can Be Easy and Valuable

The recent proliferation of cloud-based data-sharing platforms allows organizations to buy and sell data more easily than in years past. The data sharing-as-a-service model lets subscribers manage, curate, and tailor data, and then aggregate and sell access to that data to other subscribers.

These models have already succeeded in music streaming and social media, where vendors provide easy-to-use platforms and customers provide content for sharing. Similar systems are at hand for businesses, and the data marketplace sector is in the midst of a “gold rush” as startups join incumbents to stake their claims on the marketplace.

This model is promising and has driven a significant surge in demand for high-quality, externally sourced data. No longer just a tool to inform high-level decision-making, data is increasingly being considered a business-critical asset that can be explicitly valued and monetized. This sea change underscores the need for savvy companies to explore what data marketplaces might be able to do for their bottom lines.

Still, despite the frenetic acceleration in this space, it remains early days. Governance, security, and pricing models continue to evolve as the business and technology community iterate in response to supply and demand. That said, as new participants continue to join the fray, the volume, variety, and value of these emerging data marketplaces only stands to grow.

Also see: Data Mining Techniques 

Keeping Privacy Issues in Check

Positive projections aside, every emerging technology trend carries the potential for risk. Privacy policies; competitive secrecy; and myriad safety, security, and governance concerns have historically hampered companies’ ability to share their data openly.

Enter a new class of computational approaches known together as privacy-preserving computing, which can make it possible to reap the benefits of data sharing without compromising discretion.

Emerging privacy-preserving techniques are complex and multifaceted:

  • Fully homomorphic encryption allows encrypted data to be shared and analyzed, without first decrypting it.
  • Zero-knowledge proofs enable users to prove their knowledge of a value without having to reveal the value itself.
  • The federated analysis technique allows companies to share insights from their analysis without sharing the data itself.
  • Differential privacy adds statistical noise to datasets, making it far harder to reverse-engineer the original inputs (a minimal sketch of this technique follows below).

At their core, each of these techniques, and others like them, enable rich collaboration without compromising competitive secrecy or data privacy. This “best of both worlds” approach can help mitigate data security risks and earn buy-in from customers, partners, and other stakeholders alike.
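To make the last of those techniques concrete, here is a minimal sketch of the classic Laplace mechanism behind many differential privacy implementations: a true count is perturbed with noise whose scale depends on the query’s sensitivity and the privacy budget epsilon, so the published figure remains useful in aggregate while bounding what can be inferred about any one individual. The numbers and parameter choices are illustrative only.

```python
import numpy as np

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count with Laplace noise scaled to sensitivity/epsilon.
    Smaller epsilon means stronger privacy and a noisier answer."""
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

if __name__ == "__main__":
    # e.g., "how many customers in this segment bought product X?"
    true_answer = 1_842
    for eps in (0.1, 1.0, 10.0):
        print(f"epsilon={eps:>4}: reported {dp_count(true_answer, eps):,.1f}")
```

Smaller values of epsilon buy stronger privacy at the cost of noisier answers, which is exactly the trade-off partners weigh when they agree to share data this way.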

Also see: Top Digital Transformation Companies

How to Take the Next Step

What does this mean for businesses moving forward? First, enterprise leaders must stay focused on today’s data management initiatives — even those that can’t be solved by data sharing. Strong data governance, quality, and metadata initiatives, for example, are still essential hygiene for success in the modern marketplace.

Second, technology and business leaders alike must recognize that these new tools and approaches, while potentially disruptive, won’t change their organizational culture overnight. Companies of all sizes have deeply entrenched processes and standards for managing and accessing data.

Established companies may have strict, fixed practices, while startups and digital natives may take a more relaxed approach. Some businesses may be less willing to share data, or be inherently wary, even with anonymized data. Others may need to take a long look in the mirror and adapt accordingly to overcome these cultural challenges.

Still, these aren’t reasons to avoid exploring and, when ready, embracing the trend. Organizations at the vanguard are already reaping benefits and leaving their less-aware and/or less-prepared competitors in the dust. Enterprises in every industry have an increasingly liquid asset at the ready. Now might be the time to make the most of it.

About the Authors: 

Mike Bechtel, Chief Futurist, and Nitin Mittal, AI Leader, at Deloitte Consulting LLP

The post Data Sharing Made Easy: The New Era of Data Monetization appeared first on eWEEK.

]]>