Introducing Adaptive Protection in Microsoft Purview—People-centric data protection for a multiplatform world

At Microsoft, we never stop working to protect you and your data. If the evolving cyberattacks over the past three years have taught us anything, it’s that threat actors are both cunning and committed. At every level of your enterprise, attackers never stop looking for a way in. The massive increase in data—2.5 quintillion bytes generated daily—has only increased the level of risk around data security.1 Organizations need to make sure their information is safe from malicious attacks, inadvertent disclosure, or theft. During the third quarter of 2022, insider risks, including human error, accounted for almost 35 percent of unauthorized access incidents.2 But on the positive side, we’re seeing a growing awareness across all areas of organizations about the need to safeguard data as a precious resource.

Our customers have been clear in voicing their need for a unified, comprehensive solution for data security and management, one that’s as scalable as their business needs. In the Go Beyond Data Protection with Microsoft Purview digital event on February 7, 2023, Alym Rayani, General Manager of Compliance and Privacy Marketing at Microsoft, and I will discuss Microsoft’s approach to data security, including how to create a defense-in-depth approach to protect your organization’s data. We’ll also introduce some groundbreaking innovations for our Microsoft Purview product line—such as Adaptive Protection for data powered by machine learning—and invite new customers to sign up for a free trial. We remain guided by our core belief that security is a team sport. So in this blog, I’ll address how our latest innovations can help your team keep your data safe while empowering productivity and collaboration. We’ll also look at steps you can take to build a layered data security defense within your organization.

A new approach for a new data landscape

We’ve all seen how the ongoing shift to a hybrid and multicloud environment is changing how organizations collaborate and access data. Considering the massive amounts of data generated and stored today, it’s easy to see how this creates a business liability. More than 80 percent of organizations rate theft or loss of personal data and intellectual property as high-impact insider risks.3 Often the risk stems from organizations making do with one-size-fits-all, content-centric data-protection policies that end up creating alert noise. This signal overload leaves admins scrambling as they manually adjust policy scope and triage alerts to identify critical risks. Fine-tuning broad, static policies can become a never-ending project that overwhelms security teams. What’s needed is a more adaptive solution to help organizations address the most critical risks dynamically, efficiently prioritizing their limited security resources on the highest risks and minimizing the impact of potential data security incidents.

Venn diagram showing how Adaptive Protection optimizes data protection automatically by balancing content-centric controls and people-centric context.

Adaptive Protection in Microsoft Purview is the solution. This new capability, now in preview, leverages Insider Risk Management machine learning to understand how users are interacting with data, identify risky activities that may result in data security incidents, then automatically tailor Data Loss Prevention (DLP) controls based on the risk detected. With Adaptive Protection, DLP policies become dynamic, ensuring that the most effective policy—such as blocking data sharing—is applied only to high-risk users, while low-risk users can maintain their productivity. The result: your security operations team is now more efficient and empowered to do more with less.

Adaptive Protection in action

Let’s take a look at how Adaptive Protection can benefit your organization in everyday use. Imagine there’s a company named Contoso where Rebecca and Chris work together on a confidential project. Rebecca and Chris both try to print a file related to that project. Rebecca gets a policy tip to educate her that the file contains confidential information and that she will need to provide a business justification before printing. But when Chris tries to print the file, he gets blocked outright by Contoso’s endpoint DLP policy. 

So, why do Rebecca and Chris have different experiences? The security team at Contoso uses Adaptive Protection, which detected that Chris has a privileged admin role at Contoso, and he had previously taken a series of exfiltration actions that may result in potential data security incidents. As Chris’s risk level increased, a stricter DLP policy was automatically applied to him to help mitigate those risks and minimize potential negative data security impacts early on. On the other hand, Rebecca has only a moderate risk level, so Adaptive Protection can educate her on proper data-handling practices while not blocking her ability to collaborate. This also influences positive behavior changes and reduces organizational data risks. For both Rebecca and Chris, the policy controls constantly adjust. In this way, when a user’s risk level changes, an appropriate policy is dynamically applied to match the new risk level.
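
To make the mechanics concrete, here is a minimal Python sketch of the adaptive idea described above: the detected risk level, not the content alone, selects the enforcement action. This is purely illustrative; the risk levels and action names are hypothetical stand-ins, not Purview APIs.

```python
from enum import Enum

class RiskLevel(Enum):
    LOW = 1
    MODERATE = 2
    ELEVATED = 3

# Hypothetical mapping: stricter DLP enforcement applies as detected risk rises.
POLICY_BY_RISK = {
    RiskLevel.LOW: "allow with audit",                          # no user friction
    RiskLevel.MODERATE: "policy tip + business justification",  # Rebecca's experience
    RiskLevel.ELEVATED: "block",                                # Chris's experience
}

def dlp_action(user_risk: RiskLevel, activity: str) -> str:
    """Return the DLP enforcement for a sensitive-data activity such as 'print'."""
    return POLICY_BY_RISK[user_risk]

# As a user's risk level changes, the applied control changes with it:
print(dlp_action(RiskLevel.MODERATE, "print"))  # policy tip + business justification
print(dlp_action(RiskLevel.ELEVATED, "print"))  # block
```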

With Adaptive Protection, Contoso’s security team no longer needs to spend time painstakingly adding or removing users based on events, such as an employee leaving or working on a confidential project, to prevent data breaches. In this way, Adaptive Protection not only helps reduce the security team’s workload, but also makes DLP more effective by optimizing the policies continuously.

Chart showing how Adaptive Protection applies Data Loss Prevention policies dynamically based on users’ risk levels detected by Insider Risk Management.

Adaptive Protection in Microsoft Purview integrates the breadth of intelligence in Insider Risk Management with the depth of protection in DLP, empowering security teams to focus on building strategic data security initiatives and maturing their data security programs. Machine learning enables Adaptive Protection controls to automatically respond, so your organization can protect more (with less) while still maintaining workplace productivity. You can learn more about Adaptive Protection and watch the demo in this Microsoft Mechanics video.

Fortify your data security with a multilayered, cloud-scale approach

As I speak with customers, I continue to hear about their difficulties in managing a patchwork of data-governance solutions across a multicloud and multiplatform environment. Today’s hybrid workspaces require data to be accessed from a plethora of devices, apps, and services from around the world. With so many platforms and access points, it’s more critical than ever to have strong protections against data theft and leakage. For today’s environment, a defense-in-depth approach offers the best protection to fortify your data security. There are five components to this strategy, all of which can be enacted in whatever order suits your organization’s unique needs and possible regulatory requirements.

  1. Identify the data landscape: Before you can protect your sensitive data, you need to discover where it lives and how it’s accessed. That requires a solution that provides complete visibility into your entire data estate, whether on-premises, hybrid, or multicloud. Microsoft Purview offers a single pane of glass to view and manage your entire data estate from one place. As a unified solution, Microsoft Purview empowers you to easily create a holistic, up-to-date map of your data landscape with automated data discovery, sensitive data classification, and end-to-end data lineage. Now in preview are more than 300 new, ready-to-use trainable classifiers for source code discovery, along with 23 new pre-trained out-of-the-box trainable classifiers that cover core business categories, such as finance, operations, human resources, and more.
  2. Protect sensitive data: Along with creating a holistic map, you’ll need to protect your data—both at rest and in transit. That’s where accurately labeling and classifying your data comes into play, so you can gain insights into how it’s being accessed, stored, and shared. Accurately tracking data will help prevent it from falling prey to leaks and breaches. Microsoft Purview Information Protection includes built-in labeling and data protection for Microsoft 365 apps and other Microsoft services, including sensitivity labels for Outlook appointments, invites, and Microsoft Teams chats. Microsoft Purview Information Protection also empowers users to apply customized protection policies, such as rights management, encryption, and more. (A toy discovery-and-labeling sketch follows this list.)
  3. Manage risks: Even when your data is mapped and labeled appropriately, you’ll need to take into account user context around the data and activities that may result in potential data security incidents. As I noted earlier, internal threats accounted for almost 35 percent of unauthorized access breaches during the third quarter of 2022.2 The best approach to addressing insider risk is a holistic approach bringing together the right people, processes, training, and tools. Microsoft Purview Insider Risk Management leverages built-in machine learning models to help detect the most critical risks and provides enriched investigation tools to accelerate time to respond to potential data security incidents, such as data leaks and data theft. Recent updates include sequence detection starting with downloads from third-party sites and a new trend chart to show a user’s cumulative data exfiltration activities. And to help reduce noise and ensure safe and compliant communications, we’ve added a policy condition to exclude email blasts (such as bulk newsletters) from Microsoft Purview Communication Compliance policies.
  4. Prevent data loss: This includes unauthorized use of data. More than 85 percent of organizations do not feel confident they can detect and prevent the loss of sensitive data.4 An effective data loss protection solution needs to balance protection and productivity. It’s critical to ensure the proper access controls are in place and policies are set to prevent actions like improperly saving, storing, or printing sensitive data. Microsoft Purview Data Loss Prevention offers native, built-in protection against unauthorized data sharing, along with monitoring the use of sensitive data on endpoints, apps, and services. DLP controls can be extended to macOS endpoints, non-Microsoft apps through Microsoft Defender for Cloud Apps, and to Google Chrome, providing comprehensive coverage across customers’ environments. We now also support, in preview, DLP controls in Firefox with the Microsoft Purview Extension for Firefox. And now with the general availability of the Microsoft Purview Data Loss Prevention migration assistant, you’re able to automatically detect your current policy configurations and create equivalent policies with minimal effort.
  5. Govern the data lifecycle: As data governance shifts toward business teams becoming stewards of their own data, it’s important that organizations create a unified approach across the enterprise. This kind of proactive lifecycle management leads to better data security and helps ensure that data is responsibly democratized for the user, where it can drive business value. Microsoft Purview Data Lifecycle Management can help accomplish this by providing a unified data-governance service that simplifies the management of your on-premises, multicloud, and software as a service (SaaS) data. Now in preview, simulation mode for retention labels will help you test and fine-tune automatic labeling before broad deployment.
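
Picking up on steps 1 and 2 above, the toy Python sketch below shows the bare idea of discovering sensitive content and assigning it a sensitivity label. The patterns and label names are invented for illustration; Purview's trainable classifiers are machine-learning models, not simple regexes.

```python
import re

# Toy patterns standing in for real classifiers (illustrative only).
PATTERNS = {
    "Credit Card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "US SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def label_document(text: str) -> tuple[str, list[str]]:
    """Step 1: discover sensitive content; step 2: assign a sensitivity label."""
    hits = [name for name, rx in PATTERNS.items() if rx.search(text)]
    label = "Highly Confidential" if hits else "General"
    return label, hits

label, hits = label_document("Payment card 4111 1111 1111 1111 on file.")
print(label, hits)  # Highly Confidential ['Credit Card']
```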

And lastly, we’re making it easier for you to assess and monitor your compliance posture with integration between Microsoft Purview Compliance Manager and Microsoft Defender for Cloud. This new integration enables your security operations center to ingest any assessment in Defender for Cloud, simplifying your work by bringing together multiple services in a single pane of glass.

Data protection that keeps you moving forward fearlessly

Data is the oxygen of digital transformation. And in the same way that oxygen both sustains life and feeds a fire, each organization must strike a balance between ready access to data and securing its combustible elements. At Microsoft, we don’t believe your business should have to sacrifice productivity for greater data protection. This is where Adaptive Protection in Microsoft Purview excels—empowering your security operations center to efficiently safeguard sensitive data with the power of machine learning and cloud technology—without interfering with business processes. If you’re not already a Microsoft Purview customer, be sure to sign up for a free trial.

Mark your calendar for Microsoft Secure on March 28, 2023, where you’ll hear about even more Microsoft Purview innovations. This new digital event will bring together customers, partners, and the defender community to learn and share comprehensive strategies across security, compliance, identity, management, and privacy. We’ll cover important topics such as the threat landscape, how Microsoft defends itself and its customers, the challenges security teams face daily, and the future of security innovation. Register now.

Learn more

To learn more about Microsoft Security solutions, visit our website. Bookmark the Security blog to keep up with our expert coverage on security matters. Also, follow us on LinkedIn (Microsoft Security) and Twitter (@MSFTSecurity) for the latest news and updates on cybersecurity.


1. How Much Data Is Created Every Day in 2022?, Jacquelyn Bulao, January 26, 2023.

2. Insider threat peaks to highest level in Q3 2022, Maria Henriquez, November 2022.

3. Build a Holistic Insider Risk Management Program, Microsoft, October 2022.

4. 2021 Verizon Data Breach Report, Verizon, 2021.

Windows 365 and the Coming Abyss in the PC Market

Personal computer sales recently began to slow down to levels not seen since early 2020, before the pandemic turbocharged the PC space.

By all estimates, PC manufacturers are battening down the hatches for yet another challenging year after shipments fell markedly in 2022.

The pandemic-fueled sugar high of PC shipments that the industry saw in 2020 and 2021 began dissipating last year. Most industry analysts are not projecting a complete recovery until 2024.

The PC Boom Is Over

To wrap some numbers around the severity of this decline, Gartner reported worldwide PC shipments fell almost 29% in Q4 of 2022, the most significant drop in nearly three decades.

Enterprise buyers, always a good barometer of the PC industry’s health, signaled the slowdown in Q3 2022 by extending PC lifecycles and postponing purchases. Gartner also reported that OEM PC shipments were somewhere in the 65.3 million to 67.2 million range in Q4 2022, with worldwide sales in 2022 coming in at 286.2 million, a 16% decline from 2021.

The PC market surge in 2020 and 2021 was primarily the function of a world that had to quickly adjust to working from home and remote schooling behavioral changes. Businesses and consumers reacted by gobbling up PCs (particularly laptops) at a brisk pace not seen in years. Supply chain issues caused by the pandemic led to long lead times for PCs and critical videoconferencing peripherals like webcams.

Because pre-pandemic conditions have returned for the most part, demand has chilled despite average selling prices declining as PC OEMs resorted to the age-old tactic of discounts and promo offers to clear a glut of high inventory levels.

In short, the salad days of PC market growth appear to be over for the next couple of years.

Windows 365 May Turn the PC Market on Its Head

Even as the major PC OEMs trim expenses in anticipation of another challenging year, a strong headwind that could retard the PC industry’s return to growth is Microsoft’s Windows 365. It’s not getting nearly the level of attention I believe it merits.

Still, given the impact of persistent inflation on IT and perhaps even consumer spending, Windows 365 may get much more appealing. The revenue impact of this could be devastating for traditional PC makers.

Arguably, Windows 365 is the future of the PC for several reasons.

First, this new cloud-based capability from Microsoft offers a comprehensive suite of productivity tools critical for businesses and individuals. These tools include Office 365, which provides access to popular applications such as Outlook, Word, Excel, and PowerPoint, and cloud storage services like OneDrive.

This attribute means that users can work from anywhere, on any device, and have access to their documents, files, and presentations no matter where they are.

Another compelling reason why Windows 365 is attractive to enterprise and business users is that it is constantly evolving. Unlike software packages that require a large and expensive IT staff to manage, the platform is updated regularly with new features, bug fixes, and security enhancements, ensuring that users always have access to the latest tools and technologies.

This feature allows users to stay ahead of the curve on perpetual updates and instead focus on their productivity requirements.

In addition to its productivity tools, Windows 365 also provides robust security features that have numerous advantages over a traditional hardware device model.

With advanced threat protection, device management, and data protection that can be managed and controlled in the cloud, sensitive data can be secured and protected more consistently. This need is vital for CIOs and CSOs who struggle with updating large fleets of laptops with the latest security fixes and patches and application management.

Finally, Windows 365 has the potential to become more cost-effective than purchasing traditional PCs. With a subscription-based model, users only pay for what they use and can quickly scale up or down as their needs change. This element makes Windows 365 an ideal solution for businesses of all sizes — and individuals — allowing them to manage their costs and stay within budget.

With average wireless broadband speeds now climbing north of 200 Mbps in homes, offices, and even in public settings as fiber and 5G become more pervasive, the burden of computing speed shifts to the cloud and away from the local device itself. This consequence has enormous hardware cost savings implications.

Chromebooks Still Fall Short for Enterprise Users

The idea of running an operating system in the cloud is not new. Google has been doing this for years with Chromebook and its ChromeOS. First announced in 2011, a Chromebook relies heavily on web applications using the Google Chrome browser.

Chromebooks can work offline and utilize applications like Gmail, Google Calendar, and Google Drive to synchronize data when connected to the internet.

While Chromebooks have enjoyed significant success with educational institutions due to their low cost, their penetration into the enterprise space has been lackluster. The enormous dependence that corporate users have on using Windows-based applications has been a major hurdle for Chromebook to overcome, despite its significantly lower cost versus a traditional Windows-based laptop.

The Chromebook 3110 Laptop is designed for the education market. (Image Credit: Dell)


Ironically, the brutal truth is that most users depend on Microsoft 365 (Word/Outlook/Excel/PowerPoint) for word processing, email, spreadsheet modeling, and presentation needs. Google Workspace may be a great collaborative workplace platform, but many businesses (and even schools) have standardized on Microsoft 365 and OneDrive.

To bridge this gap, Microsoft and Google have joined forces to make Microsoft 365 a premium experience on ChromeOS, with full OneDrive integration for the Files app in the cards. Google recently posted an update that will bring this functionality to Chromebooks over the next several months.

However, for most enterprise users who require access to Windows-based apps where significant dollars have been invested over decades, this remains a “close but no cigar” situation, with Chromebooks remaining a non-starter for many large enterprise customers.

Implications for PC Makers

Not too long ago, PC manufacturers could rely on Microsoft to release a new version of Windows to spur an upsurge in PC sales. As a marketing executive with Compaq who attended the famous Windows 95 announcement in Redmond nearly 30 years ago, I witnessed the excitement — and even hysteria — that Microsoft was able to conjure up.

While Apple remains one of a tiny number of companies that can still summon that level of excitement on a massive scale, Microsoft’s ability to replicate that level of user enthusiasm is now in the rear-view mirror. I don’t mean that derisively. Microsoft is a much different and larger company today than in 1995, and it remains one of the world’s great financial successes from a revenue and profitability standpoint.

But now, Microsoft realizes that connectivity improvements from a speed and latency standpoint over the past 20 years have made a “Windows in the cloud” scenario plausible and likely.

One could debate whether a cloud-based Windows experience would be sufficiently satisfying for all but the most demanding PC users, e.g., video content professionals and gamers.

Still, it’s hard to repudiate that the cloud is the ultimate end game, and it’s just a matter of time. Also, we cannot dismiss that the hybrid work phenomenon has played a role in making “Windows in the cloud” more appealing to business users.

Justifying the Cost of Windows 365

Simply stated, Windows 365 abolishes the need to buy a new PC.

While cost will always be a consideration for IT departments, Microsoft’s monthly subscription price for businesses spans from $32 to $162 per user, depending on the number of processor cores, memory, and storage capacity.

Companies looking to get out of the hardware management business might find Windows 365 very tempting. Of course, you’ll still need a lightweight hardware computing client, but it most likely won’t be a $1,500 laptop.
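
As a back-of-the-envelope illustration of that cash-outlay argument, the short calculation below compares a three-year outlay at the low-end subscription price cited above against the $1,500 laptop figure. The thin-client price is an assumption for the sake of the example, not a quoted number.

```python
# All figures are assumptions drawn from the article's examples.
months = 36                # three-year refresh cycle
cloud_pc_monthly = 32      # low-end Windows 365 business tier cited above
thin_client = 400          # assumed lightweight endpoint cost
laptop = 1500              # traditional business laptop cited above

cloud_total = cloud_pc_monthly * months + thin_client
print(f"Cloud PC three-year outlay: ${cloud_total:,}")  # $1,552
print(f"Laptop three-year outlay:   ${laptop:,}")       # $1,500

# The thin-client price at which the two routes cost the same:
print(f"Break-even client price: ${laptop - cloud_pc_monthly * months:,}")  # $348
```

At these assumed numbers the two routes come out roughly even over three years, which is why the price point of the lightweight client matters so much to the economics.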

Assuming that even a modest adoption of Windows 365 occurs over the next two or three years, the reduction in PC average selling prices (ASPs) will shake the rafters in the offices of every major PC manufacturer around the world — especially in Palo Alto/Houston, Austin, and Raleigh, homes of the three largest PC OEMs with a combined market share of more than 60%.

Even if only 20% of the PC market transitions to a Windows 365 business model over the next few years, the resulting dramatically lower ASPs could mean billions of dollars in lost revenue. That’s serious stuff.

I’m aware of the enormous internal resistance that a PC OEM must face to confront this reality. Some OEMs will espouse a view that the Windows 365 offering will not be a fait accompli and that the ASP decline can be gracefully managed.

I’m dubious that scenario will play out. Assuming the user experience with Windows 365 is reasonable and doesn’t impact overall productivity, it will only take a few companies messaging to the world how beneficial a cloud-based Windows approach is from a support, security, and image management standpoint.

Potential Impact on PC ASPs

From my perspective, the wiser PC OEMs will try to get ahead of this. Yes, there will be ASP declines, but there will also be an opportunity to define the experience consistent with the company’s brand, value proposition, and security offerings that distinguish it from competitors.

But let’s be clear: the anticipated declines in PC OEM ASPs as Windows 365 gains traction will probably not hit Chromebook-like levels because enterprise customers will value 4K displays, 5G capability, and other “care abouts” as the bill of materials emphasis shifts away from legacy processor players like AMD and Intel.

Notably, Qualcomm could benefit because of its line of low-cost Snapdragon SOCs that are tailor-made for Windows 365-class “thin” clients.

In addition, while PC clients may become more lightweight from a local processor, storage, and even graphics standpoint, those PC OEMs with world-class industrial design capability will have a leg up. These clients may have significantly lower price points, but end users won’t forego great-looking designs.

While this conversation has focused primarily on enterprise and corporate businesses that tend to purchase PCs in fleets, the consumer market may be next on Microsoft’s agenda. Some media reports state that Microsoft might offer Windows 12 to consumers via a low-cost Windows 365 subscription.

If anything, Windows 365 will presumably extend the PC’s traditional hardware lifecycle beyond the current two to three years that most businesses use before replacing a conventional PC. If these new lower-priced PC clients facilitate significantly longer lifecycles, the impact on PC market revenue could be enormous.

Indeed, 2023 may be the year where we see this transition begin to occur. While the financial pain of lower revenues will likely be unavoidable, the brave PC OEMs who get in front of this freight train have the best chance to exploit the opportunity and reinvent themselves.

Final Thoughts

In the late 1990s, desktop PC pricing started to decline precipitously due to strong market competitiveness, sending most PC OEMs into a tizzy as ASPs began to creep under the then-unheard-of $1,000 price band.

Keep in mind that Microsoft’s current Windows 365 pricing for business customers is still expensive when considering that it doesn’t include any hardware, only the license to use Windows 365, some productivity software, and product support.

It remains to be seen what price point PC OEMs will have to offer for a laptop client to make a Windows 365 monthly subscription viable from a cash outlay standpoint versus a traditional PC purchase that includes Windows 11.

Compaq dealt with this challenge by offshoring its PC manufacturing, designing PCs with lower-cost components, procuring aggressively, and moving to a “just in time” inventory model that helped maximize margins.

Despite the tumult this caused at Compaq, the overall mantra was that while no company likes it when a market gets cannibalized, a company is much better off when it cannibalizes itself rather than letting competitors do it.

That might be the most potent lesson to embrace for the leading PC OEMs from the late 1990s.

An Open Approach To Hybrid Data Clouds

In a recent episode of the Six Five Insider podcast, we spoke with Ram Venkatesh, CTO of Cloudera. Our discussion focused on hybrid multi-cloud deployments and associated data management challenges.

As many of you know, it can get tricky to stitch together analytic functions across multiple clouds; this is the sweet spot Cloudera addresses with hybrid data clouds. Hybrid data cloud technology is critical to seamlessly move data and workloads back and forth between on-premises infrastructure and public clouds, handling both data at rest and data in motion.

The unique capability that Cloudera brings to the market is firmly grounded in Cloudera's open-source approach. In Venkatesh’s words, “We ensure that any API or file format or engine conforms to an open standard with a community.” In this article, we’ll delve into how an open-source approach has enabled Cloudera to deliver data services with complete portability across all clouds.

Apache Iceberg: Build an open lakehouse anywhere

Apache Iceberg started life at Netflix to solve issues with sprawling, petabyte-scale tables; it was then donated by Netflix to the open-source community in 2018 as an Apache Incubator project. Cloudera has been pivotal to the expanding Apache Iceberg industry standard, a high-performance format for huge analytic tables.

Those conversant with traditional structured query language (SQL) will immediately recognize the Iceberg table format, which enables multiple applications such as Hive, Impala, Spark, Trino, Flink and Presto to work simultaneously on the same data. It also tracks the state of dataset evolution and other changes over time.

Iceberg is a core element of the Cloudera Data Platform (CDP). Iceberg enables users to build an open data lakehouse architecture to deliver multi-function analytics over large datasets of both streaming and stored data. It does this in a cloud-native object store that functions both on-premises and across multiple clouds.

By optimizing the various CDP data services, including Cloudera Data Warehousing (CDW), Cloudera Data Engineering (CDE) and Cloudera Machine Learning (CML), users can define and manipulate datasets with SQL commands. Users can also build complex data pipelines using features like time travel, and deploy machine learning (ML) models made from data in Iceberg tables.
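
To give a flavor of how that looks in practice, here is a minimal PySpark sketch that creates an Iceberg table and queries it with time travel. It assumes a Spark session already configured with the Iceberg runtime and a catalog named "demo"; the database, table, snapshot ID, and timestamp are placeholders, and the exact time-travel syntax varies slightly across Spark and Iceberg versions.

```python
from pyspark.sql import SparkSession

# Assumes the Iceberg runtime jar and a catalog named "demo" are configured.
spark = SparkSession.builder.appName("iceberg-demo").getOrCreate()

# Create an Iceberg table and write to it with plain SQL.
spark.sql("CREATE TABLE IF NOT EXISTS demo.db.events (id BIGINT, ts TIMESTAMP) USING iceberg")
spark.sql("INSERT INTO demo.db.events VALUES (1, current_timestamp())")

# Time travel: query the table as of an earlier snapshot or point in time.
spark.sql("SELECT * FROM demo.db.events VERSION AS OF 123456789").show()      # placeholder snapshot ID
spark.sql("SELECT * FROM demo.db.events TIMESTAMP AS OF '2023-01-01 00:00:00'").show()
```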

Through contributions back to the open-source community, Cloudera has extended support for Hive and Impala, achieving a data architecture for multi-function analytics to handle everything from large-scale data engineering workloads to fast BI and querying as well as ML.

Cloudera has integrated Iceberg into CDP’s Shared Data Experience (SDX) layer, so the productivity and performance benefits of the open table format arrive right out of the box. Also, the Iceberg native integration benefits from various enterprise-grade features of SDX such as data lineage, audit and security functionality.

Cloudera ensures that organizations can build an open lakehouse anywhere, on any public cloud or on-premises. Even better, the open approach guarantees freedom to choose the preferred analytics tool with no lock-in.

Apache Ranger: Policy administration across the hybrid estate

Apache Ranger is a software framework that enables, monitors and manages comprehensive data security across the CDP platform. It is the tool for creating and managing policies to access data and services in the CDP stack. Security administrators can define security policies at the database, table, column and file levels and administer permissions for specific groups or individuals.

Ranger manages the whole process of user authentication and access rights for data resources. For example, a particular user might be allowed to create a policy and view reports but not allowed to edit users and groups.
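
For illustration, a policy like the one just described can also be created programmatically through Ranger's public REST API. The Python sketch below is a minimal example of that; the host, service name, policy details, user, and credentials are all placeholders.

```python
import requests

# Hypothetical example: create a column-level allow policy via Ranger's REST API.
policy = {
    "service": "cm_hive",                      # Hive service registered in Ranger
    "name": "finance_readonly",
    "resources": {
        "database": {"values": ["finance"]},
        "table": {"values": ["transactions"]},
        "column": {"values": ["amount", "region"]},
    },
    "policyItems": [{
        "users": ["rebecca"],
        "accesses": [{"type": "select", "isAllowed": True}],  # read-only access
    }],
}

resp = requests.post(
    "https://ranger.example.com:6182/service/public/v2/api/policy",
    json=policy,
    auth=("admin", "changeme"),  # placeholder credentials
)
resp.raise_for_status()
print("Created policy:", resp.json().get("id"))
```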

Apache Atlas: Metadata management and governance

Apache Atlas is a metadata management and governance system used to help find, organize and manage data assets. Essentially, it functions as the traffic cop within a data architecture. By creating metadata representations of objects and operations within the data lake, Atlas allows users to understand why models deliver specific results, going all the way back to the origin of the source data.

Using the metadata content it collects, Atlas builds relationships among data assets. When Atlas receives query information, it notes the input and output of the query and generates a lineage map that traces how data is used and transformed over time. This visualization of data transformations allows governance teams to quickly identify a data source and understand the impact of data and schema changes.
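
As a rough sketch of how that lineage can be consumed programmatically, the snippet below calls Atlas's v2 lineage REST endpoint and walks the returned graph edges. The host, entity GUID, and credentials are placeholders.

```python
import requests

# Illustrative call: direction=BOTH fetches upstream and downstream lineage.
guid = "0b9d0a8e-0000-0000-0000-000000000000"  # hypothetical entity GUID
resp = requests.get(
    f"https://atlas.example.com:21000/api/atlas/v2/lineage/{guid}",
    params={"direction": "BOTH", "depth": 3},
    auth=("admin", "changeme"),  # placeholder credentials
)
resp.raise_for_status()

# Each relation is an edge in the lineage graph between two entities.
for edge in resp.json().get("relations", []):
    print(edge["fromEntityId"], "->", edge["toEntityId"])
```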

Apache Ozone: Open-source answer for dense storage on-premises

Separating compute and data resources in the cloud provides many advantages for a CDP deployment. It presents more options for allotting computational and storage resources and allows for server clusters to be shut down to avoid unnecessary compute expense while leaving the data available for use by other applications. Additionally, resource-intensive workloads can be isolated on dedicated compute clusters separated for different workloads.

For these advantages to be consistent everywhere, including on-premises, CDP Private Cloud, the on-premises version of CDP, uses Apache Ozone to separate storage from compute. Apache Ozone is a distributed, scalable, high-performance on-premises object store that supports the same interaction model as AWS S3, Microsoft Azure Data Lake Storage (ADLS) or Google Cloud Storage (GCS).
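
Because Ozone speaks the S3 protocol, existing S3 tooling can usually be pointed at it unchanged. Below is a minimal, hypothetical boto3 example; the gateway host, credentials, and bucket name are placeholders, and 9878 is Ozone's default S3 gateway port.

```python
import boto3

# Standard S3 tooling works against Ozone by swapping the endpoint URL.
s3 = boto3.client(
    "s3",
    endpoint_url="http://ozone-s3g.example.com:9878",  # Ozone S3 gateway
    aws_access_key_id="placeholder",
    aws_secret_access_key="placeholder",
)

s3.create_bucket(Bucket="analytics")
s3.put_object(Bucket="analytics", Key="raw/events.json", Body=b'{"id": 1}')
print([obj["Key"] for obj in s3.list_objects_v2(Bucket="analytics")["Contents"]])
```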

Wrapping up

Cloudera is the standard-bearer for the industrialization of open-source data management and analytics innovation, which I believe is a winning strategy. History has taught us that enterprises vote with their dollars and are unlikely to reward closed or proprietary platforms or those built by a single vendor without a broad ecosystem.

Cloudera is one of twenty vendors in the crowded cloud database management systems (CDMS) marketplace. Selecting the vendor for your specific needs can be a daunting task. The vendor's approach to openness must be a critical factor in your selection because, in any enterprise deployment, data will originate from many locations and must work with both source and destination systems in a much more open way. Any software you use needs to be built with that in mind.

In this context, the Cloudera strategy to harness multiple open-source systems to deliver hybrid multi-cloud solutions and offer the most choice to customers is bound to enjoy a continuous advantage in terms of innovation and interoperability.

The biggest enterprises with large amounts of data see Cloudera as the right company to manage that end-to-end data on-premises or in the public cloud—or even collecting data that comes through a SaaS application. Cloudera is doing an excellent job of pulling it all together as a one-stop shop for large-scale data management.


Gatsby, Netlify, and the gravitational pull of general-purpose platforms

It’s been over a week since Netlify purchased Gatsby on February 1, 2023, “to accelerate adoption of composable web architectures,” and all the buzzwords are finally falling into place. Gone is any mention of the Jamstack (though you can still find its community humming along). Gone are the comments about how the two compete. Today it’s all love and roses between the two companies because they’ve jointly understood what all enterprise upstarts eventually realize:

As much as enterprise decision-makers may say they want best of breed, what they buy is general-purpose, one-size-fits-many solutions. Developers may want sexy and cool, but CTOs want safe and sustainable. The enterprise has ever been thus.

An end to entropy

I’ve addressed this topic of “general purpose” before and how it affects developer decisions around databases, but it goes well beyond databases. As much as individual developers may want to fiddle with their preferred Jamstack (JavaScript, API, Markup) web framework, of which there are many—the most recent Jamstack community survey lists 29—in reality, the front-end development community has for years been coalescing around just a few: React, NextJS, etc. That left little room for a GatsbyJS, however impressive it might be.

And it’ll need to do more than stand out—or, rather, to stand firm: firm against the enterprise’s need for conformity, for sustainability. For order. There’s a reason enterprise technology invariably settles on just a handful of vendors in any given category. It’s not that these necessarily offer the best technology, but rather that they provide the best technology experience for both enterprise developers and enterprise decision-makers.

Small wonder, then, that we’re seeing the highly fragmented web development world start to rein in entropy.

Sam Bhagwat, Gatsby cofounder and chief strategy officer, observes, “The last 10 years we spent out building all the primitives, and then the next 10 years—or however long it takes—we’re going to spend combining them in ways that make the development process, the page-building process, the site-building process easier to use for everyone.”


What Investors Need to Know About the Latest News From Microsoft

In this podcast, Motley Fool analyst Dylan Lewis and Motley Fool Chief Investment Officer Andy Cross discuss:

  • Microsoft's quarter and lower growth expectations for its cloud business.
  • What long-term investors can expect from Microsoft.
  • The Department of Justice's suit against Alphabet, and a shifting regulatory environment.
  • Kimberly-Clark's "less than stellar" quarter.

Plus, Motley Fool Canada's Jim Gillies joins Motley Fool producer Ricky Mulvey to deliver the bull case for one of the most heavily shorted stocks of 2022: Big Lots.

To catch full episodes of all The Motley Fool's free podcasts, check out our podcast center. To get started investing, check out our quick-start guide to investing in stocks. A full transcript follows.


This video was recorded on Jan. 25, 2023.

Dylan Lewis: All eyes on the cloud at Microsoft and Alphabet gets attention from the DOJ. Motley Fool Money starts now. Sitting in for Chris Hill, I'm Dylan Lewis and I'm joined by the Motley Fool's Chief Investment Officer, Andy Cross. Andy, how is it going?

Andy Cross: Dylan, good to see you on the start of earnings here.

Dylan Lewis: I know, it's exciting. We've got some new updates on some of the companies we follow, and I think few companies are as heavily followed as Microsoft, one of the first Big Tech companies to come out with earnings and one of those companies you own, whether or not you know that you own it, right, Andy?

Andy Cross: Yeah. It's so widely owned, near two trillion dollars market cap and so instrumental into so many parts of the world, and certainly has been a big contributor to the massive growth of the S&P 500 and the Nasdaq-100 over the last decade.

Dylan Lewis: A lot of bated breath before they reported earnings yesterday. I think surface level, when you look at the numbers, things look pretty strong to me.

Andy Cross: It was a strong quarter on the real growth part of their engines, Dylan, or at least I'll say maybe not as terrible as people were panicking about on the cloud side. I think that's been the big headline. Really continued weakness on the personal computing side. That's where you've really seen a lot of the weakness, but the cloud business continued to shape up with a pretty decent quarter. Of course, the guidance was a little bit weak and that's what really pressured the stock today, I think. The tone of the call, Dylan, was really interesting because I think both Amy Hood, the CFO, and Satya Nadella, the well-respected CEO, have been talking so much about how many inroads they've been making on the cloud and the productivity business and how well it's been received, but the tone of this call was much more muted.

Satya Nadella kicked it off by pretty much saying that just as we saw customers accelerate their digital spend during the pandemic, we are now seeing them optimize that spend. Optimize, a euphemism there. Also, organizations are exercising caution given the macroeconomic uncertainty, and then Amy Hood went on to further echo that throughout the call. If you look, the revenues are up about two percent. Earnings per share were down a little bit, both roughly matching expectations. The Productivity and Business Processes segment was up seven percent at 17 billion. The Intelligent Cloud was up 18 percent and the personal computing, as I mentioned, was the really weak spot, down 19 percent.

Both of those are not on constant currency, the strong dollar continues to have an impact. The guidance, though, Dylan, was what I think really what people were focused on and that's on the cloud side of the business. The expectations that the Azure business, their cloud business that has become such a dominant player in the cloud, I think now the second largest player, that grew 38 percent on a constant currency basis, but they are expecting that to fall by 4-5 percentage points over the next quarter and drop down probably closer to the 30 percent growth level. So much of the focus for Microsoft has been on that cloud growth. That weakness has really gotten investors I think a little spooked on what it means, not just from Microsoft, but also the wider tech and mega tech space.

Dylan Lewis: Yeah. On Microsoft earnings, we're also seeing maybe a little pessimism when it comes to shares of companies like Amazon and Alphabet as well, with Microsoft being a little bit of a bellwether for cloud trends. When you're thinking about those businesses, these segments have really bolstered the financials and helped them offset some flagging segments and other parts of their business. How concerned are you with what we're seeing in the cloud? At a certain point, big things do have to slow down in their growth rate. We're still talking about 30 percent growth. Is this something that people should be panicked about?

Andy Cross: Obviously, Microsoft, such a large company and it takes so huge investments to move the needle. They're trying to push through this Activision Blizzard acquisition sometime this year. The regulatory bodies are still fighting that, so the concerns that that won't go through. They have the other businesses outside of the cloud business tied to their Office 365, their LinkedIn business, the gaming business, the personal computing business, the Windows business, all those businesses that Microsoft has been known for so long and have contributed to the long-term growth story. But the Intelligent Cloud, the cloud business has become their largest business now.

Especially as they are making the big push into AI or artificial intelligence and the investments they've been making into OpenAI, and they talked about that recently, they've now made three investments into the OpenAI business and the platform or that entity, I should say. You want to see continued robust growth there because it does drive high-margin usage and engagement. As I think about the ecosystem to Microsoft, such an important part of their business is cloud. You want to see continued growth there, of course, as you mentioned not all things can grow to the sky. The Microsoft stories, as I see and I still like the stock for long-term investors, you have a little bit of a dividend yield.

The stock is probably a little bit on the higher price side, you're paying 25 times this year's earnings. They have a fiscal year that ends in June, so this year's earnings. There's healthy growth baked into that, looking out a year or two. So they have to deliver on that growth in an environment that is starting to see a little bit of a slowdown or more of a macro slowdowns we're seeing not just for Microsoft, but for some other companies. I think investors have to own Microsoft, still think about this as a long-term investment. Returns, this is not a business that's going to double in a couple of years. But for high single-digit kind of gains for patient investors over the next 3-5 years, I think Microsoft can probably deliver that depending on what kind of macro environment we run into over the next 12 months.

Dylan Lewis: We'll get a little bit more of an update on the cloud and what that market looks like when we see results from Alphabet and from Amazon. Speaking of Alphabet, we have some non-earnings news to hit. Yesterday, the US Department of Justice filed an antitrust suit against the tech giant, targeting the company's digital advertising products. Andy, I want to emphasize there, digital advertising products. This is not the first time DOJ has been reaching out to Alphabet in the last couple of years, and this is separate from some of the search concerns that they've had in the past.

Andy Cross: Oh my gosh, yes. The stock is down a few percentage points. With this news, the DOJ, the Department of Justice, is coming out and really attacking the underpinnings of Google's technology, especially the acquisition that they made years ago of DoubleClick for about three billion dollars and how they package it together. By the way, that deal was approved by regulators, and now there's some blustering conversation about splitting that apart and really saying that they've been anti-competitive, that their prowess in that technology in the ad market, in the ad exchange part of the Google world, which is the guts of really their ad business, and the practices they've undertaken have been anti-competitive. They are exploring that and investigating that, and that's just on the ad part you mentioned, separate from the search one.

Of course, Alphabet and Google came out and completely denied it. They said it reminds them and is akin to the lawsuit or the investigation that Attorney General in Texas kicked off in 2020 I think it was. This will go on for many, many years and discussions. It's always been a risk factor with owning Alphabet and Google, and I do and I continue to like the investment. I continue to think that it could be a buying weakness, and can be a buying weakness. This is a serious allegation and investigation that they are going to have to defend and the eight states that have joined this suit. There'll probably be others that hop on and whether they have acted inappropriately and illegally and been anti-competitive in a very competitive market, the ad space is very competitive and there's a lot of players involved there.

But how they have performed, we still don't know the exact details and the Department of Justice in their investigation is using some of that internal documents from Google and some of the emails and some language that I still think needs to be better defined and understood. But Big Tech regulation has been a focus of both this administration and the prior administration. This is an extension of that outreach to try to bring investigations and suits against Big Tech companies, if not forcing them to change practices or, at the extreme, perhaps split off businesses and break them up. But we'll have to see how those all plays out. We're in the very, very early innings of just this investigation. As you mentioned, there are some others going against Google.

Dylan Lewis: Yeah. If you're looking for a parallel on timeline there, I believe the search investigation and antitrust suit was filed in 2020 and is going to trial later this year. So this is the ultimate "you got to watch it play out" kind of thing. Andy, you mentioned that they acquired DoubleClick and this is really the digital tool that all of the web publishers are using to sell ads on their website back in 2008 for three billion dollars. To some extent, I look at this as a consequence of having something that was wildly successful and maybe a little bit of the regulatory environment changing, where the last 20 years of tech and maybe the next 10 years of tech look a little bit different in terms of how regulators approach Big Tech acquisitions.

Andy Cross: Well, it's probably right, Dylan. Google in so many ways or at least in some ways really created this market and has been instrumental, and I like The Trade Desk and I'm a fan of Jeff Green, and Jeff Green had an ad exchange business. I think he ended up selling it to Microsoft. Microsoft obviously is a large player now in the ad market. Amazon has come out of nowhere to basically take 10, 11, 12 percent of the market of digital advertising space. It's the mechanism in which Google has packaged together. They have DoubleClick technology with all the various parts to add bidding and the kind of bidding that Google likes to do that is different than maybe some competitors want to try to do and Google fighting that off.

There's lots of details to the mechanisms of how consumers see ads when they load up a webpage on the free Internet, or in some of the walled gardens like YouTube, for example, and the mechanism for ad clients to bid on that. So a lot of detail to go through this, but you're absolutely right. When they made this acquisition, they were a far smaller company. It was not as proven. Obviously, ad tech was still working through its growth spurt a little bit and has since become dominant in such an important part of the market today, and the regulatory environment is shifting. The way that regulators and Congress are talking about Big Tech and megatech, it starts to remind you a little bit more of the way they were talking about Microsoft in the late '90s.

Dylan Lewis: Before we wrap up today's show, there is news outside the world of Big Tech, I swear. We also saw earnings from Kimberly Clark. Perhaps most relevant to people who own the stock, Andy, the company announced its like clockwork update to its dividend program this year.

Andy Cross: Yeah. They raised it a little bit, Dylan. Kimberly Clark has been raising their dividend for, gosh, I don't know, 50 years or so. It's just one of those stalwarts that, every year, just continues to pay a dividend. Now it yields more than three percent, which in this market isn't what it used to be. Dylan, as you and I were talking about before we got on the air, your bank accounts or many online savings accounts yield close to four percent now. However, they have raised that dividend. It wasn't a very stellar quarter they reported. What was interesting is they continue to see pricing prowess, but volumes dropped and the guidance for the rest of the year was a little bit disappointing.

I think that puts some pressure on the stock. Kimberly Clark as a consumer staple, like many other consumer staples, have seen their stock prices bid up now, where they're selling at 23, 22, 25 times earnings for a consumer staple company that really grows less than GDP levels, pays a little dividend, uses a lot of leverage to continue to get some growth and makes acquisitions. That's a little bit of an expensive proposition to pay for a business that isn't going to grow a whole lot, even for a consistent dividend payer. I'll say my final comment. While they did raise that dividend, I think about two percent this quarter, Dylan, I think historically, they've been more between four and five percent.

The ratio of profits to what they pay that dividend out, called the payout ratio, that continues to creep up higher and higher. I think investors want to see that a little bit lower because that gives confidence they can raise the dividend over time. Again, Kimberly Clark, I think the business has been around since the 1870s in some shape or form. We all need Kleenex, and if you have kids, you need diapers and those kinds of things in a consumer staple. But at this level, investors are definitely paying up for some expectations of some kind of stable, single-digit kind of growth in the stock and in returns. If the market and the economy is souring, that could prove a little bit tough to meet those expectations.

Dylan Lewis: Earlier, we talked about Microsoft as a bellwether for tech. Is there anything you see in the report from Kimberly Clark that you think people should be paying attention to in the consumer packaged goods space or in retail?

Andy Cross: I think the pressure on growth on the volume side from a consumer growth perspective, I think we saw hints at this going into 2023, which is some of the slowdown in the retail side. I think the consumer is going to continue to be much more specific and particular about how they're spending their money, even on things like consumer staples and consumer goods with lots of different options out there and ways to spend that. That balance for companies, the balance between volumes and pricing and what that will look like going forward in an environment that maybe we don't have as much inflation over the next, say, year or two, at least compared to what we had last year. I think that's safe to say.

Whether it's going to be three percent, two percent, or five percent, it will be lower than what it was last year. How companies manage the balance between pricing and volumes, and what that means at scale when it comes to profits, matters for the organizations and for investors who are looking for some kind of profit growth over the next year or two. With Kimberly Clark, I think what we saw is that they are seeing pressure on volumes, and maybe some questions about what that means for pricing going forward and their ability to continue to drive profits. That's a bit of a takeaway, and I think we'll hear more and more about that balance from consumer goods companies this year.

Dylan Lewis: Andy Cross, excellent as always. Thank you so much for joining me.

Andy Cross: Thanks, Dylan. 

Dylan Lewis: We've got the bull case for one of the most heavily shorted stocks of 2022. Jim Gillies joins Ricky Mulvey to discuss a discount retailer with very low expectations. 

Ricky Mulvey: If investing is about expectations, then the bar is awfully low for Big Lots, a discount retailer with 1,400 locations across the United States. It's the type of store that has some essentials and a little bit of a treasure hunt: you can get a couch and a six-foot-tall nutcracker statue. Joining us now to talk about this retailer is Motley Fool Canada's Jim Gillies. It's a weird company, Jim.

Jim Gillies: It is. As we talk, I think we're going to find that we're not sure how this one is going to unfold, but it could be fun.

Ricky Mulvey: People want answers. The thing that's odd about this company is it was on the list of the most heavily shorted stocks of 2022. That includes Carvana, Bed Bath & Beyond, Silvergate Capital, which is an alleged bank that does cryptocurrency lending, and Big Lots. It's had inventory issues, it's had management missteps, but does Big Lots deserve to be in that club?

Jim Gillies: I'm going to say unequivocally no. Carvana and Bed Bath & Beyond... well, Bed Bath & Beyond, I call them alternately "Bloodbath and Beyond" or "Bed Bath & Beyond Hope," take your choice. Of the three companies you mentioned, at least two are bankruptcies waiting to happen; that would be Bed Bath & Beyond Hope and Carvana.

Silvergate Capital, we'll see. No, Big Lots is not going bankrupt anytime soon, probably not even further out. I'm not sure why it's been as heavily shorted as it has. I don't tend to spend a lot of time worrying about what the shorts are doing. I'm a fan of certain short sellers who do a lot of very good work, but among the names I'd consider short sellers who would make me sit up and take notice, I've not seen anything from any of that group talking about Big Lots. So no, it's fine.

Ricky Mulvey: Management has made a few missteps. The one that I have questioned with this company is that it keeps complaining about inventory challenges. This is a company, it's in the name, that's supposed to take advantage of other companies' inventory challenges. Why is Big Lots struggling with inventory when that's the promise of the store?

Jim Gillies: It is. Can I maybe take a bit of a disagreement with both you and with management? Is that copacetic and cool?

Ricky Mulvey: We'll see. No, of course.

Jim Gillies: They are a retailer, and retailers tend to have fiscal years ending at the end of January, so we are still in the middle of their Q4. Even though it's fiscal '22, it ends at the end of January 2023. Year to date, their cash used on inventory is only, and I say only, $107.5 million. The reason I say only is because in fiscal '21, the 12 months ending in January 2022, a year ago, they blew $337 million on inventory. I would suggest that the year where inventory caused the problems, created the problems, I suppose, is actually last year, and they're digging out of it right now. This is a year-to-date number.

Year to date, the big swing is in cash. They were free cash flow generative last year; in spite of their inventory issues, they actually made about $80 million, I think, in free cash flow for the year. The thing that has crushed them is that they didn't pay their bills. They took on inventory, but they just racked up their payables last year. This year, they've had to fight that; they've had to pay it back down. The cash outflow from working capital, I think, is a hangover from last year. When management starts blaming inventory, I say you're distracting us from maybe a little bit of your management sins, of which, of course, overbuying inventory is one.

Ricky Mulvey: It could also include buying back stock at $54 a share, which is not so great when you're trading at about $17 a share now, and also growing its long-term debt load from $40 million in 2021 to $460 million today.

Jim Gillies: Yeah. They basically put the last year, or at least the year to date, on the company credit card, and that sounds bad, and it should sound bad. Don't put your life on your credit card; you should cash-flow your life from your salary or whatever. If you're putting all your bills on a credit card, eventually that comes due, and it's generally fairly painful. But I think there's reason to think that Q4 is quite possibly going to be a cash flow positive quarter, number one. When that happens, they should be taking that debt down. It's just under $400 million in net debt, because they hold about $60-plus million in cash, so net debt is about $398 million. They halted their buybacks after Q1 of this fiscal year. Smart; they should, because they're burning capital in other places.

But this is a company that has, long term, actually been fairly good with buybacks. They've meaningfully reduced their shares outstanding from over 60 million about a decade ago to, I think, about 28-29 million today. Would you like those shares back at the price they were bought? Yeah, I'd like those back. But that's Monday-morning quarterbacking, too. One thing that's not really talked about, or not really understood, is that they did a big sale-and-leaseback transaction two and a half years ago, where they sold some distribution centers and then leased them back, and that freed up a lot of cash. That's where most of the cash that funded the buybacks came from. They do have a history of being profitable on a GAAP basis, and they do have a history of being cash flow positive.

Do we like what's happening now? No, we don't, and you shouldn't. But I think you can push through and say, long term, this has always been a company that people look at and go, "What is this supposed to be?" It's kind of a Dollarama, or a dollar store, or whatever your nearby dollar-store chain is, and in a weird mirror-universe way, a Costco: it's a treasure-hunt kind of store. So you go, "Well, what is this supposed to be?" I think you have to look long term, because if you do look quarter to quarter, yeah, fiscal '22 has been a tire fire, frankly. But, A, I think they can turn free cash flow positive, and they sounded confident on the most recent conference call. That confidence plus two bucks will buy you coffee down the street, but they at least sounded confident.

The second thing is what they've talked about. There was an activist investor here, and, full disclosure, I own a little bit of Big Lots myself. I've taken a small bet to see what happens. I've recommended it in the service I run, Hidden Gems Canada; recommended it last April 1st. It's down about 50 percent, dividend-adjusted, since I recommended it. I guess recommending on April Fool's Day is timely, but I don't abandon stocks after nine months; I generally go two-plus years and then check in on the thesis. But there was an activist here. Now, the activist is out, because the activist hedged themselves out of this one. But the activist, Mill Road Capital, I believe their name was, was calling for another sale and leaseback of some fully owned assets, and management on the most recent conference call did indicate they were considering that.

My point in all of this, long-windedly, is that that debt load could basically be gone with a well-timed sale-leaseback transaction. The other thing, too, is that in the most recently completed quarter, they refinanced their credit line. They had a $600 million credit line beforehand, I believe due to mature in 2026, and they refinanced it with a $900 million credit line that is good through, I believe, 2027 or 2028. To go back to your original question about whether this is a bankruptcy candidate: you know what? When your lenders are, A, willing to refinance and, B, willing to give you more money, I think there are a lot of people around this taking a longer-term perspective that the market is currently not taking. Certainly the debt providers are taking that longer-term perspective, and their willingness to give you more money should speak well of the long-term opportunity here, in my opinion.

Ricky Mulvey: It's currently, I would say, priced for death at a 0.1 price-to-sales ratio. Big Lots is also paying a pretty heavy dividend relative to the stock price, a seven percent yield. You're also hearing the CFO on conference calls explain that they're going to preserve cash flow by lowering payroll. I don't know about you, but when I walk into a discount retailer, I don't often think, boy, does this place look overstaffed. I guess the question here is: is this a case where you'd actually like to see management cut the dividend a little bit to stay afloat?

Jim Gillies: It wouldn't shock me if they cut it, but, again, perhaps you can accuse me of Pollyannaishness and rose-colored glasses and all this: the dividend is only about eight and a half to nine million dollars a quarter. You need about $36 million in annual free cash flow just to cover that dividend. Like I said, the last fiscal year is when they really heavily spent on inventory, and I view this year as the year in which the inventory gets worked through and you have some issues. But last year they finished the year with about $80 million in free cash flow, which of course fully finances their dividend. If they do something similar in the almost completed, but of course not yet reported, holiday quarter, I'm not sure there's going to be a necessity to cut the dividend.

Again, with the refinancing, I would prefer that all cash coming in at this point that's not earmarked for the dividend, which, again, is eight and a half to nine million bucks a quarter, just goes to pay down the credit line to where they were four quarters ago. Again: hold off on the buybacks, take down the credit line, keep the dividend going, and do that sale-and-leaseback transaction. That's probably not a bad idea at this point, and, again, that was the main thrust of the activist Mill Road Capital's case. It's funny, you end up with real potential here. At about $16.50 today, I think that gives them an enterprise value just over $880 million. On a trailing basis they're at about $5.6 to 5.65 billion in sales, and looking out a few years, call it $5.7 to 5.8 billion.

If they return to a more normalized valuation, and they always trade at a low price-to-sales or low price-to-valuation ratio, if they take down their debt by about half over the next couple of years, return to cash generation, maintain the dividend, and get only a 0.25 times price-to-sales multiple, which, again, most of the time you'd hear that level and go, "No, that's ridiculous," but maintaining a four or five percent free cash flow margin, you're talking about an enterprise that will probably have about a $1.4 billion total value. If you're down to $200 million in debt at that point, you're looking at about a $40 to $45 stock price if you run the math.
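Here is that math as a quick sketch, using the ballpark figures Jim quotes; these are his estimates, not reported guidance:

```python
# Back-of-the-envelope valuation from Jim's ballpark numbers.
sales = 5.75e9         # ~$5.7-5.8 billion in sales a few years out
ps_multiple = 0.25     # 0.25x price-to-sales on enterprise value
net_debt = 200e6       # debt taken down to ~$200 million
shares = 28.5e6        # ~28-29 million shares outstanding

enterprise_value = sales * ps_multiple       # ~$1.44 billion
equity_value = enterprise_value - net_debt   # ~$1.24 billion
print(f"Implied share price: ${equity_value / shares:.0f}")  # ~$43, in the $40-45 range
```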

It's not a straight line, and it's not a simple bet. There are real problems here, and there are real possibilities that things get darker before the dawn, but here's a rough triple in two and a half to three years. That's not bad, in my view; not a bad weighted bet. I don't mean go out and put 10 percent of your net worth in this thing, because there is substantial downside risk here that isn't present with a T.J. Maxx, or a Costco, or a Walmart, or even an Ollie's Bargain Outlet or any dollar store of your choice. But if you put in half a percent or a percent, and I think my position is less than half a percent, so like I said, I took a small position, it's not a bad risk-reward proposition, in my opinion.

Ricky Mulvey: The hurdle is low. We'll see if this company is a wet cigar butt or if there's a little bit of spark left in it. Big shoutout to David Katunaric; he had a good Substack write-up on Big Lots as well. Jim Gillies, always good chatting with you. I always appreciate talking about weird little companies with you.

Jim Gillies: That's my stock-in-trade, man. 

Dylan Lewis: As always, people on the program may own stocks mentioned, and the Motley Fool may have formal recommendations for or against, so don't buy or sell anything based solely on what you hear. Until next time, Fool on.

Yext hops on the generative AI train with Yext Chat, an enterprise-focused chatbot

Looking to cash in on the generative AI craze, Yext, the platform for online brand management, today announced an AI-powered chatbot called Yext Chat. Taking inspiration from OpenAI’s ChatGPT, Yext Chat is designed for enterprise use cases — and differentiated, Yext claims, by a partly proprietary back end.

“ChatGPT has shown the world that large language models can hold incredibly coherent, helpful conversations — far better than any technology up to this point. But right now there is no easy way for enterprises to harness this technology,” Yext president and chief operating officer Marc Ferrentino told TechCrunch in an email interview. “Yext Chat is designed for the enterprise, and enterprises need full control over what a chatbot says and does.”

To be clear, Yext Chat wasn't built from scratch. It relies in part on OpenAI's public API, specifically the GPT-3.5 series, to generate text and dialogue. But Yext says that it's using a mix of text-generating models for different tasks within Yext Chat's workflow, like marketing, commerce and customer support.
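For context, a minimal call to a GPT-3.5-series model through OpenAI's public API looked roughly like this at the time of writing. This is an illustrative sketch, not Yext's actual integration; the prompt and API key are placeholders:

```python
import openai  # pip install openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# text-davinci-003 is one of the GPT-3.5-series models exposed
# through OpenAI's public completion API.
response = openai.Completion.create(
    model="text-davinci-003",
    prompt="Summarize our return policy for a customer:",
    max_tokens=150,
    temperature=0.2,  # keep answers conservative for support use cases
)
print(response.choices[0].text.strip())
```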

When it launches (it’s not yet generally available), Yext Chat will be able to integrate into existing platforms, including ticketing systems and Slack workspaces. A hospital could use it to power a health system that educates prospective patients on which doctor they should see and get appointments scheduled, for instance. Or a merchant could use it to service retail customers, assisting them with checking on the status of an order or answering questions about a return policy. The list goes on.

This wide swath of capabilities makes Yext Chat superior to rival enterprise-focused chatbots, like the recently debuted Jasper Chat, which rely on a single model, Ferrentino says.

“Going forward, we will likely use some combination of OpenAI’s models as well as homegrown models trained by our own data science team for specific tasks,” he said. “Our platform is model-agnostic: it can make use of models that we have trained and manage ourselves, or models provided by third parties like OpenAI.”

[Image: Yext Chat. Image Credits: Yext]

Yext Chat’s other ostensible advantage is its ability to tap into the Yext Knowledge Graph, Yext’s in-house database of public facts about brands, including employees, locations, products events, in-store promotions and even parking entrances/exits. Yext Chat only generates responses using a curated set of content that businesses manage in the Knowledge Graph, Ferrentino says — in contrast to chatbots that draw from a web-scale corpus of information, like ChatGPT.

How’s that desirable from a brand perspective? Think about it this way: If you ask ChatGPT a question like “How do I get a quote for car insurance?” it’d deliver a generic response. But a company using chat on their website wouldn’t want that. Ideally, they’d want a chatbot to deliver an accurate answer in the context of their business — one with links to their products and instructions specific to their company.

“If an answer isn’t in the Knowledge Graph, then the chatbot will simply say, ‘I don’t know.’ This can limit the chatbot in certain aspects but will make it much more suited for enterprise deployments,” Ferrentino added.

Another motivation for the data curation is to prevent Yext Chat from falling prey to misinformation, a fate ChatGPT and other generative text AI systems haven’t managed to avoid. Even the latest iterations of the tech, like Microsoft’s recently launched Bing Chat, can — and do — spout factually wrong, biased info on occasion.

Beyond limiting Yext Chat to data in the Yext Knowledge Graph, Yext claims to have implemented "other safeguards based on the latest AI safety research," like requiring Yext Chat to explain its reasoning internally and cite its sources. (OpenAI also implements filters at the API level; Yext presumably benefits from these downstream.) In addition, Ferrentino says, Yext uses other AI models to confirm that Yext Chat's responses are factually correct based on the source data.

“Businesses need to have control over the answers that a chatbot is returning — they need to be able to update information in real time and improve any bad answers. Businesses also usually have a lower risk tolerance than consumers because it's harder for them to characterize something as in 'beta,'” Ferrentino said. “Yext's solution is to combine the large language model with a knowledge graph that can easily be updated in real time.”
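The grounding pattern Ferrentino describes, answering only from curated content and declining otherwise, can be sketched as follows. The function names and retrieval logic here are hypothetical stand-ins, not Yext's implementation:

```python
def retrieve_facts(question: str, knowledge_graph: dict) -> list[str]:
    """Hypothetical retrieval step: return curated facts relevant to the
    question. A real system would use a graph query or embedding search."""
    return [fact for topic, fact in knowledge_graph.items()
            if topic in question.lower()]

def grounded_answer(question: str, knowledge_graph: dict) -> str:
    facts = retrieve_facts(question, knowledge_graph)
    if not facts:
        # No curated content matches: decline rather than guess.
        return "I don't know."
    # A production system would pass `facts` to an LLM as context and
    # require it to cite them; here we return the curated facts directly.
    return " ".join(facts)

kg = {"return policy": "Items can be returned within 30 days with a receipt."}
print(grounded_answer("What is your return policy?", kg))  # curated answer
print(grounded_answer("Do you sell gift cards?", kg))      # -> "I don't know."
```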

No system is perfect, Bing Chat being a prime example. But Yext’s approach — placing constraints on Yext Chat and making it incumbent on businesses to update their information — seems more carefully considered than most. How well will it work in practice? Time will tell — Yext plans to launch Yext Chat “later this year” following a closed beta.

Top AI startup news of the week: There's more to AI than just Microsoft and Google

The headlines this week in the world of AI were dominated by talk of a new search engine war, with Microsoft and its partner OpenAI, master of the phenom that is ChatGPT, pitched against stalwart Google and its prosaic Bard.

While the industry titans clash, it’s important to realize that there are a lot more players in the AI space that are creating innovations. This past week was no exception — with a healthy dose of activity and even a little magic, some “gloss” (AI), and even a Moonhub.

1. Is it AI or is it Magic?

There are those who might feel as though modern AI is some form of magic. While that's not technically true, there is at least one startup hoping to bring a little magic to the world of AI.

San Francisco-based startup Magic announced a $23 million round of funding this week. Magic is building out technology for software engineering that it refers to as an “AI colleague.” The technology uses natural language to help programmers better understand and collaborate on software development.

“Our mission is to deploy AI to accelerate science and make the world more productive. For decades, technology was just a tool; soon it will be a colleague,” Magic cofounder and CEO Eric Steinberger said in a statement. “The adoption of ‘AI assistants’ in the workplace will be as impactful as the Industrial Revolution, and it’s important to get this transition right.” 

2. GlossAi raises $8 million for generative video-AI (V-AI)

Generative AI is all the rage for text and images, and it’s beginning to make waves in the video world as well. Tel-Aviv, Israel-based GlossAi announced this week that it has raised $8 million in a seed round of funding.

The promise of GlossAi is an AI technology that can intelligently help organizations quickly generate video content in a fraction of the time it might have taken with more traditional approaches to video creation. According to the company, its technology, “analyzes hundreds of parameters such as text, tone, pace, facial expressions and audience engagement using billions of data points, achieving its human-like content generation in near-real time.”

3. Kognitos brings in $6.8 million to bring generative AI to enterprise automation

Have we mentioned that generative AI is hot? Kognitos is yet another generative AI startup making news this week, raising a $6.8 million round of seed funding.

Kognitos is using generative AI to build out an enterprise automation platform called Koncierge. The platform is intended to enable the simple creation of what would otherwise be complex-to-produce business processes.

“We are unlocking the power of AI for humanity. Now every person will be able to use Generative AI for automating what they want — with just English,” Binny Gill, founder and CEO of Kognitos, wrote in a statement. “It’s time for computers to behave like humans, and humans to stop behaving like machines.” 

4. Moonhub wants to help improve the hiring process with AI, raising a $4.4 million seed

There is no shortage of startups covered here on VentureBeat that are actively trying to improve the hiring process. Moonhub can now be added to that list.

Moonhub raised a $4.4 million seed round this week to help the company build out its talent management and recruiting technology. And yes, Moonhub also claims to be using “cutting-edge generative AI” according to its FAQ, to help organizations hire talent faster and more successfully.

5. Mattiq raises $15 million seed funding to develop sustainable materials with AI

Chicago-based Mattiq is using its AI technology for materials science in a bid to help organizations create a new generation of sustainable materials. The company announced this week that it has raised $15 million in a seed round of funding.

Mattiq has set quite an audacious goal for itself: forecasting that by 2024, the company will have synthesized and analyzed more than one trillion novel material combinations with the help of its advanced materials AI technology.

“From the Stone Age to the silicon era, materials discovery has been slow, unpredictable, and constrained by the performance of available materials,” said Mattiq founder and director Chad Mirkin in a statement. “Mattiq is disrupting this status quo, resulting in discoveries that enable new technologies at a pace not previously imaginable.” 

6. Entropik seeks to boost AI-powered market research with $25 million series B

AI is also making an impact on the world of market research.

This week Bengaluru, India-based Entropik announced that it has raised $25 million in a series B round of funding. The company is looking to use the money to expand across the U.S., European and Asian markets.

Entropik’s technology helps organizations better understand consumer preferences. It does that with AI innovations that include numerous patents for eye tracking as well as facial coding. The company claims that its facial coding technology, which captures a user’s facial expressions, can be quantified for accurate sentiment analysis.

7. Customer experience AI vendor Ushur raises $50 million series C

Santa Clara, California-based Ushur is looking to improve its customer experience automation (CXA) platform with the help of the $50 million series C round of funding it announced this week.

The Ushur platform uses AI for a variety of processes and has an approach it calls "Conversational Apps" that provides a no-code way of building processes that involve conversations between users and a company. The apps make use of AI for natural language processing as well as automation.

“The previous generation of enterprise automation was designed for infrastructure processes,” Simha Sadasiva, CEO and cofounder of Ushur, said in a statement. “We built Ushur’s AI platform with a different goal in mind: to provide excellent customer experiences at scale and to deliver meaningful interactions that put the customer’s needs at the center.”

8. MindsDB wants to help organizations get more machine learning into applications

MindsDB announced that it raised $16.5 million in series A funding this week as it continues to build out its open-source-based machine learning (ML) platform that helps organizations build their own AI-powered applications.

VentureBeat first covered MindsDB back in 2020 when it raised its $3 million seed round. The company has grown in the years since, benefitting from integrations with OpenAI and Hugging Face to help bring both generative AI and natural language processing capability into a database-driven application.

“Today, there’s tremendous interest in the developer community to implement and integrate machine learning into their applications, but the process is complicated and expensive,” commented Chetan Puttagunta, general partner at Benchmark in a statement. “MindsDB is enabling developers from small startups to the biggest enterprises in the world by enabling developers to quickly and efficiently run ML models of their choosing with the database of their choosing.”

CISOs: Self-healing endpoints key to consolidating tech stacks, improving cyber-resiliency

Self-healing endpoint platform providers are under pressure to create new solutions to help CISOs consolidate tech stacks while improving cyber-resiliency. CISOs see the potential of self-healing platforms to reduce costs, increase visibility and capture real-time data that quantifies how cyber-resilient they are becoming. And reducing costs while increasing cyber-resilience is the risk profile their boards of directors want.  

A self-healing endpoint is one that combines self-diagnostics with the adaptive intelligence to identify a suspected or actual breach attempt and take immediate action to stop it. Self-healing endpoints can shut themselves off, complete a re-check of all OS and application versioning, and then reset themselves to an optimized, secure configuration — all autonomously with no human intervention. 
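Conceptually, that detect, isolate, verify, restore cycle can be sketched as follows; every class and method here is a hypothetical stand-in for vendor-specific agent logic:

```python
import logging

logging.basicConfig(level=logging.INFO)

class Endpoint:
    """Toy endpoint model; attributes stand in for vendor-specific agent state."""
    def __init__(self) -> None:
        self.components = {"os": "trusted", "app": "tampered"}
        self.online = True

    def breach_suspected(self) -> bool:
        return "tampered" in self.components.values()

def self_heal(ep: Endpoint) -> None:
    """Illustrative cycle: detect, isolate, re-check versioning, restore."""
    if not ep.breach_suspected():
        return
    ep.online = False                        # shut itself off the network
    for name, state in ep.components.items():
        if state != "trusted":               # re-check OS/app versioning
            ep.components[name] = "trusted"  # restore from a known-good image
    ep.online = True                         # rejoin with a secure baseline
    logging.info("Endpoint healed autonomously; no human intervention.")

self_heal(Endpoint())
```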

Gartner predicts that enterprise end-user spending for endpoint protection platforms will soar from $9.4 billion in 2020 to $25.8 billion in 2026, attaining a compound annual growth rate of 15.4%. Gartner also predicts that by the end of 2025, more than 60% of enterprises will have replaced older antivirus products with combined endpoint protection platform (EPP) and endpoint detection and response (EDR) solutions that supplement prevention with detection and response. But self-healing endpoint vendors need to accelerate innovation for the market to reach its full potential.

Absolute Software’s accurate company overview presentation provides an insightful analysis of the self-healing endpoint market from the perspective of an industry pioneer in endpoint resilience, visibility and control. Absolute has grown from 12,000 customers in fiscal year 2019 to 18,000 in fiscal year 2023.

[Figure: Absolute Software, "False Sense of Security" endpoint attack statistics. Absolute's approach is unique in its reliance on firmware-embedded persistence as the basis of self-healing, providing an undeletable digital tether for every PC-based endpoint. Source: Absolute Software Company Overview Presentation, November 2022]

Mining telemetry data to improve resilience 

Self-healing endpoint platform providers need to mine their telemetry data and use it to accelerate their initiatives. Industry-leading executives, including CrowdStrike co-founder, president and CEO George Kurtz, see this as essential to finding new ways to improve detections.

“One of the areas that we’ve pioneered is the fact that we can take weak signals from across different endpoints,” he said at the company’s annual Fal.Con event last year. “And we can link these together to find novel detections. We’re now extending that to our third-party partners so that we can look at other weak signals across not only endpoints but across domains and come up with a novel detection.”  

Nikesh Arora, Palo Alto Networks chairman and CEO, remarked during his keynote at Palo Alto Networks‘ Ignite ’22 conference that “we collect the most … endpoint data in the industry from our XDR. We collect almost 200 megabytes per endpoint, which is, in many cases, 10 to 20 times more than most of the industry participants. Why do [we] do that? Because we take that raw data and cross-correlate or enhance most of our firewalls; we apply attack surface management with applied automation using XDR.”  

The first benchmark every enterprise IT and cybersecurity team needs to use in evaluating self-healing endpoint providers is their efficiency in mining all telemetry data. From datasets generated by attacks to continuous monitoring, using telemetry data to improve current services and create new ones is critical. How effectively a vendor uses telemetry data to keep innovating is a decisive test of how well its product management, customer success, network operations and security functions are working together. Success in this area indicates that a self-healing endpoint vendor is committed to excelling at innovation.

At last count, over 500 endpoint security vendors offer endpoint detection and response (EDR), extended detection and response (XDR), endpoint management, endpoint protection platforms and/or endpoint protection suites.

While most claim to offer self-healing endpoints, 40% or fewer have implemented them at scale over multiple product generations.

Today, the leading providers with enterprise customers using their self-healing endpoints include Absolute Software, Cisco, CrowdStrike, Cybereason Defense Platform, ESET, Ivanti, Malwarebytes, Microsoft Defender 365, Sophos and Trend Micro.  

How consolidating tech stacks is driving innovation

CISOs’ need to consolidate tech stacks is being driven by the challenge of closing growing security gaps, reducing risks and improving digital dexterity while reducing costs and increasing visibility. Those challenges create the perfect opportunity for self-healing endpoint vendors. Here are the areas where self-healing endpoint vendors are innovating the fastest:

Consolidation is driving XDR into the mainstream

XDR platforms are designed to integrate at scale across all available data sources in an enterprise, relying on APIs and an open architecture to aggregate and analyze telemetry data in real time. XDR platforms are strengthening self-healing endpoint platforms by providing the telemetry data needed to improve behavioral monitoring and threat detection and response, as well as identify potential new product and service ideas. Leading self-healing endpoint security vendors, including CrowdStrike, see XDR as fundamental to the future of endpoint security and zero trust.

Gartner defines XDR as a “unified security incident detection and response platform that automatically collects and correlates data from multiple proprietary security components.” CrowdStrike and other vendors are continually developing their XDR platforms to reduce application sprawl while removing the roadblocks that get in the way of preventing, detecting and responding to cyberattacks.

XDR is also core to CrowdStrike’s consolidation strategy and the similar strategy Palo Alto Networks launched at the companies’ respective annual customer events in 2022.

[Figure: CrowdStrike XDR architecture. An XDR platform unifies detection and response across a security tech stack, delivering a command console for unified detection and response beyond endpoints. An XDR enables security analysts to investigate, threat hunt, and respond to events faster and more intuitively. Source: CrowdStrike]
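As a toy illustration of the cross-source correlation an XDR platform performs, the sketch below combines individually weak signals that share a host into one higher-confidence detection; all event fields and the threshold are invented for the example:

```python
from collections import defaultdict

events = [
    {"host": "pc-42", "source": "endpoint", "signal": "unsigned_binary",   "score": 0.30},
    {"host": "pc-42", "source": "identity", "signal": "impossible_travel", "score": 0.40},
    {"host": "pc-42", "source": "network",  "signal": "rare_dns_domain",   "score": 0.35},
    {"host": "pc-07", "source": "endpoint", "signal": "unsigned_binary",   "score": 0.30},
]

# Sum per-host scores: weak signals from different domains reinforce each other.
scores = defaultdict(float)
for event in events:
    scores[event["host"]] += event["score"]

ALERT_THRESHOLD = 0.9  # invented threshold for the example
for host, score in scores.items():
    if score >= ALERT_THRESHOLD:
        print(f"High-confidence detection on {host} (combined score {score:.2f})")
```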

Self-healing endpoints need automated patch management scalable to thousands of units simultaneously

CISOs told VentureBeat that their most urgent requirement for self-healing endpoints is the ability to update thousands of endpoints in real time and at scale. IT, ITSM and security teams face chronic time shortages today. Taking an inventory approach to keeping endpoints up-to-date with patches is considered impractical and a waste of time.

What CISOs are looking for was articulated by Srinivas Mukkamala, chief product officer at Ivanti, during a recent interview with VentureBeat. “Endpoint management and self-healing capabilities allow IT teams to discover every device on their network, and then manage and secure each device using modern, best-practice techniques that ensure end users are productive and company resources are safe,” Mukkamala said.

He continued, “Automation and self-healing improve employee productivity, simplify device management and improve security posture by providing complete visibility into an organization’s entire asset estate and delivering automation across a broad range of devices.”  

There’s been a significant amount of innovation in this area, including Ivanti’s launch of an AI-based patch intelligence system. Its Neurons Patch for Microsoft Endpoint Configuration Monitor (MEM) is noteworthy. It’s built using a series of AI-based bots to seek out, identify and update all patches across endpoints that need to be updated.

Other vendors providing AI-based endpoint protection include Broadcom, CrowdStrike, SentinelOne, McAfee, Sophos, Trend Micro, VMware Carbon Black and Cybereason.

Silicon-based self-healing endpoints are the most difficult for attackers to defeat

Just as enterprises trust silicon-based zero-trust security over quantum computing, the same holds for self-healing embedded in an endpoint’s silicon. Forrester analyzed just how valuable self-healing in silicon is in its report, The Future of Endpoint Management. Forrester’s Andrew Hewitt, the report’s author, says that “self-healing will need to occur at multiple levels: 1) application; 2) operating system; and 3) firmware. Of these, self-healing embedded in the firmware will prove the most essential because it will ensure that all the software running on an endpoint, even agents that conduct self-healing at an OS level, can effectively run without disruption.” 

Forrester interviewed enterprises with standardized self-healing endpoints that rely on firmware-embedded logic to reconfigure themselves autonomously. Its study found that Absolute’s reliance on firmware-embedded persistence delivers a secured, undeletable digital tether to every PC-based endpoint. Organizations told Forrester that Absolute’s Resilience platform is noteworthy in providing real-time visibility and control of any device, on a network or not, along with detailed asset management data.

Absolute also has the industry’s first self-healing zero-trust platform that provides asset management, device and application control, endpoint intelligence, incident reporting, resilience and compliance.

[Figure: Endpoint self-healing must occur at three primary levels. Per Forrester, modern endpoint management platforms must offer self-healing capabilities at the application, operating system and firmware levels to be effective. Source: Forrester, The Future of Endpoint Management report]

CISOs look to endpoints first when consolidating tech stacks 

It seems counterintuitive that CISOs are spending more on endpoints, and encouraging their proliferation across their infrastructures, at a time when company budgets are tight. But digital transformation initiatives that could create new revenue streams, combined with customers changing how, where and why they buy, are driving an exponential jump in the type and number of endpoints.

Endpoints are a catalyst for driving more revenue and are core to making ecommerce succeed. “They’re the transaction hub that every dollar passes through, and [that] every hacker wants to control,” remarked one CISO whom VentureBeat recently interviewed.

However, enterprises and the CISOs running them are losing the war against cyberattackers at the endpoint. Endpoints are commonly attacked several thousand times a day with automated scripts — AI and ML-based hacking algorithms that seek to defeat and destroy endpoints. Self-healing endpoints’ importance can’t be overstated, as they provide invaluable real-time data management while securing assets and, when combined with microsegmentation, eliminating attackers’ ability to move laterally across networks.

Microsoft and NVIDIA experts talk AI infrastructure

This post has been co-authored by Sheila Mueller, Senior GBB HPC+AI Specialist, Microsoft; Gabrielle Davelaar, Senior GBB AI Specialist, Microsoft; Gabriel Sallah, Senior HPC Specialist, Microsoft; Annamalai Chockalingam, Product Marketing Manager, NVIDIA; J Kent Altena, Principal GBB HPC+AI Specialist, Microsoft; Dr. Lukasz Miroslaw, Senior HPC Specialist, Microsoft; Uttara Kumar, Senior Product Marketing Manager, NVIDIA; Sooyoung Moon, Senior HPC + AI Specialist, Microsoft.

As AI emerges as a crucial tool in so many sectors, it’s clear that the need for optimized AI infrastructure is growing. Going beyond just GPU-based clusters, cloud infrastructure that provides low-latency, high-bandwidth interconnects, and high-performance storage can help organizations handle AI workloads more efficiently and produce faster results.

HPCwire recently sat down with Microsoft Azure and NVIDIA AI and cloud infrastructure experts and asked a series of questions to uncover AI infrastructure insights, trends, and advice based on their engagements with customers worldwide.

How are your most interesting AI use cases dependent on infrastructure?

Sheila Mueller, Senior GBB HPC+AI Specialist, Healthcare & Life Sciences, Microsoft: Some of the most interesting AI use cases are in patient health care, both clinical and research. Research in science, engineering, and health is creating significant improvements in patient care, enabled by high-performance computing and AI insights. Common use cases include molecular modeling, therapeutics, genomics, and health treatments. Predictive analytics and AI, coupled with cloud infrastructure purpose-built for AI, are the backbone for improvements and simulations in these use cases and can lead to a faster prognosis and the ability to research cures. See how Elekta brings hope to more patients around the world with the promise of AI-powered radiation therapy.

Gabrielle Davelaar, Senior GBB AI Specialist, Microsoft: Many manufacturing companies need to train inference models at scale while staying compliant with strict local and European-level regulations. AI is placed on the edge with high-performance compute. Full traceability, with strict security rules on privacy and security, is critical. This can be a tricky process, as every step must be recorded for reproduction, from simple things like dataset versions to more complex things such as knowing which environment was used with which machine learning (ML) libraries and their specific versions. Machine learning operations (MLOps) for data and model auditability now make this possible. See how BMW uses machine learning-supported robots to provide flexibility in quality control for automotive manufacturing.
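A minimal sketch of the run-level record-keeping that kind of traceability implies, in plain Python with invented field names:

```python
import json
import platform
import time

def record_run_metadata(dataset_version: str, model_name: str,
                        library_versions: dict) -> None:
    """Append an audit record so a training run can be reproduced later."""
    record = {
        "timestamp": time.time(),
        "dataset_version": dataset_version,
        "model_name": model_name,
        "python": platform.python_version(),
        "libraries": library_versions,
    }
    with open("run_audit_log.jsonl", "a") as f:
        f.write(json.dumps(record) + "\n")

# Invented example values for illustration.
record_run_metadata("defects-v3.2", "qc-resnet50",
                    {"torch": "1.13.1", "numpy": "1.24.1"})
```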

Gabriel Sallah, Senior HPC Specialist, Automotive Lead, Microsoft: We’ve worked with car makers to develop advanced driver assistance systems (ADAS) and advanced driving systems (ADS) platforms in the cloud using integrated services to build a highly scalable deep learning pipeline for creating AI/ML models. HPC techniques were applied to schedule, scale, and provision compute resources while ensuring effective monitoring, cost management, and data traceability. The result: faster simulation/training times due to the close integration of data inputs, compute simulation/training runs, and data outputs than their existing solutions.

Annamalai Chockalingam, Product Marketing Manager, Large Language Models & Deep Learning Products, NVIDIA: Progress in AI has led to the explosion of generative AI, particularly with advancements to large language models (LLMs) and diffusion-based transformer architectures. These models now recognize, summarize, translate, predict, and generate languages, images, videos, code, and even protein sequences, with little to no training or supervision, based on massive datasets. Early use cases include improved customer experiences through dynamic virtual assistants, AI-assisted content generation for blogs, advertising, and marketing, and AI-assisted code generation. Infrastructure purpose-built for AI that can handle the compute power and scalability demands is key.

What AI challenges are customers facing, and how does the right infrastructure help?

John Lee, Azure AI Platforms & Infrastructure Principal Lead, Microsoft: When companies try to scale AI training models beyond a single node to tens and hundreds of nodes, they quickly realize that AI infrastructure matters. Not all accelerators are alike. Optimized scale-up node-level architecture matters. How the host CPUs connect to groups of accelerators matter. When scaling beyond a single node, the scale-out architecture of your cluster matters. Selecting a cloud partner that provides AI-optimized infrastructure can be the difference between an AI project’s success or failure. Read the blog: AI and the need for purpose-built cloud infrastructure.

Annamalai Chockalingam: AI models are becoming increasingly powerful due to a proliferation of data, continued advancements in GPU compute infrastructure, and improvements in techniques across both training and inference of AI workloads. Yet, combining the trifecta of data, compute infrastructure, and algorithms at scale remains challenging. Developers and AI researchers require systems and frameworks that can scale, orchestrate, crunch mountains of data, and manage MLOps to optimally create deep learning models. End-to-end tools for production-grade systems incorporating fault tolerance for building and deploying large-scale models for specific workflows are scarce.

Kent Altena, Principal GBB HPC+AI Specialist, Financial Services, Microsoft: Customers are trying to decide between architectures: the open flexibility of a true HPC environment versus the robust MLOps pipeline and capabilities of machine learning platforms. Traditional HPC approaches, whether scheduled by a legacy scheduler like HPC Pack or SLURM or a cloud-native scheduler like Azure Batch, are great when they need to scale to hundreds of GPUs, but in many cases AI environments need the DevOps approach to AI model management and control over which models are authorized, or conversely need overall workflow management.

Dr. Lukasz Miroslaw, Senior HPC Specialist, Microsoft: AI infrastructure is not only the GPU-based clusters but also low-latency, high-bandwidth interconnect between the nodes and high-performant storage. The storage requirement is often the limiting factor for large-scale distributed training as the amount of data used for the training in autonomous driving projects can grow to petabytes. The challenge is to design an AI platform that meets strict requirements in terms of storage throughput, capacity, support for multiple protocols, and scalability.

What are the most frequently asked questions about AI infrastructure?

John Lee: “Which platform should I use for my AI project/workload?” There is no single magic product or platform that is right for every AI project. Customers usually have a good understanding of what answers they are looking for but aren't sure which AI products or platforms will get them that answer in the fastest, most economical, and most scalable way. A cloud partner with a wide portfolio of AI products, solutions, and expertise can help find the right solution for specific AI needs.

Uttara Kumar, Senior Product Marketing Manager, NVIDIA: “How do I select the right GPU for our AI workloads?” Customers want the flexibility to provision the right-sized GPU acceleration for different workloads to optimize cloud costs (fractional GPU, single GPU, multiple GPUs all the way up to multiple GPUs across multi-node clusters). Many also ask, “How do you make the most of the GPU instance/virtual machines and leverage it within applications/solutions?” Performance-optimized software is key to doing that.

Sheila Mueller: “How do I leverage the cloud for AI and HPC while ensuring data security and governance?” Customers want to automate the deployment of these solutions, often across multiple research labs with specific simulations. Customers want a secure, scalable platform that provides control over data access to provide insight. Cost management is also a focus in these discussions.

Kent Altena: “How best should we implement this GPU infrastructure to run our models?” We know what we need to run and have built the models, but we also need to understand the final mile. The answer is not always a straightforward, one-size-fits-all answer. It requires understanding their models, what they are attempting to solve, and what their inputs, outputs and workflow look like.

What have you learned from customers about their AI infrastructure needs?

John Lee: The majority of customers want to leverage the power of AI but are struggling to put an actionable plan in place to do so. They worry about what their competition is doing and whether they are falling behind but, at the same time, are not sure what first steps to take on their journey to integrate AI into their business.

Annamalai Chockalingam: Customers are looking for AI solutions to improve operational efficiency and deliver innovative solutions to their end customers. Easy-to-use, performant, platform-agnostic, and cost-effective solutions across the compute stack are incredibly desirable to customers.

Gabriel Sallah: All customers are looking to reduce the cost of training an ML model. Thanks to the flexibility of the cloud resources, customers can select the right GPU, storage I/O, and memory configuration for the given training model.

Gabrielle Davelaar: Costs are critical. With the current economic uncertainty, companies need to do more with less and want their AI training to be more efficient and effective. Something a lot of people are still not realizing is that training and inferencing costs can be optimized through the software layer.

What advice would you deliver to businesses looking to deploy AI or speed innovation?

Uttara Kumar: Invest in a platform that is performant, versatile, scalable, and can support the end-to-end workflow—start to finish—from importing and preparing data sets for training, to deploying a trained network as an AI-powered service using inference.

John Lee: Not every AI solution is the same. AI-optimized infrastructure matters, so be sure to understand the breadth of products and solutions available in the marketplace. And just as importantly, make sure you engage with a partner that has the expertise to help navigate the complex menu of possible solutions that best match what you need.

Sooyoung Moon, Senior HPC + AI Specialist, Microsoft: No amount of investment can guarantee success without thorough early-stage planning. Reliable and scalable infrastructure for continuous growth is critical.

Kent Altena: Understand your workflow first. What do you want to solve? Is it primarily a calculation-driven solution, or is it built upon a data graph-driven workload? Having that in mind will go a long way to determining the best or optimal approach to start down.

Gabriel Sallah: What are the dependencies across various teams responsible for creating and using the platform? Create an enterprise-wide architecture with common toolsets and services to avoid duplication of data, compute monitoring, and management.

Sheila Mueller: Involve stakeholders from IT and Lines of Business to ensure all parties agree to the business benefits, technical benefits, and assumptions made as part of the business case.

Learn more about Azure and NVIDIA
