Pass SPLK-1003 exam with SPLK-1003 VCE and PDF Dumps

If you are interested in efficiently passing the Splunk SPLK-1003 exam to boost your career, we have real Splunk Enterprise Certified Admin exam questions that will ensure you pass the SPLK-1003 exam! We offer legitimate, up-to-date SPLK-1003 PDF dumps with a 100% money-back guarantee.

SPLK-1003 Splunk Enterprise Certified Admin approach |

SPLK-1003 approach - Splunk Enterprise Certified Admin Updated: 2024

Review SPLK-1003 real questions and answers before you take the exam
Exam Code: SPLK-1003 Splunk Enterprise Certified Admin - Updated January 2024 by team

SPLK-1003 Splunk Enterprise Certified Admin

The Splunk Enterprise Certified Admin exam is the final step towards completion of
the Splunk Enterprise Certified Admin certification. This upper-level certification exam is a 57-minute,
63-question assessment which evaluates a candidate's knowledge and skills to manage various
components of Splunk on a daily basis, including the health of the Splunk installation. Candidates can
expect an additional 3 minutes to review the exam agreement, for a total seat time of 60 minutes. It is
recommended that candidates for this certification complete the lecture, hands-on labs, and quizzes
that are part of the Splunk Enterprise System Administration and Splunk Enterprise Data Administration
courses in order to be prepared for the certification exam. Splunk Enterprise Certified Admin is a
required prerequisite to the Splunk Enterprise Certified Architect and Splunk Certified Developer
certification tracks.

The Splunk Enterprise System Administration course focuses on administrators who manage a Splunk
Enterprise environment. Topics include the Splunk license manager, indexers and search heads,
configuration, management, and monitoring. The Splunk Enterprise Data Administration course targets
administrators who are responsible for getting data into Splunk. The course provides content about
Splunk forwarders and methods to get remote data into Splunk.

The following content areas are general guidelines for the content to be included on the exam:

● Splunk deployment overview

● License management

● Splunk apps

● Splunk configuration files

● Users, roles, and authentication

● Getting data in

● Distributed search

● Introduction to Splunk clusters

● Deploy forwarders with Forwarder Management

● Configure common Splunk data inputs

● Customize the input parsing process

1.0 Splunk Admin Basics 5%

1.1 Identify Splunk components

2.0 License Management 5%

2.1 Identify license types

2.2 Understand license violations

3.0 Splunk Configuration Files 5%

3.1 Describe Splunk configuration directory structure

3.2 Understand configuration layering

3.3 Understand configuration precedence

3.4 Use btool to examine configuration settings
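
As a quick illustration of objective 3.4, btool can be run from the Splunk CLI to print the merged, on-disk view of any configuration file. The paths below assume a standard local Splunk Enterprise installation:

```shell
# Show the merged inputs.conf settings, with the file each setting came from
$SPLUNK_HOME/bin/splunk btool inputs list --debug

# Limit the output to a single props.conf stanza (stanza name is illustrative)
$SPLUNK_HOME/bin/splunk btool props list my_sourcetype --debug
```

Note that btool reads configuration files from disk; it does not reflect the live configuration of a running splunkd process.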

4.0 Splunk Indexes 10%

4.1 Describe index structure

4.2 List types of index buckets

4.3 Check index data integrity

4.4 Describe indexes.conf options

4.5 Describe the fishbucket

4.6 Apply a data retention policy
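
To make objective 4.6 concrete, here is a minimal, hypothetical indexes.conf fragment applying both time- and size-based retention (the index name and values are illustrative, not recommendations):

```
# indexes.conf (example values only)
[web_logs]
homePath   = $SPLUNK_DB/web_logs/db
coldPath   = $SPLUNK_DB/web_logs/colddb
thawedPath = $SPLUNK_DB/web_logs/thaweddb

# Roll buckets to frozen (delete or archive) after 90 days
frozenTimePeriodInSecs = 7776000

# Cap the total index size at roughly 500 GB
maxTotalDataSizeMB = 512000
```

Buckets are frozen when either limit is reached, whichever comes first.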

5.0 Splunk User Management 5%

5.1 Describe user roles in Splunk

5.2 Create a custom role

5.3 Add Splunk users

6.0 Splunk Authentication Management 5%

6.1 Integrate Splunk with LDAP

6.2 List other user authentication options

6.3 Describe the steps to enable Multifactor Authentication in Splunk

7.0 Getting Data In 5%

7.1 Describe the basic settings for an input

7.2 List Splunk forwarder types

7.3 Configure the forwarder

7.4 Add an input to UF using CLI
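
A typical CLI sequence for objective 7.4, run on the universal forwarder itself (the hostname, port, index, and file path are placeholders):

```shell
# Point the forwarder at an indexer's receiving port
$SPLUNK_HOME/bin/splunk add forward-server idx.example.com:9997

# Monitor a file and assign an index and sourcetype
$SPLUNK_HOME/bin/splunk add monitor /var/log/messages -index main -sourcetype syslog
```

These commands write to outputs.conf and inputs.conf in the forwarder's local configuration.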

8.0 Distributed Search 10%

8.1 Describe how distributed search works

8.2 Explain the roles of the search head and search peers

8.3 Configure a distributed search group

8.4 List search head scaling options

9.0 Getting Data In – Staging 5%

9.1 List the three phases of the Splunk Indexing process

9.2 List Splunk input options

10.0 Configuring Forwarders 5%

10.1 Configure Forwarders

10.2 Identify additional Forwarder options

11.0 Forwarder Management 10%

11.1 Explain the use of Deployment Management

11.2 Describe Splunk Deployment Server

11.3 Manage forwarders using deployment apps

11.4 Configure deployment clients

11.5 Configure client groups

11.6 Monitor forwarder management activities

12.0 Monitor Inputs 5%

12.1 Create file and directory monitor inputs

12.2 Use optional settings for monitor inputs

12.3 Deploy a remote monitor input

13.0 Network and Scripted Inputs 5%

13.1 Create network (TCP and UDP) inputs

13.2 Describe optional settings for network inputs

13.3 Create a basic scripted input
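
Hypothetical inputs.conf stanzas sketching objectives 13.1 through 13.3 (the ports, app name, and script are assumptions for illustration):

```
# inputs.conf
[tcp://5514]
sourcetype = syslog
connection_host = dns

[udp://514]
sourcetype = syslog

# Run a script every 5 minutes and index its output
[script://$SPLUNK_HOME/etc/apps/my_app/bin/disk_usage.sh]
interval = 300
sourcetype = disk_usage
```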

14.0 Agentless Inputs 5%

14.1 Identify Windows input types and uses

14.2 Describe HTTP Event Collector
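
For objective 14.2, the HTTP Event Collector (HEC) accepts events over HTTPS without an agent. A minimal smoke test with curl might look like this (the hostname and token are placeholders; 8088 is the default HEC port):

```shell
curl -k https://splunk.example.com:8088/services/collector/event \
  -H "Authorization: Splunk 11111111-2222-3333-4444-555555555555" \
  -d '{"event": "hello from HEC", "sourcetype": "manual", "index": "main"}'
```

A successful request returns {"text":"Success","code":0}.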

15.0 Fine Tuning Inputs 5%

15.1 Understand the default processing that occurs during input phase

15.2 Configure input phase options, such as sourcetype fine-tuning and character set encoding

16.0 Parsing Phase and Data 5%

16.1 Understand the default processing that occurs during parsing

16.2 Optimize and configure event line breaking

16.3 Explain how timestamps and time zones are extracted or assigned to events

16.4 Use Data Preview to validate event creation during the parsing phase

17.0 Manipulating Raw Data 5%

17.1 Explain how data transformations are defined and invoked

17.2 Use transformations with props.conf and transforms.conf to:

● Mask or delete raw data as it is being indexed

● Override sourcetype or host based upon event values

● Route events to specific indexes based on event content

● Prevent unwanted events from being indexed

17.3 Use SEDCMD to modify raw data
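
A sketch of objectives 17.2 and 17.3, combining SEDCMD masking with a transform that routes events to another index (the sourcetype name, regex, and index name are illustrative):

```
# props.conf
[my_sourcetype]
# Mask anything that looks like a US SSN before it is indexed
SEDCMD-mask_ssn = s/\d{3}-\d{2}-\d{4}/XXX-XX-XXXX/g
# Invoke a transform defined in transforms.conf
TRANSFORMS-route_errors = send_errors_to_error_index

# transforms.conf
[send_errors_to_error_index]
REGEX = \bERROR\b
DEST_KEY = _MetaData:Index
FORMAT = error_index
```

Both files must be deployed to the parsing tier (indexers or heavy forwarders) to take effect.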


Other Splunk exams

SPLK-1003 Splunk Enterprise Certified Admin
SPLK-1001 Splunk Core Certified User
SPLK-2002 Splunk Enterprise Certified Architect
SPLK-3001 Splunk Enterprise Security Certified Admin
SPLK-1002 Splunk Core Certified Power User
SPLK-3003 Splunk Core Certified Consultant
SPLK-2001 Splunk Certified Developer
SPLK-1005 Splunk Cloud Certified Admin
SPLK-2003 Splunk SOAR Certified Automation Developer
SPLK-4001 Splunk O11y Cloud Certified Metrics User
SPLK-3002 Splunk IT Service Intelligence Certified Admin

Some people have very good knowledge of the SPLK-1003 exam topics but still fail the exam. Why? Because the real SPLK-1003 exam has many tricks that are not written in the books. Our SPLK-1003 dumps questions contain real exam scenarios, with a VCE exam simulator for you to practice, so you can pass your exam with high scores or get your money back.
SPLK-1003 Dumps
SPLK-1003 Braindumps
SPLK-1003 Real Questions
SPLK-1003 Practice Test
SPLK-1003 dumps free
Splunk Enterprise Certified Admin
Question: 147
Within props.conf, which stanzas are valid for data modification? (Choose all that apply.)
A. Host
B. Server
C. Source
D. Sourcetype
Answer: ACD
Question: 150
This file has been manually created on a universal forwarder:
A new Splunk admin comes in and connects the universal forwarders to a deployment server and deploys the same app with a new inputs.conf file:
Which file is now monitored?
A. /var/log/messages
B. /var/log/maillog
C. /var/log/maillog and /var/log/messages
D. none of the above
Answer: A
Question: 151
Which forwarder type can parse data prior to forwarding?
A. Universal forwarder
B. Heaviest forwarder
C. Hyper forwarder
D. Heavy forwarder
Answer: D
Question: 152
In which Splunk configuration is the SEDCMD used?
A. props.conf
B. inputs.conf
C. indexes.conf
D. transforms.conf
Answer: A
Question: 153
In which phase of the index time process does the license metering occur?
A. Input phase
B. Parsing phase
C. Indexing phase
D. Licensing phase
Answer: C
Question: 154
When running the command shown below, what is the default path in which deploymentclient.conf is created? splunk set deploy-poll deployServer:port
A. SPLUNK_HOME/etc/deployment
B. SPLUNK_HOME/etc/system/local
C. SPLUNK_HOME/etc/system/default
D. SPLUNK_HOME/etc/apps/deployment
Answer: B
Question: 155
In case of a conflict between a whitelist and a blacklist input setting, which one is used?
A. Blacklist
B. Whitelist
C. They cancel each other out.
D. Whichever is entered into the configuration first.
Answer: A
Question: 156
The priority of layered Splunk configuration files depends on the files:
A. Owner
B. Weight
C. Context
D. Creation time
Answer: C
Question: 157
Which of the following are supported configuration methods to add inputs on a forwarder? (Select all that apply.)
B. Edit inputs.conf
C. Edit forwarder.conf
D. Forwarder Management
Answer: AB
Question: 158
Which parent directory contains the configuration files in Splunk?
D. $SPLUNK_HOME/default
Answer: A
Question: 159
Where should apps be located on the deployment server that the clients pull from?
A. $SPLUNK_HOME/etc/apps
B. $SPLUNK_HOME/etc/search
C. $SPLUNK_HOME/etc/master-apps
D. $SPLUNK_HOME/etc/deployment-apps
Answer: D
Question: 160
Which Splunk component consolidates the individual results and prepares reports in a distributed environment?
A. Indexers
B. Forwarder
C. Search head
D. Search peers
Answer: C
Question: 161
Which Splunk component distributes apps and certain other configuration updates to search head cluster members?
A. Deployer
B. Cluster master
C. Deployment server
D. Search head cluster master
Answer: A
Question: 162
You update a props.conf file while Splunk is running. You do not restart Splunk and you run this command: splunk btool props list --debug.
What will the output be?
A. A list of all the configurations on-disk that Splunk contains.
B. A verbose list of all configurations as they were when splunkd started.
C. A list of props.conf configurations as they are on-disk along with a file path from which the configuration is located.
D. A list of the current running props.conf configurations along with a file path from which the configuration was made.
Answer: C
Question: 163
Which setting in indexes.conf allows data retention to be controlled by time?
A. maxDaysToKeep
B. moveToFrozenAfter
C. maxDataRetentionTime
D. frozenTimePeriodInSecs
Answer: D
Question: 164
The universal forwarder has which capabilities when sending data? (Select all that apply.)
A. Sending alerts
B. Compressing data
C. Obfuscating/hiding data
D. Indexer acknowledgement
Answer: BD

To Make the Splunk Acquisition Successful, a New Approach to Storage is Needed

Cisco’s acquisition of Splunk in September generated a lot of commentary, most of which unsurprisingly focused on how the two companies complement each other and what this means for their respective customers. 

As Cisco CEO Chuck Robbins stated when announcing the purchase, “our combined capabilities will drive the next generation of AI-enabled security and observability. From threat detection and response to threat prediction and prevention, we will help make organizations of all sizes more secure and resilient.”

From the product perspective, it is clear that the synergies are substantial. Cisco sells hardware that generates massive amounts of data and Splunk is the category leader for data-intensive observability and security information and event management (SIEM) products.

Viewed from the industry perspective, Splunk's acquisition fits a distinct pattern. This transaction represents the fifth change of control this year for an observability platform, following Moogsoft, Ops Ramp, Sumo Logic and New Relic.

In all cases, including PE firm Francisco Partner’s takeover of both New Relic and Sumo Logic, the aim is clear: use the data these companies collect to fuel the next big wave of AI-powered operations and security tools. 

However, this next generation of AI-enabled tools faces a significant challenge: AI is data-hungry and requires always-hot storage, which is likely to be prohibitively expensive on current platforms.

This fundamental economic challenge confronts not just Cisco, but also HPE (Ops Ramp), Dell (Moogsoft), and Francisco Partners, as they attempt to make good on this AI-driven vision. It is possible, unless architectures change, that the high cost of storing and using data in these platforms, and the tradeoffs these costs impose, will impede the building of AI-enabled products.

AI is Data Hungry

With a few caveats, it is safe to say that more data makes for better AI models and, by extension, AI-enabled products. Larger training sets translate into greater accuracy, the ability to detect subtle patterns, and most importantly for the use cases envisioned by Cisco, generalization accuracy. Generalization describes how well a model can analyze and make accurate predictions on new data. For security use cases this can mean the difference between detecting or failing to detect a cyber threat.

But it’s not just enough to have a lot of data at hand. That data needs to be easy to access repeatedly and on a basically ad hoc basis. That’s because the process of building and training models is experimental and iterative. 

In data storage terms, AI use cases require hot data. And when it comes to platforms like Splunk, that’s a problem.

In AI, All Data Must Be Hot

To minimize costs, data on today’s leading SIEM and observability platforms is stored in hot and cold tiers.  

Hot storage is for data that must be accessed frequently and requires fast or low-latency query responses. This could be anything from customer databases to Kubernetes logs. It is data used in the daily operation of an application. 

Cold storage, on the other hand, serves as a low-cost archive. But in order to achieve this cost savings, performance is sacrificed. Cold data is slow to access and difficult to query. To be usable, cold data must be transferred back to the hot storage tier, which can take hours or even days. Cold storage simply won’t work for AI use cases.

Data science teams use data in three phases: exploratory analysis, feature engineering and training, and maintenance of deployed models, each of which is characterized by constant refinement through experimentation. Each phase is highly iterative, as is the entire process.

Anything that slows down these iterations, increases costs, or otherwise creates operational friction – and restoring data from cold storage does all three – will negatively impact the quality of AI-enabled products. 

The High Cost of Storage Forces Tradeoffs

It is no surprise to anyone paying attention to the industry that Splunk, like its competitors, is perceived as expensive. It was a top concern of customers before the acquisition and it remains the number one concern in surveys taken since. It is easy to see why. Though its pricing is somewhat opaque, estimates put the cost of storing a GB of hot data for a month at $1,800. Compare that to AWS's S3, where storage starts at $0.023 per GB per month (essentially cold storage).
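
Taking the article's own figures at face value, the gap is easy to quantify. The prices below are the estimates cited above, not authoritative list prices:

```python
# Rough cost comparison using the per-GB monthly estimates cited in the text
HOT_PER_GB = 1800.00   # estimated observability-platform hot-tier cost, $/GB/month
S3_PER_GB = 0.023      # AWS S3 Standard starting price, $/GB/month

ratio = HOT_PER_GB / S3_PER_GB
print(f"Hot storage costs roughly {ratio:,.0f}x raw object storage")

# Cost of keeping 1 TB hot for one month, under the same assumptions
tb_cost = 1024 * HOT_PER_GB
print(f"1 TB hot for a month: ${tb_cost:,.0f}")
```

Even allowing for the indexing and compute value layered on top, the ratio is roughly five orders of magnitude.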

Of course, there’s a lot of value added to the data stored in observability platforms, such as compute and storage resources required to build indexes that make that data searchable, but understanding the costs doesn’t change the fact that storing data in these platforms is expensive. According to Honeycomb and other sources, companies on average spend an astounding 20 to 30 percent of their overall cloud budget on observability.

The solution Splunk and others adopted to help manage these massive costs – and the crux of the problem for Cisco’s AI ambitions – is an aggressive retention policy that keeps only thirty to ninety days of data in hot storage. After that, data can be deleted or, optionally, moved to the cold tier from which, according to Splunk’s own documentation, it takes 24 hours to restore.

A New Model is Needed

Observability and SIEM are here to stay. The service that platforms like Splunk provide is valuable enough for companies to dedicate a significant percentage of their budget to provisioning it. But the costs to deliver these services today will impede the products they deliver tomorrow if the fundamental economics of hot data storage isn’t overturned. Hot storage costs need to be much closer to raw object storage to serve the AI ambitions of companies like Cisco, Dell, and HPE. Architectures are emerging that decouple storage, allowing compute and storage to scale independently, and index that data so that it can be searched quickly. This provides solid-state drive-like query performance at near object storage prices.

The biggest hurdle may not be a strictly technical one, though. The incumbent observability and SIEM vendors must recognize that they have a significant economic barrier to executing on their AI-enabled product roadmap. Once they realize this, they can proceed to the solution: integrating next-gen data storage technologies optimized for machine-generated data into their underlying infrastructure. Only then can vendors like Cisco and HPE transform the economics of big data and deliver on the promise of AI-enabled security and observability.

About the Author

Marty Kagan is the CEO and co-founder of Hydrolix, an Oregon-based maker of cloud data processing software. He was previously founder and CEO of Cedexis (acquired by Citrix) and held executive engineering positions at Akamai Technologies, Fastly, and Jive Software.


Published Fri, 22 Dec 2023
2023: The Year Generative AI Transformed Enterprise Data Management

As we transition from one year to the next, it's a season of reflection and looking forward. As an analyst, the end of the year is a time to learn from past work, analyze its outcomes and consider its potential impact on the future.

In 2023, enterprise data management (EDT) solutions underwent significant changes due to the influx of generative AI technologies. These technologies have fundamentally altered how businesses approach data management, analysis and usage. In this post, I’ll review some of 2023’s highlights in this field.

How Different Areas Of EDT Are Evolving

Over the past year, there have been promising developments in EDT across several key areas. These include data management itself, where the focus has been on using AI to improve how data is organized and accessed. The data cloud sector has also experienced growth, with more businesses adopting cloud-based solutions because of their flexibility, scalability and facility for integrating tools that handle unstructured data.

In data protection and governance, there has been a continuous effort to enhance security measures to safeguard sensitive information. Database technologies have also improved, particularly in handling and processing large data volumes more efficiently by incorporating generative AI.

Recent advancements in data integration and intelligent platforms have been geared towards better aggregating data from multiple sources, allowing for more comprehensive data analysis. The integration of AI and ML has further enhanced the capabilities of these platforms, improving data analysis interpretation and offering more profound and insightful analytical outcomes.

Full disclosure: Amazon Web Services, Cisco Systems, Cloudera, Cohesity, Commvault, Google Cloud, IBM, LogicMonitor, Microsoft, MongoDB, Oracle, Rubrik, Salesforce, Software AG, Splunk, and Veeam are clients of Moor Insights & Strategy, but this article reflects my independent viewpoint, and no one at any client company has been given editorial input on this piece.

Bringing AI To Data Management—And Vice Versa

“In a way, this AI revolution is actually a data revolution,” Salesforce cofounder and CTO Parker Harris said during his part of this year’s Dreamforce keynote, “because the AI revolution wouldn't exist without the power of all that data.” Harris's statement emphasizes the vital role of data in businesses and points to the increasing necessity for effective data management strategies in 2024.

As data becomes more central, the demand for scalable and secure EDT solutions is rising. My recent series of articles focusing on EDT began with an introductory piece outlining its fundamental aspects and implications for business operations. This was followed by a more in-depth exploration of EDT, particularly highlighting how it can benefit businesses in data utilization. These articles elaborated on the practical uses and benefits of EDT and its importance in guiding the strategies and operations of modern businesses.

As businesses continue to leverage generative AI for deeper insights, the greater accessibility of data is set to revolutionize how they manage information. This development means enterprises can now utilize data that was previously inaccessible—a move that highlights the importance of data integration for both business operations and strategic decision-making. For instance, untapped social media data could offer valuable customer sentiment insights, while neglected sensor data from manufacturing processes might reveal efficiency improvements. In both cases, not using this data equates to a missed opportunity to use an asset, similar to unsold inventory that takes up space and resources without providing any return.

Revolutionizing Data Cloud Platforms

Incorporating AI into data cloud platforms has revolutionized processing and analyzing data. These AI models can handle vast datasets more efficiently, extracting previously unattainable insights due to the limitations of traditional data analysis methods.

Over the year, my collaborations with multiple companies illustrated the range of technological progress. As I highlighted in a few of my articles, Google notably improved its data cloud platform and focused on generative AI with projects including Gemini, Duet AI and Vertex AI, reflecting its solid commitment to AI innovation. Salesforce introduced the Einstein 1 Platform and later expanded its offerings with the Data Cloud Vector Database, providing users with access to their unstructured enterprise data, thus broadening the scope of their data intelligence. IBM also launched watsonx, a platform dedicated to AI development and data management. These moves from major tech firms reflect a trend towards advanced AI applications and more sophisticated data management solutions.

At the AWS re:Invent conference, I observed several notable launches. Amazon Q is a new AI assistant designed for business customization. Amazon DataZone was enhanced with AI features to improve the handling of organizational data. The AWS Supply Chain service received updates to help with forecasting, inventory management and supplier communications. Amazon Bedrock, released earlier in the year, now includes access to advanced AI models from leading AI companies. A new storage class, Amazon S3 Express One Zone, was introduced for rapid data access needs. Additionally, Amazon Redshift received upgrades to improve query performance. These developments reflect AWS's focus on integrating AI and optimizing data management and storage capabilities.

Recent articles have highlighted Microsoft's role in the AI renaissance, one focusing on the launch of Copilot as covered by my colleagues at Moor Insights & Strategy, and another analyzing the competitive dynamics in the AI industry. Additionally, Microsoft has expanded its data platform capabilities by integrating AI into Fabric, a comprehensive analytics solution. This suite includes a range of services including a data lake, data engineering and data integration, all conveniently centralized in one location. In collaboration, Oracle and Microsoft have partnered to make Oracle Database available on the Azure platform, showcasing a strategic move in cloud computing and database management.

Automating Data Protection And Governance

With the growing importance of data privacy and security, AI increasingly enables the automation of data governance, compliance and cybersecurity processes, reducing the need for manual oversight and intervention. This trend comes in response to the rise in incidents of data breaches and cyberattacks. AI-driven systems have become more proficient at monitoring data usage, ensuring adherence to legal standards and identifying potential security or compliance issues. This makes them a better option than traditional manual approaches for ensuring data safety and compliance.

Security is not only about protecting data but also about ensuring it can recover quickly from any disruptions, a quality known as data resilience. This resilience has become a key part of security strategies for forward-thinking businesses. Veeam emphasized “Radical Resilience” when it rolled out a new data protection initiative focused on better products, improved service and testing, continuous releases and greater accountability. Meanwhile, Rubrik introduced its security cloud, which focuses on data protection, threat analytics, security posture and cyber recovery. Cohesity, which specializes in AI-powered data security and management, is now offering features such as immutable backup snapshots and AI-driven threat detection; in 2023, it also unveiled a top-flight CEO advisory council to influence strategic decisions. Commvault has incorporated AI into its services, offering a new product that combines its SaaS and software data protection into one platform.

LogicMonitor upgraded its platform for monitoring and observability to include support for hybrid IT infrastructures. This enhancement allows for better monitoring across an organization's diverse IT environments. Additionally, Cisco has announced its intention to acquire Splunk. This acquisition will integrate Splunk's expertise in areas such as security information and event management, ransomware tools, industrial IoT vulnerability alerting, user behavior analytics and orchestration and digital experience monitoring that includes visibility into the performance of the underlying infrastructure.

Key Changes for Database Technology

Advancements in AI and ML integration are making database technology more intuitive and efficient. Oracle Database 23c features AI Vector Search, which simplifies interactions with data by using ML to identify similar objects in datasets. Oracle also introduced the Fusion Data Intelligence Platform, which combines data, analytics, AI models and apps to provide a comprehensive view of various business aspects. The platform also employs AI/ML models to automate tasks including data categorization, anomaly detection, predictive analytics for forecasting and customer segmentation, workflow optimization and robotic process automation.

In my previous discussion about IBM's partnership with AWS, a major highlight is the integration of Amazon Relational Database Service with IBM Db2. This collaboration brings a fully managed Db2 database engine to AWS's infrastructure, offering scalability and various storage options. The partnership between AWS and IBM will likely grow as the trend of companies forming more integrated and significant ecosystems continues.

Database technology also evolved with MongoDB queryable encryption features for continuous data content concealment. MongoDB Atlas Vector Search now also integrates with Amazon Bedrock, which enables developers to deploy generative AI applications on AWS more effectively. It’s also notable that Couchbase announced Capella iQ, which integrates generative AI technologies that use natural language processing to automatically create sample code, data sets and even unit tests. By doing this, the tool is streamlining the development process, enabling developers to focus more on high-level tasks rather than the nitty-gritty of code writing.

Leveraging Data Integration Platforms

Generative AI technologies have also improved data integration capabilities by using historical data, analyses of trends, customer behaviors and market dynamics. This advancement is particularly influential in the finance, retail and healthcare sectors, where predictive insights are critical for strategic and operational decisions. There's been a shift towards adopting data lake house architectures, which combine the features of data lakes and data warehouses to help meet the challenges of handling large, varied data types and formats, providing both scalability and efficient management. This evolution in data architecture caters to the growing complexity and volume of data in various industries.

Integrating various data sources is crucial for many companies to enhance their business operations. Software AG has introduced Super iPaaS, an evolution of the traditional integration platform as a service (iPaaS). This advanced platform is AI-enabled and designed to integrate hybrid environments, offering expansive integration capabilities. Cloudera has also made strides with new data management features that incorporate generative AI, enabling the use of unstructured data both on-premises and in cloud environments. Its hybrid approach effectively consolidates client data for better management. Informatica's intelligent data management cloud platform integrates AI and automation tools, streamlining the process of collecting, integrating, cleaning and analyzing data from diverse sources and formats. This creates an accessible data repository that benefits business intelligence and analytics.

That’s a Wrap!

In my collaborations throughout the year with various companies, one key theme has emerged in this AI-driven era – data has become even more fundamentally important for businesses. It's clear that the success of AI heavily relies on the quality of the data it uses, and AI models are effective only when the data they process is accurate, relevant and unbiased.

For example, in applications such as CRM or supply chain optimization, outcomes are directly influenced by the data’s integrity. Instances where AI failed to meet expectations could often be traced to poor data quality, whether it was incomplete, outdated or biased. This year has highlighted the necessity of not just collecting large amounts of data but ensuring its quality and relevance. Real-world experience underscores the need for strict data governance and the implementation of systems that guarantee data accuracy and fairness, all of which are essential for the effective use of AI in business.

As AI technology advances and data quality improves, the use of generative AI in understanding and engaging with customers is becoming ever more prominent. Backed by good data management, this enhances the customer experience by making the customer journey more personalized and informative. It allows businesses to gain valuable insights from customer interactions, helping them continuously refine and improve their offerings and customer relations. I expect this trend to grow, further emphasizing the role of AI in customer engagement and shaping business strategies. In fact, this symbiotic relationship between AI-driven personalization and customer engagement is becoming a cornerstone of not only data management strategy but modern business strategy overall, significantly impacting how companies connect with their customers.

Wrapping up, it's evident that the emphasis on data quality is critical for improving AI's performance. Data management, cloud services, data protection and governance, databases, data integration and intelligent platforms have all significantly contributed to the advancement of AI. In 2024, I expect we’ll see even more emphasis on ensuring the accuracy and relevance of data so that AI can provide dependable insights.

Sun, 31 Dec 2023 09:37:00 -0600 Robert Kramer
As Sales Approach $1B, Splunk Makes Its Pitch To The Channel

Jim Kinney, president and CEO of Indianapolis-based solution provider Kinney Group, makes a bold observation about Splunk, the big data software developer that his company has partnered with for four years.

Splunk and its technology "sure has the feel of being on the front of something gargantuan," Kinney says. "This has the feel of VMware back in '05 or '06."

Splunk, founded in 2003, is hardly a startup. But the developer of operational intelligence software for instantly searching, monitoring and analyzing machine-generated data is getting more attention these days beyond its core IT operations and IT security customer base. Splunk's platform is finding its way into an increasingly broad range of business analytics and big data applications, and the company is positioned to be a key technology player in the nascent Internet of Things arena.

[Related: Splunk Expands Machine-Learning Capabilities Of Its Operational Intelligence Software]

It's also attracting more attention from solution providers as the company, after relying primarily on direct sales for the first decade-plus of its existence, has been ramping up its channel efforts in the last two years.

The channel should take notice. Splunk (whose name comes from the cave exploration term "spelunking") is closing in on $1 billion in annual revenue, having recorded 43 percent sales growth in the first half of fiscal 2017 to $398.7 million. Analysts have put the vendor's total potential market at $46 billion to $58 billion, and observers say the company's sales could hit $5 billion as soon as 2020.

The San Francisco-based company's customer base grew from approximately 10,000 as of July 31, 2015, to more than 12,000 on July 31 of this year, according to a recent filing with the U.S. Securities and Exchange Commission.

CEO Doug Merritt, speaking at Splunk's .conf2016 customer and partner event in Orlando late last month, said he thinks "at least half" of the company's sales should ultimately go through the channel.

"When I walked in we were [following] a more direct-centric model," said Merritt, who joined Splunk in May 2014 as senior vice president of field operations and was named the president and CEO in November 2015. "I came in the door jumping up and down about the channel, about partners in general. It felt like an opportunity for growth for us."

Merritt, both at .conf2016 and in an exclusive interview with CRN, acknowledged that Splunk was slow to leverage the channel. "Splunk has been difficult for people to understand," he said, and recruiting resellers is a challenge "when you're an early pioneer, and you're evangelizing a new [technology] category."

Splunk does not disclose what percentage of its sales go through the channel today or how many channel partners it works with. The recent SEC filing, for the company's second fiscal quarter ended July 31, said the company "expect[s] that sales through channel partners in all regions will continue to grow as a portion of our revenues for the foreseeable future."

Splunk's software was initially developed to collect and analyze operational log data from IT systems for system administration tasks. But Splunk and its more forward-thinking customers have come to realize the technology can be used to collect and analyze almost any kind of streaming real-time data, from IT operations and IT security systems, to data produced by machines on a factory floor, to sensors that make up an Internet of Things network. That positions Splunk to play a pivotal role in the burgeoning big data market.

Merritt and other Splunk executives make it clear that as the Splunk Enterprise flagship product evolves from a toolset for programmers into a data management platform with a broad range of use cases, the channel will play a critical role. The channel will provide both "feet on the street" for the sales scalability that wouldn't be possible with the vendor's direct sales force, and the vertical industry and domain expertise needed as Splunk's software is used for new applications.

Merritt, in a press briefing at .conf2016, said Splunk's growth depends on getting the platform into hundreds of thousands of accounts, "and the channel, in particular, is going to be incredibly important for us to get there."

"We look at how many [sales] people we can hire and train [for] carrying Splunk to our customers, versus how many people the channel has," Merritt said. "[If] we enable them properly, there is so much more capacity in the channel than there is [inside] Splunk."

Splunk CTO Snehal Antani, speaking at the same event, said partners would be especially critical as the use of Splunk's software grows beyond its core IT DevOps and IT security applications into broader business analytics and Internet of Things use cases.

"The channel and partners become really important in IoT and business analytics," said Antani, the former GE Capital CIO who was named CTO in May 2015. "You need to have retail domain expertise, or healthcare domain expertise, or financial domain expertise to really get the value out of that data. We've got the enabling technology, but [partners have] got the domain expertise.

"For us, getting the channel right is important for [sales] scale. But getting the channel right is especially important for us to move into other types of use cases that are much more domain-specific," Antani said.

So Splunk understands its need for the channel. What does the channel say?

Trace3, an Irvine, California-based solution provider focused on big data and cloud technologies, has worked with Splunk for five years and built a Splunk practice that generated $7 million in revenue in 2014 and $14 million last year. The company, an Elite level partner, was Splunk's 2015 North American "Partner of the Year."

"I think Splunk is certainly still learning and developing themselves as a channel company," said John Ansett, Trace3's director of operational intelligence, of the company's Splunk relationship. Four or five years ago "they were very much a direct company" with some conflict with partners at the sales level, he said. "That's absolutely changed in the last 18 to 24 months."

"Now I see them using the channel and leveraging the partners a lot more than in the past. They recognize that their ability to scale is going to be through the channel and for them to get there they recognize that partners are really the way to get there," Ansett said.

Other partners also paint a portrait of a company in transition. "Are there bumps in the road as they take on more of a channel-oriented model? Sure," said Jim Kinney. "This is a company of fantastic people that are just incredibly passionate about what they are doing. And they treat their partners really, really well. That has meant the world to us."

"From a listening standpoint and ability to work with, they are as good as any vendor partnership I've had," said Jeff Swann, director of solutions architecture at OnX Enterprise Solutions, a Splunk Elite partner and North American solution provider headquartered in Toronto and New York. "They're very interested in working with their partners," said Swann, who works in OnX's Mayfield, Ohio, office and manages OnX's relationship with Splunk and sits on the company's partner advisory council.

Partners generally give Splunk good – but not great – grades for the nuts and bolts of its Partner+ channel program. Swann says the partner portal and other tools are "very good" and the marketing materials and content are "very easy to use and modify."

Kinney said he'd like to see more dedicated resources to help partners hire and train more engineers with Splunk expertise for development and customer support. Trace3's Ansett said the partner program lags other vendors in such areas as rebates and revenue-commit offerings.

Splunk's Merritt, at the press conference, pointed to the partner portal and deal registration systems the company assembled and the channel neutrality policy put in place last year as signs of progress, but he acknowledged that those steps are just a start.

The company's channel efforts may have suffered a setback in February when Emilio Umeoka, vice president of global alliances and channels, left to become head of education sales at Apple.

In March the company hired Susan St. Ledger, Salesforce's chief revenue officer, as Splunk's new CRO, overseeing all revenue-generating and customer-facing operations. In July, Splunk hired Cheryln Chin, a senior vice president at Good Technology, to replace Umeoka as vice president of global partners. Aldo Dossola is area vice president of North America partner sales, reporting to Chin.

In April Splunk hired Brooke Cunningham, a highly respected channel marketing executive with business analytics software developer Qlik, as area vice president of worldwide partner programs and operations.

"I saw an opportunity to come and really help define that partner experience," Cunningham said in an interview before .conf2016, noting that she has the job of taking Splunk's partner program to the next level. "We're really diving into how we continue to mature the Partner+ program," she said, specifically citing "investments in infrastructure" that are in the works for the partner portal and other support systems.

At .conf2016 Splunk announced a new licensing initiative that, starting Nov. 1, will provide free licenses for test and development purposes. Partners said that move would make it easier for partners to help customers expand their use of Splunk by giving them more opportunities to experiment with the software.

Swann pointed to Merritt's plans to expand education and training opportunities for partners – including free online training – and efforts to grow the number of Splunk-certified developers and engineers as promising moves to expand the overall ecosystem.

"They're doing all the right things," said Ansett at Trace3. "They're putting in the right resources [and] they have the right leadership in place. And I'm starting to see them go 'partner-first.'"

But it's the potential of Splunk's software that really gets partners excited.

"Security is certainly the biggest growth area," said Ansett, although IT operations applications now account for the biggest part of Trace3's Splunk-related revenue. Sales for Internet of Things applications are small, he said, but growing.

Splunk is key to OnX's security intelligence, operational analytics and DevOps practices. Swann said a successful strategy is getting Splunk into a customer for a specific application, then expanding the sales to other areas once the customer understands Splunk's capabilities.

"For our organization, it makes us more sticky," he said. "Once we get in, we find lots of other use cases."

Splunk is playing an increasingly important role in two of Kinney Group's core practices: analytics and next-generation data centers. Splunk is now the primary platform for its business analytics services, as with a predictive analytics project Kinney recently developed for a medical equipment management company to better anticipate equipment failures, said Laura Vetter, Kinney's vice president of analytics. Splunk's software was also a component of a major PCI (Payment Card Industry) data security project Kinney developed for a leading IT hosting provider.

Last month Splunk debuted Splunk Enterprise 6.5 with expanded machine learning technology and new features that improved its advanced analytics capabilities. New integrations with Hadoop and simpler data preparation tools helped reduce the product's total cost of ownership – a significant point according to one partner who told CRN that the market perceives Splunk's software to be expensive.

As to his case of VMware déjà vu, Jim Kinney says that in VMware's early days top managers at businesses that implemented the vendor's virtualization software didn't initially grasp the technology's potential. Once they did, VMware sales exploded. Kinney thinks Splunk is reaching the same tipping point as awareness of what the company's software can do expands beyond the data center.

"Our company has made a pretty significant financial wager [on Splunk]," he said, "and it absolutely has paid off and is providing returns."

Thu, 20 Oct 2016 08:33:00 -0500
Splunk Unveils New AI Offerings And Edge Hub, Strikes ‘Digital Resilience’ Alliance With Microsoft

At the Splunk .conf23 event Tuesday, Splunk expanded the SecOps and ITOps functionality of its flagship unified security and observability platform and debuted a collection of AI-powered tools to boost the system’s detection, investigation and response capabilities.


“Digital resilience” is the key theme at Splunk’s .conf23 this week, and is the underlying focus of several new technology unveilings at the event including the new Splunk AI, the new Splunk Edge Hub operational technology, and a series of innovations around the flagship Splunk platform.

Digital resilience is also the goal behind a new Splunk-Microsoft partnership, also announced this week, through which the two companies will build Splunk’s enterprise security and observability software on the Azure cloud platform.

And for the first time Splunk’s products, including Splunk Enterprise, Splunk Enterprise Security and Splunk IT Service Intelligence, will be available for purchase through the Microsoft Azure Marketplace, the companies said.

[Related: Splunk Hires Microsoft Exec Gretchen O’Hara As Its New Channel Chief]

“We’re super excited about [the] new capabilities, including AI capabilities, that we think will be impactful not only to our customers but also to the partner community,” Splunk president and CEO Gary Steele said in a pre-.conf23 interview with CRN.

Digital resilience, the core mission for Splunk's unified security and observability platform, means protecting digital workflows and workloads – often part of larger digital transformation initiatives – from cyberattacks while maintaining the performance of those processes and the IT that supports them.

"This digital resilience message resonates broadly, and most customers want to talk about it," Steele said in the interview. "Many customers need some form of assistance that would come from partners. These are big initiatives in companies today. I think the resilience side of things is actually a very high priority. It comes in the form of improving security posture, it comes in the form of application uptime, application visibility, what's really happening."

The new Splunk AI is a collection of AI-powered software that will enhance the functionality and use of the core Splunk platform. Splunk AI Assistant, the first product in the Splunk AI set, will provide a natural language interface to the Splunk system that provides a “chat” experience and can be used to explain or author Splunk Processing Language queries.

Splunk AI Assistant, now in preview, will make it easier for users to engage with the Splunk system and search for data using natural language, said Min Wang, Splunk CTO, during a .conf23 keynote Tuesday.
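To make the idea concrete (this example is not from Splunk's announcement), a natural-language prompt such as "show the top five source IPs with failed SSH logins over the last day" might correspond to an SPL query along these lines, where the index and sourcetype names are hypothetical:

```
index=security sourcetype=linux_secure "Failed password" earliest=-24h
| stats count AS failures BY src_ip
| sort - failures
| head 5
```

The value of an assistant that can both author and explain such queries is that users unfamiliar with SPL's pipe-based syntax can still search and aggregate their data.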

“We all know AI is rapidly transforming our industry and opening up new opportunities,” said Wang, who just joined Splunk in April. “As an expert in security and observability, we have the best domain-specific insight derived from real-world experience. With these insights we can build the best AI capabilities that are fine-tuned for security and observability and tightly integrated with Splunk.”

Wang said that going forward Splunk will embed Splunk AI Assistant into other workflows, such as security detection and investigation.

AI is also a component of the new Splunk App for Anomaly Detection, used by SecOps, ITOps and engineering teams, with AI-assisted workflow to simplify and automate anomaly detection within an environment.

The new release of Splunk Machine Learning Toolkit (MLTK), 5.4, builds on Splunk AI with an ability to bring externally trained models into Splunk. And Splunk App for Data Science and Deep Learning (DSDL) 5.1 includes two AI assistants that allow customers to leverage large language models to build and train models with domain-specific data to support natural language processing.

“The launch of Splunk AI really reaffirms what is a history and a commitment we have around innovation in search and analysis of large volumes of data,” said Tom Casey, Splunk senior vice president of products and technology, in a pre-.conf23 briefing. “And our approach is to bake intelligent AI assistants into the everyday tasks our users are performing. We think Splunk is a trusted partner for mission-critical workloads.”

Of particular interest to the channel is the debut of Splunk Edge Hub, a hardware and software device that the company says simplifies the ingestion and analysis of data generated by operational technology including sensors, IoT devices and industrial equipment. Edge Hub provides more complete visibility across IT and OT environments in such industries as manufacturing and energy management by streaming previously hard-to-access data directly into the Splunk platform.

“Splunk Edge Hub is really pretty groundbreaking,” Casey said. “It breaks down barriers and silos that historically made it difficult to extract and integrate data from your operating environment. And with some new abilities that it provides, it’s much easier to access that data, integrate it and gain visibility to it in a common way using the normal Splunk tools and dashboards that people have in their environments already.”

The Splunk Edge Hub hardware will be sold and supported exclusively through Splunk channel partners, Casey said. Edge Hub includes a Splunk license and partners can develop vertical industry solutions that incorporate Edge Hub and even build custom solutions for customers.

“We think this is a great opportunity for the experts in our partner community, like Accenture and others, to go out and build more technology and more resilience into the practices that they have around energy, manufacturing, et cetera,” Casey said.

Splunk has been working with and training some partners over the last year in advance of the Edge Hub launch including Accenture and Grey Matter. “What we’re really trying to do is enable an ecosystem here to be super effective with the Edge Hub device,” the executive added. Edge Hub is generally available in the U.S. with plans to extend availability to EMEA and APAC at a later date.

The company also unveiled a slew of new capabilities and enhancements, most geared toward SecOps, ITOps and engineering teams, either within the Splunk Cloud Platform and Splunk Enterprise 9.1 or provided as add-on software.

The new Splunk Attack Analyzer helps security teams automate the analysis of malware and phishing attacks to identify complex attack techniques intended to evade detection. OpenTelemetry Collector is a technical add-on to help Splunk Observability Cloud users capture metric and trace data. And the new Unified Identity offering allows ITOps personnel and engineers to access Splunk Cloud Platform and Splunk Observability Cloud with one user identity.

Tue, 18 Jul 2023 09:43:00 -0500
Wall Street is gearing up for an AI shopping spree. Meet 11 bankers poised to come out on top.

Alan Bressers and Brandon Hightower, the founders of Axom Partners


Bressers' relevant deal experience: Spacemaker's $240 million sale to Autodesk, NXP's proposed $47 billion sale to Qualcomm, and Linear Technology's $15 billion sale to Analog Devices.

Hightower's relevant deal experience: Magento's $1.64 billion sale to Adobe; Afterpay's $27.96 billion sale to Square, now known as Block; and Intuit's $7.1 billion acquisition of Credit Karma.

Axom Partners is one of the latest tech-centered M&A advisory firms on the Street and appears to be the first to make AI-related dealmaking the fulcrum of its business. Three alumni from the tech-advisory firm Qatalyst Partners — Bressers, Hightower, and the attorney Ross Weiner — started the firm in September and are betting on AI as a total game changer. They even used the chatbot Bard to help them come up with the name "Axom."

Bressers and Hightower told BI that Axom will focus on earlier-stage companies that rivals may view as too small, not just companies in the artificial-intelligence sector.

"We saw a chance in starting Axom Partners to service clients at transaction sizes that Qatalyst has outgrown," Hightower told BI. "We're nimble and we're building our brand to service the most innovative companies as they scale."

"We don't want to just be software bankers and AI as a part of software. We want to be able to say we understand AI down to the chip level, up to the application level, and even the impact on consumer users," Bressers said.

Bressers grew up in Pittsburgh, where his summer job in high school was giving tours of the USS Requin, a World War II-era submarine parked in the Ohio River. At 6'2", he would've been disqualified from serving on the vessel and said he had enough bumps on the head to understand why.

He graduated from Wharton and the University of Pennsylvania, where he studied economics and engineering. He started his career at Credit Suisse's San Francisco office in 2006 before joining Qatalyst in 2009. He focused on semiconductor companies there, which he says conveniently dovetailed into covering AI. He left Qatalyst in 2021. He lives in San Francisco with his wife and their 3-year-old son.

He said he's seeing companies quietly readying themselves to come out on top, including by teaming up with the companies they see as winners down the road, such as Microsoft and ChatGPT.

"They're making alliances today in order to build their companies," he said. "Those alliances will have some long-lasting impacts on them from an M&A perspective."

Hightower studied business and finance at Brigham Young University. He worked at the boutique advisory firm GCA Savvian Corporation before joining Qatalyst in 2014, where he covered consumer internet, fintech, and software companies before leaving to start Axom earlier this year.

His Christmas wish list for the next wave of AI includes an AI-supported way for marketers to reach customers better.

"If I were an Adobe or any of the marketing cloud-related companies, what I'd want under the tree is an AI solution that takes all of my customer data and helps me to really target my customers in a more automated and more cost-effective way so that I'm not spamming the wrong people and spending aimlessly on user acquisition," he said.

Hightower and his wife, Laura, welcomed their fifth child this past summer. When the big moment arrived, the couple tried to head out the door for the hospital, but the baby had other plans.

"My wife said the baby was coming, and I said yes, let's get to the hospital. I didn't realize she meant right now! You don't have time to think in those situations, so I just played catcher, and we delivered our little baby girl right there in our hallway, with my wife standing upright, only 15 minutes from the first contraction."

Hightower used zip ties and kitchen shears to cut the umbilical cord. Luckily, his wife is a nurse, so they never went to the hospital after checking in with the doctor by phone.

Bank of America's Neil Kell

Neil Kell, the chair and global head of TMT equity-capital markets at Bank of America.

Relevant deal experience: $500 million sale of Dynatrace stock for Thoma Bravo, Arm's IPO, Mobileye's IPO, and Intel's $1.62 billion sale of its stake in Mobileye.

As the chair and global head of TMT — or technology, media, and telecom — equity-capital markets for Bank of America, Kell decides which clients need more capital to develop AI technologies and which companies would benefit from buying new IP or products.

He anticipates the AI craze to kick off a wave of fundraising activity for banks as companies across various industries — such as healthcare, aerospace, defense, and manufacturing — seek to build or acquire AI solutions.

"I do think there's going to be some very tangible M&A that's going to evolve because of this — pretty sizable and significant pick-up in capital formation. And it's all within this concept of, 'We've spent 20 years in various forms of artificial intelligence and technology. How do we monetize it now?'" Kell told BI. "That's where the Street is looking. It's all beginning to converge, so it's a pretty dynamic period."

Kell, who has been at Bank of America for 25 years, has helped clients around the world raise more than $150 billion via public and private equity financing. When not in his Palo Alto office raising money for clients, Kell likes to play the bagpipes, an instrument he picked up in childhood after his doctor suggested a wind instrument to help him recover from a lung injury. Kell has traveled all over the world playing the bagpipes, but one of his favorite places to play is Scotland, where he competed as a teenager.

Kell says Bank of America has dedicated bankers positioned "where there is a nexus of AI development," such as Silicon Valley and budding markets such as Austin and Denver. But he expects the bank's AI coverage to take many shapes over the coming years as its teams adapt to the constantly evolving landscape. And he expects the bank to keep investing in AI banking.

"This is something that's here to stay and something that's likely to grow and become a large and tangible part of our business," Kell said.

Citi's Sirisha Kadamalakalva

Sirisha Kadamalakalva, the head of artificial-intelligence investment banking at Citi.

Relevant deal experience: Clients include Alteryx, Cloudera, Confluent, Coveo, Elastic, Introhive, Klaviyo, and MuleSoft.

Kadamalakalva joined Citi in January to head the bank's artificial-intelligence investment banking efforts — and it's been a whirlwind ever since.

The sector is advancing so fast it's "dizzying," she said in an interview. She equated one year in AI to five to 10 years in other sectors.

"We are so early in the cycle that evaluating winners and losers is almost an everyday exercise," she said.

Because AI impacts other sectors and subsectors within technology, Kadamalakalva often works with bankers without tech experience to bring her expertise to their respective industries.

Before joining Citi, she worked as a software-investment-banking unit leader at Bank of America for nearly a dozen years. But she says working with AI companies is far different from working with traditional software.

"There's a lot of strategic challenges that these companies deal with," she said. "It's not just about being a banker, but a strategic partner."

To that end, Kadamalakalva tries to build relationships with budding AI companies — some of which will hopefully become her clients and mint her millions of dollars — early on.

Looking to 2024, Kadamalakalva said tangential technologies that directly impact AI's growth — or, put another way, the ingredients that add up to make AI possible — will shape AI dealmaking. That's why Kadamalakalva is monitoring tech trends such as the shortage of GPUs, which are special and expensive chips critical to training AI models.

Goldman Sachs' Jung Min

Jung Min, a partner at Goldman Sachs.

Relevant deal experience: Microsoft's $69 billion acquisition of Activision Blizzard; McAfee's $14 billion sale to investors who took it private; and Intuit's $8.1 billion acquisition of Credit Karma.

To best navigate AI M&A in 2024, bankers will need to be more like they were in 1990, Min, the co-COO of Goldman Sachs' technology, media, and telecom division, said.

"In the 90s, people would've said, 'I'm a tech banker.' They would not have said, 'I'm a software banker,' or 'I'm a semiconductor banker.' But we kind of need to go back to doing that," Min told BI.

That's because AI touches so many different layers of the tech stack, or the infrastructure that makes it possible to develop and run AI applications. That means bankers specializing in software companies are melding more with semiconductor bankers. And other bankers, including Min, are working to cut across the different layers and advise on all aspects of tech.

For his part, Min covers the hyperscalers, or the large cloud companies that have made names for themselves developing front-end software services. But the same cloud companies have also started developing their own GPU chips, which are in short supply but are essential for training the big models behind the latest AI tools. Min also covers large semiconductor companies building GPUs "because I need to know what's going on across the different stacks or different layers of the tech stack," he said.

That approach appears to work for Goldman Sachs, which played a critical role in Microsoft's acquisition of Nuance Communications in April 2021 for $19.7 billion, S&P Global Market Intelligence data shows.

It's still early in the AI M&A cycle, but Min sees AI deal activity accelerating as more companies realize their AI shortcomings. That could lead to tech companies looking to acquire one specific layer of the tech stack — such as data analytics — needing to acquire other parts, such as data storage or computing.

Min also expects to see non-tech companies with strong customer bases and lots of data get into the AI game. "They're going out to buy the tech companies that do have those AI capabilities, so that they can create those products and they can monetize the value," Min said.

JPMorgan's Madhu Namburi

Madhu Namburi, the global head of technology investment banking at JPMorgan.

Relevant deal experience: $69 billion sale of VMware to Broadcom; Qualtrics' $12.5 billion sale to Silver Lake, which took it private; $18.5 billion sale of Worldpay to GTCR.

A JPMorgan spokesperson said that Namburi sets the overall strategy for JPMorgan's technology practice. That includes capital-allocation decisions and client prioritization for corporate clients across software, fintech, and other tech sectors.

According to his LinkedIn, Namburi, who declined to be interviewed for this list, has a bachelor's degree in mechanical engineering from Delhi College of Engineering. He joined JPMorgan in 2000 after earning his MBA from Pennsylvania State University.

Last year, Namburi spoke optimistically about the growth of the tech sector in a video JPMorgan posted to its Facebook page. He said that right now, five of the largest companies in the world are tech companies. In 10 years, he expects that number to get bigger, not smaller.

"This rate of value creation is going to only continue," he said in the video.

When it comes to AI and data, the tech world is "only just scratching the surface" in terms of capability and growth, he said in a separate video posted to X.

The bank said he has experience in M&A, growth IPOs, LBOs, debt and equity, and equity-linked financing across large and smaller growth-oriented technology companies.

He also manages a tech venture-investment fund that has invested in various emerging companies within the technology sector focused on AI, data analytics, enterprise infrastructure, and disruptive business models, JPMorgan said in an email.

After spending 15 years in JPMorgan's New York office, he moved to the San Francisco Bay Area about eight years ago. The firm said he lives in Hillsborough with his wife Radhika, their 18-year-old daughter Ila, and their 15-year-old son Niam.

Lazard's John Gnuse

John Gnuse, the managing director at Lazard.

Relevant deal experience: Google's $2.6 billion acquisition of the data-analytics business Looker in 2020; Intel's $2 billion acquisition of the AI-chip company Habana Labs in 2019.

Gnuse is a Lazard lifer — he joined the firm in 1992 as an investment-banking analyst and has been with the company ever since. The only time he spent away was in the mid-90s for his two-year grad-school program at the University of Cambridge, where he studied history and philosophy.

Gnuse has covered the tech sector at Lazard through decades of changes, starting in New York and then in San Francisco, where he moved in 1999 during the dot-com bubble. In recent years, his team has represented large-cap technology companies, including Google, IBM, and Intel, at the center of the AI conversation.

He said he thinks the winners of the AI frenzy will be the companies that integrate AI capabilities into existing tools and workflows. "Application providers that can embrace these capabilities quickly and provide incremental value to their end users have the potential to capture a big part of the value," he said. "And ones that don't could face growing questions from investors about the threats of AI to their core business."

Gnuse said he's seeing a lot of activity in application areas where generative-AI capabilities have already proven relevant, including code development and customer support, and in fields such as law and education, where paperwork is necessary but tedious.

"These are areas where outputs often follow a very standard form or syntax, and you can train models to mimic that pattern," he said — companies with a relevant user base in these domains could be where we see early activity.

Gnuse is no stranger to academia — he majored in physics and philosophy at Yale as an undergrad, has a master's degree in history from the University of Cambridge, and an MBA from INSEAD Business School — and says it's important to be in a sector you love to study. His career advice to the next generation of dealmakers is to align themselves to a sector or subsector they're passionate about like he did with technology.

"Thirty years ago, many top bankers could afford to be generalists and somewhat sector agnostic, but today, clients really value advisors who intimately understand the nuances of their business and sector-specific strategic issues," he said. "Find a segment that you love to research and study. I'm blessed that I get to work in tech because there are fascinating developments every day."

Morgan Stanley's David Chen

Dave Chen, the head of global technology investment banking at Morgan Stanley.
Morgan Stanley

Relevant deal experience: IPOs for Barracuda Networks and Salesforce; $28 billion sale of Splunk to Cisco; sale of Archer Technologies to Cinven; Thales' $3.6 billion acquisition of Imperva from Thoma Bravo.

Chen started his career at UBS Wealth Management in 1998 but has been at Morgan Stanley since 2002. He traces his interest in the sector back to his college days.

"From the time that I was an undergrad at Stanford, I have been hooked on technology," Chen told BI via email.

That early interest has held strong through his 24-year career. Now at the helm of Morgan Stanley's technology-investment-banking practice in Menlo Park, California, Chen has worked on some of the most formative deals in the AI space yet — including as an advisor to Splunk in its historic $28 billion sale to Cisco.

"I have had a front-row seat to the transformation of the technology industry while advising industry stalwarts over the years. The dynamism and innovation of tech have kept me deeply interested over the years, and I anticipate this to continue to be the case for many more to come."

If there's a road map for future deals we can glean from the historic Splunk acquisition, it's that 2024 deals will value two things: data and talent.

"AI makes the data a company has even more powerful," he said. "So far, most are looking to take data and talent and then create an in-house solution or partner with other providers instead of acquiring AI-based products directly."

The way he sees it, M&A in the AI space has been muted compared to the surge in interest for several reasons. First, valuations are high. Second, large companies under pressure to move quickly are considering other solutions, including building their tech in-house or partnering with a large language model provider. He expects that trend to continue before unleashing a torrent of activity.

"I'm expecting a slow burn in the first half of the year with companies still determining their AI product strategy and then a boom of activity as companies gain conviction in their plans and as macro pressures are reduced," he said. "Acquisitions will mostly be about accelerating product roadmaps, and I think 2025 would be the year that we see some very exciting IPOs in this sector."

AI will not only influence deals, he said, but also impact the way tech-banker bosses, including Chen, run their teams.

"AI copilots and AI research bots will infuse everything we do over time," he said. "From creating presentation materials to helping our bankers conduct market research, this will be more transformative to our work than search engines have been."

Qatalyst's Rob Chisholm

Rob Chisholm, a partner at Qatalyst Partners.
Qatalyst Partners

Relevant deal experience: Tableau's $15.7 billion sale to Salesforce; Qualtrics' $12.5 billion sale to Silver Lake and CPP Investments; Zendesk's $10.2 billion sale to Hellman & Friedman and Permira.

Chisholm thinks of AI as a total game changer for the industry, the likes of which we haven't seen since Apple rolled out its groundbreaking smartphone.

"People talk about the 'iPhone moment' for gen AI. There really is this sort of pivot moment in history that we're all going to look back on and see as the moment when our world changed in, frankly, very unpredictable ways, but mostly in exciting ways."

Chisholm, who joined Qatalyst in September, helps lead the enterprise-software group and spearheads the firm's AI efforts. He says the speed and disruptiveness of AI have made liaising between companies a bigger part of his job.

"Almost overnight, all of these people — both the disruptive people starting new companies and the established, successful large software and broader technology companies — they all wanted to be talking to each other," he said.

He was previously a partner at Goldman Sachs, where he spent five years on its TMT investment-banking team.

He's the first to admit that his Wall Street story is unusual. Hailing from a small town in Nova Scotia, he went to Princeton on a hockey scholarship, but he left in favor of a small liberal-arts school in Vermont, Middlebury College, to study environmental policy. At the time, everyone told him he was making a mistake.

After graduating, Chisholm worked at an environmental nonprofit in Boston, but after finding the work "a little bit lower intensity than suited my personality," he took a job at the investment-banking boutique AGC Partners, later working at Deutsche Bank and then Citi before ultimately ending up at Goldman Sachs in 2018.

He clearly remembers when he knew Qatalyst would be the right move during a conversation with the firm's famous founder.

"When I interviewed with Frank Quattrone, I asked him, 'What will dictate whether I'm successful at Qatalyst or not?' And he said, 'Come to Qatalyst if you want to be an active participant in how the technology industry changes and not just a passive agent of what is happening around you.' I almost jumped out of my chair when he said that to me."

"Of all the things that investment bankers do, M&A is by far the most interesting to me," he said. "And that is everything we do at Qatalyst. You can wake up on a Monday morning, and the world will be different than it was Sunday night when that deal you were working on for six months or a year gets announced."

Tidal Partners' David Handler and David Neequaye

David Handler, left, and David Neequaye, the cofounders of Tidal Partners.
Tidal Partners

Handler's relevant deal experience: Cisco Systems' $28 billion acquisition of Splunk; ServiceNow's acquisition of G2K; Bloom Energy's convertible-notes offerings.

Neequaye's relevant deal experience: Mixpanel's $200 million in Series C funding from Bain Capital; Motorola Inc.'s $9 billion spinoff of its Mobility & Connected Home businesses; Motorola Mobility on its $12 billion sale to Google.

Starting a new business is risky, especially in the stodgy world of investment banking. But Handler and Neequaye knew they could leverage their years of dealmaking experience.

"The relationships and trust we've built over the last two decades and our proven track record of industry-defining transactions serve to differentiate the value Tidal will bring to our clients," Handler said in an August 2022 release announcing the launch of Tidal Partners.

Despite being one of the newest M&A firms in the space, Tidal Partners is already making a name for itself in the AI landscape, advising on Cisco Systems' $28 billion acquisition of Splunk and ServiceNow's acquisition of the AI platform G2K.

"We've known David (Handler) and his partner David (Neequaye) for a very long time," Chuck Robbins, the CEO of Cisco, told Reuters in September after announcing the pending acquisition of Splunk. "They did a great job for us."

Handler and Neequaye, who declined to be interviewed for this story, have a long history together. Before Tidal, they were founding members of the technology practice at Centerview Partners, where they worked for 14 years, according to their LinkedIn profiles.

Following his 2022 departure, Handler sued Centerview over a pay dispute. Handler and Neequaye also overlapped at UBS, where Handler was the cohead of technology investment banking, and Neequaye was the director of technology investment banking, according to their LinkedIn profiles. Both men also worked at Bear Stearns in the early 2000s.

In launching Tidal, Handler said: "The tech landscape is more dynamic and evolving faster than ever. We see an opportunity for a trusted strategic partner who will help clients connect dots and move with greater agility and creativity."

Top 5 Cybersecurity Mergers and Acquisitions 2023

Cybersecurity is a rapidly growing market, and it is projected to surge in value globally from $153.6bn in 2022 to $424.9bn by 2030, according to Fortune Business Insights.

Yet this sector has not been immune from the global economic downturn, resulting in budget cuts for security teams and layoffs of cybersecurity staff in the past year.

This economic environment has impacted mergers & acquisitions (M&A) deals in the industry. Stuart Pilgrim, Head of Cybersecurity M&A at KPMG UK, told Infosecurity that technology acquisitions, including in cybersecurity, were down in both volume and value in 2023 compared to recent years.

“Currently, deal activity is limited as the market is very risk-averse, and with cybersecurity assets trading highly, combined with the growing cost of capital, concerns around value are making investors cautious,” he said.

This ‘safety-first’ approach has also been observed by Susan Sharawi, Cyber Security Partner at Deloitte, who noted that M&A deals this year have been dominated by established cybersecurity companies, rather than start-ups.

Potential for an M&A Boom?

Cybersecurity is still a relatively young sector, and as it continues to mature, there is set to be significant M&A activity on the horizon.

Pilgrim noted that relatively few cybersecurity companies currently offer international support. Additionally, the UK market in particular has become very fragmented, with hundreds of providers offering similar services. While many potential targets may not be ready for M&A today, he believes this means the cyber market is on course for more market consolidation.

“The combination of these factors will encourage cybersecurity firms to merge so they can meet market demands,” outlined Pilgrim.

He added that cybersecurity company valuations are starting to plateau, further ripening the market for investment.

Sharawi agreed that the cybersecurity industry has been saturated, and that M&A activity will be needed to consolidate such a busy market.

“For cybersecurity companies the focus is on consolidation that provides a more streamlined and comprehensive offering to their customers,” she explained.

Top 5 M&A Deals in 2023

While there has been a relative slowdown in M&A activity in cybersecurity this year, several eye-watering deals were announced involving big-name players in the field. Here are Infosecurity Magazine’s top five M&A deals for 2023:

1. Cisco Agrees Record Deal to Acquire Splunk

Digital communications giant Cisco announced a deal to buy cybersecurity and observability firm Splunk for a $28bn fee on September 21. The transaction, which will be the biggest in Cisco's history, is expected to close by the end of the third quarter of calendar year 2024. It represents the latest in a line of recent cybersecurity acquisition deals by Cisco, with the company also in the process of taking over Armorblox, Oort and Lightspin.

2. Thales Agrees $3.6bn Deal for Imperva

In a major transaction announced on July 25, French aerospace and defense firm Thales agreed to purchase US cybersecurity company Imperva from investment giant Thoma Bravo for $3.6bn. The move highlights how cybersecurity has become a priority market for Thales. The deal is expected to complete at the beginning of 2024, subject to antitrust and regulatory approvals.

3. Thoma Bravo Completes Acquisition of ForgeRock

Private equity giant Thoma Bravo announced the completion of its $2.3bn deal for identity and access management company ForgeRock on August 23. Thoma Bravo also revealed that it has combined ForgeRock into its portfolio company Ping Identity, which it acquired in October 2022.

4. Proofpoint Completes Acquisition of Tessian

Proofpoint, a Thoma Bravo subsidiary and enterprise security provider, announced on October 30 that it had acquired Tessian for an undisclosed fee. Tessian, a UK-based cloud email security provider, has raised approximately $128m since launching in 2013 and was last valued at $500m after its Series C funding round.

5. CrowdStrike Agrees Deal to Acquire Bionic

On September 19, CrowdStrike announced it will be extending its Cloud Native Application Protection Platform (CNAPP) Application Security Posture Management (ASPM) through the purchase of Bionic. The proposed deal, thought to be worth around $350m, is expected to close during CrowdStrike's fiscal third quarter, subject to customary closing conditions. Israel-based Bionic was founded in 2019 and has raised $83m in funding to date.

NuHarbor Security Partners with RavenTek to Offer Streamlined Splunk Services

NuHarbor Security has partnered with RavenTek to offer a unified approach to Splunk security and IT observability to public sector clients. NuHarbor and RavenTek are both Elite partners in the Splunk partner ecosystem, and the combined ...

Splunk (SPLK) Rose on Acquisition News

ClearBridge Investments, an investment management firm, released its third-quarter 2023 "Mid Cap Growth Strategy" investor letter, a copy of which can be downloaded here. The strategy underperformed its benchmark Russell Midcap Growth Index in the quarter. Overall, the effects of stock selection impacted the performance on a relative basis. The strategy posted gains in three of the 11 sectors in which it was invested during the quarter on an absolute basis. In addition, please check the fund's top five holdings to know its best picks in 2023.

ClearBridge Mid Cap Growth Strategy highlighted stocks like Splunk Inc. (NASDAQ:SPLK) in the Q3 2023 investor letter. Headquartered in San Francisco, California, Splunk Inc. (NASDAQ:SPLK) is a cloud solutions and software provider. On December 13, 2023, Splunk Inc. (NASDAQ:SPLK) stock closed at $152.04 per share. One-month return of Splunk Inc. (NASDAQ:SPLK) was 0.65%, and its shares gained 65.98% of their value over the last 52 weeks. Splunk Inc. (NASDAQ:SPLK) has a market capitalization of $25.625 billion.

ClearBridge Mid Cap Growth Strategy made the following comment about Splunk Inc. (NASDAQ:SPLK) in its Q3 2023 investor letter:

"Stock selection in the IT sector was the main detractor during the quarter, as the prospect of a higher-for-longer rate environment weighed on longer-duration, higher growth stocks. Despite these challenges, there were also strong positive contributions from Splunk Inc. (NASDAQ:SPLK) and AppLovin. Splunk’s price rallied on the news of its acquisition by Cisco, and investors applauded AppLovin’s new, AI-enabled platform which is improving productivity in its mobile games ad network."

An experienced software developer architecting a cloud software solution on multiple monitors.

Splunk Inc. (NASDAQ:SPLK) is not on our list of 30 Most Popular Stocks Among Hedge Funds. As per our database, 67 hedge fund portfolios held Splunk Inc. (NASDAQ:SPLK) at the end of the third quarter, up from 50 in the previous quarter.

We discussed Splunk Inc. (NASDAQ:SPLK) in another article and shared Carillon Clarivest Capital Appreciation Fund's views on the company. In addition, please check out our hedge fund investor letters Q3 2023 page for more investor letters from hedge funds and other leading investors.

Disclosure: None. This article is originally published at Insider Monkey.

