Personal computer sales recently began to slow down to levels not seen since early 2020, before the pandemic turbocharged the PC space.
By all estimates, PC manufacturers are battening down the hatches for yet another challenging year after shipments fell markedly in 2022.
The pandemic-fueled sugar high of PC shipments that the industry saw in 2020 and 2021 began dissipating last year. Most industry analysts are not projecting a complete recovery until 2024.
To wrap some numbers around the severity of this decline, Gartner reported worldwide PC shipments fell almost 29% in Q4 of 2022, the most significant drop in nearly three decades.
Enterprise buyers, always a good barometer of the PC industry’s health, signaled the slowdown in Q3 2022 by extending PC lifecycles and postponing purchases. Gartner also reported that OEM PC shipments landed somewhere in the 65.3 million to 67.2 million range in Q4 2022, with worldwide shipments for the full year coming in at 286.2 million, a 16% decline from 2021.
The PC market surge in 2020 and 2021 was primarily the function of a world that had to quickly adjust to working from home and remote schooling behavioral changes. Businesses and consumers reacted by gobbling up PCs (particularly laptops) at a brisk pace not seen in years. Supply chain issues caused by the pandemic led to long lead times for PCs and critical videoconferencing peripherals like webcams.
Because pre-pandemic conditions have largely returned, demand has chilled even as average selling prices declined, with PC OEMs resorting to the age-old tactic of discounts and promotional offers to clear a glut of inventory.
In short, the salad days of PC market growth appear to be over for the next couple of years.
Even as the major PC OEMs trim expenses in anticipation of another challenging year, a strong headwind that could slow the PC industry’s return to growth is Microsoft’s Windows 365. It’s not getting nearly the level of attention I believe it merits.
Still, given the impact of persistent inflation on IT and perhaps even consumer spending, Windows 365 may get much more appealing. The revenue impact of this could be devastating for traditional PC makers.
Arguably, Windows 365 is the future of the PC for several reasons.
First, this new cloud-based capability from Microsoft offers a comprehensive suite of productivity tools critical for businesses and individuals. These tools include Office 365, which provides access to popular applications such as Outlook, Word, Excel, and PowerPoint, and cloud storage services like OneDrive.
This attribute means that users can work from anywhere, on any device, and have access to their documents, files, and presentations no matter where they are.
Another compelling reason why Windows 365 is attractive to enterprise and business users is that it is constantly evolving. Unlike software packages that require a large and expensive IT staff to manage, the platform is updated regularly with new features, bug fixes, and security enhancements, ensuring that users always have access to the latest tools and technologies.
This feature spares users from managing perpetual updates and lets them focus instead on their productivity requirements.
In addition to its productivity tools, Windows 365 also provides robust security features that have numerous advantages over a traditional hardware device model.
With advanced threat protection, device management, and data protection that can be managed and controlled in the cloud, sensitive data can be secured and protected more consistently. This need is vital for CIOs and CSOs who struggle with updating large fleets of laptops with the latest security fixes and patches and application management.
Finally, Windows 365 has the potential to become more cost-effective than purchasing traditional PCs. With a subscription-based model, users only pay for what they use and can quickly scale up or down as their needs change. This element makes Windows 365 an ideal solution for businesses of all sizes — and individuals — allowing them to manage their costs and stay within budget.
With average wireless broadband speeds now climbing north of 200 Mbps in homes, offices, and even public settings as fiber and 5G become more pervasive, the burden of computing shifts to the cloud and away from the local device itself. This shift has enormous hardware cost savings implications.
The idea of running an operating system in the cloud is not new. Google has been doing this for years with Chromebook and its ChromeOS. First announced in 2011, a Chromebook relies heavily on web applications using the Google Chrome browser.
Chromebooks can work offline and utilize applications like Gmail, Google Calendar, and Google Drive to synchronize data when connected to the internet.
While Chromebooks have enjoyed significant success with educational institutions due to their low cost, their penetration into the enterprise space has been lackluster. The enormous dependence that corporate users have on using Windows-based applications has been a major hurdle for Chromebook to overcome, despite its significantly lower cost versus a traditional Windows-based laptop.
The Chromebook 3110 Laptop is designed for the education market. (Image Credit: Dell)
Ironically, the brutal truth is that most users depend on Microsoft 365 (Word/Outlook/Excel/PowerPoint) for word processing, email, spreadsheet modeling, and presentation needs. Google Workspace may be a great collaborative workplace platform, but many businesses (and even schools) have standardized on Microsoft 365 and OneDrive.
To bridge this gap, Microsoft and Google have joined forces to make Microsoft 365 a premium experience on ChromeOS, with full OneDrive integration for the Files app in the cards. Google recently posted an update that will bring this functionality to Chromebooks over the next several months.
However, for most enterprise users who require access to Windows-based apps where significant dollars have been invested over decades, this remains a “close but no cigar” situation, with Chromebooks remaining a non-starter for many large enterprise customers.
Not too long ago, PC manufacturers could rely on Microsoft to release a new version of Windows to spur an upsurge in PC sales. As a marketing executive with Compaq who attended the famous Windows 95 announcement in Redmond nearly 30 years ago, I witnessed the excitement — and even hysteria — that Microsoft was able to conjure up.
While Apple remains one of a tiny number of companies that can still summon that level of excitement on a massive scale, Microsoft’s ability to replicate that level of user enthusiasm is now in the rear-view mirror. I don’t mean that derisively. Microsoft is a much different and larger company today than in 1995, and it remains one of the world’s great financial successes from a revenue and profitability standpoint.
But now, Microsoft realizes that connectivity improvements from a speed and latency standpoint over the past 20 years have made a “Windows in the cloud” scenario plausible and likely.
One could debate whether a cloud-based Windows experience would be sufficiently satisfying for all but the most demanding PC users, e.g., video content professionals and gamers.
Still, it’s hard to repudiate that the cloud is the ultimate end game, and it’s just a matter of time. Also, we cannot dismiss that the hybrid work phenomenon has played a role in making “Windows in the cloud” more appealing to business users.
Simply stated, Windows 365 abolishes the need to buy a new PC.
While cost will always be a consideration for IT departments, Microsoft’s monthly subscription price for businesses spans from $32 to $162 per user, depending on the number of processor cores, memory, and storage capacity.
Companies looking to get out of the hardware management business might find Windows 365 very tempting. Of course, you’ll still need a lightweight hardware computing client, but it most likely won’t be a $1,500 laptop.
Assuming that even modest adoption of Windows 365 occurs over the next two or three years, the reduction in PC average selling prices (ASPs) will shake the rafters in the offices of every major PC manufacturer around the world — especially in Palo Alto/Houston, Austin, and Raleigh, homes of the three largest PC OEMs with a combined market share of more than 60%.
Even if only 20% of the PC market transitions to a Windows 365 business model over the next few years, the resulting impact due to dramatically lower ASPs could result in billions of dollars in lost revenue. That’s serious stuff.
I’m aware of the enormous internal resistance that a PC OEM must overcome to confront this reality. Some OEMs will espouse the view that the Windows 365 offering is not a fait accompli and that the ASP decline can be gracefully managed.
I’m dubious that scenario will play out. Assuming the user experience with Windows 365 is reasonable and doesn’t hamper overall productivity, it will take only a few companies messaging to the world how beneficial a cloud-based Windows approach is from a support, security, and image management standpoint.
From my perspective, the wiser PC OEMs will try to get ahead of this. Yes, there will be ASP declines, but there will also be an opportunity to define the experience consistent with the company’s brand, value proposition, and security offerings that distinguish it from competitors.
But let’s be clear: the anticipated declines in PC OEM ASPs as Windows 365 gains traction will probably not hit Chromebook-like levels, because enterprise customers will still value 4K displays, 5G capability, and other “care abouts” even as the bill-of-materials emphasis shifts away from legacy processor players like AMD and Intel.
Notably, Qualcomm could benefit because of its line of low-cost Snapdragon SOCs that are tailor-made for Windows 365-class “thin” clients.
In addition, while PC clients may become more lightweight from a local processor, storage, and even graphics standpoint, those PC OEMs with world-class industrial design capability will have a leg up. These clients may have significantly lower price points, but end users won’t forego great-looking designs.
While this conversation has focused primarily on enterprise and corporate businesses that tend to purchase PCs in fleets, the consumer market may be next on Microsoft’s agenda. Some media reports state that Microsoft might offer Windows 12 to consumers via a low-cost Windows 365 subscription.
If anything, Windows 365 will presumably extend the PC’s traditional hardware lifecycle beyond the current two to three years that most businesses use before replacing a conventional PC. If these new lower-priced PC clients facilitate significantly longer lifecycles, the impact on PC market revenue could be enormous.
Indeed, 2023 may be the year when we see this transition begin. While the financial pain of lower revenues will likely be unavoidable, the brave PC OEMs who get in front of this freight train have the best chance to exploit the opportunity and reinvent themselves.
In the late 1990s, desktop PC pricing started to decline precipitously due to strong market competitiveness, sending most PC OEMs into a tizzy as ASPs began to creep under the then-unheard-of $1,000 price band.
Keep in mind that Microsoft’s current Windows 365 pricing for business customers is still expensive when considering that it doesn’t include any hardware, only the license to use Windows 365, some productivity software, and product support.
It remains to be seen what price point PC OEMs will have to offer for a laptop client to make a Windows 365 monthly subscription viable from a cash outlay standpoint versus a traditional PC purchase that includes Windows 11.
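To make that cash-outlay question concrete, here is a back-of-the-envelope sketch in Python. The $32 and $162 monthly figures are the Windows 365 business pricing cited earlier; the $1,500 laptop price and the $400 thin-client price are my own illustrative assumptions, not quoted figures.

```python
# Back-of-the-envelope comparison of three-year cash outlay:
# Windows 365 subscription plus a thin client vs. a traditional laptop.
# Hardware prices are illustrative assumptions; the $32-$162/user monthly
# range is the Windows 365 business pricing cited in the article.

def three_year_outlay(hardware_cost: float,
                      monthly_subscription: float = 0.0,
                      months: int = 36) -> float:
    """Total cash outlay over the period: hardware plus any subscription fees."""
    return hardware_cost + monthly_subscription * months

traditional = three_year_outlay(hardware_cost=1500)                       # laptop with Windows 11
cloud_low   = three_year_outlay(hardware_cost=400, monthly_subscription=32)
cloud_high  = three_year_outlay(hardware_cost=400, monthly_subscription=162)

print(f"Traditional laptop:      ${traditional:,.0f}")   # $1,500
print(f"Windows 365 (low tier):  ${cloud_low:,.0f}")     # $1,552
print(f"Windows 365 (high tier): ${cloud_high:,.0f}")    # $6,232
```

Under these assumptions, even the lowest subscription tier plus a modest client roughly matches the cost of a traditional $1,500 laptop over three years, which is exactly why the client’s price point matters so much to the viability question.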
Compaq dealt with this challenge by offshoring its PC manufacturing, designing PCs with lower-cost components, procuring aggressively, and moving to a “just in time” inventory model that helped maximize margins.
Despite the tumult this caused at Compaq, the overall mantra was that while no company likes it when a market gets cannibalized, a company is much better off when it cannibalizes itself rather than letting competitors do it.
That might be the most potent lesson to embrace for the leading PC OEMs from the late 1990s.
KAMPALA - The information ministry has signed a memorandum of understanding (MoU) with International Computer Driving Licence (ICDL) Africa.
According to a press statement issued Wednesday (January 18) by the information ministry, the memorandum is aimed at enhancing digital skills training across the different structures in Uganda’s education system.
The MoU is also aimed at promoting the re-skilling and upskilling of the government workforce to acquire appropriate digital skills to support service delivery.
ICDL general manager Solange Umulisa said the programme will also support the realisation of the digital Uganda Vision.
Umulisa added that the MoU is expected to be implemented over five years through a collaborative effort.
“This is part of the wider government engagement with key partners that are geared towards enhancing the government’s ability to deliver e-services to the citizens of Uganda as envisioned in the Digital Uganda Vision," the ICDL general manager said.
Information ministry permanent secretary Dr Aminah Zawedde said: "This is a significant milestone for the Ministry of ICT & National Guidance in our quest to ensure that government employees and the citizens have the requisite skills to support and use the e-Government initiatives".
What is behind dark energy—and what connects it to the cosmological constant introduced by Albert Einstein? Two physicists from the University of Luxembourg point the way to answering these open questions of physics.
The universe has a number of bizarre properties that are difficult to understand with everyday experience. For example, the matter we know, consisting of elementary and composite particles building molecules and materials, apparently makes up only a small part of the energy of the universe. The largest contribution, about two-thirds, comes from "dark energy"—a hypothetical form of energy whose background physicists are still puzzling over.
Moreover, the universe is not only expanding, but also doing so at an ever-faster pace. Both characteristics seem to be connected, because dark energy is also considered a driver of accelerated expansion. Moreover, it could reunite two powerful physical schools of thought: quantum field theory and the general theory of relativity developed by Albert Einstein. But there is a catch: calculations and observations have so far been far from matching.
Now two researchers from Luxembourg have shown a new way to solve this 100-year-old riddle in a paper published by the journal Physical Review Letters.
"Vacuum has energy. This is a fundamental result of quantum field theory," explains Prof. Alexandre Tkatchenko, Professor of Theoretical Physics at the Department of Physics and Materials Science at the University of Luxembourg. This theory was developed to bring together quantum mechanics and special relativity, but quantum field theory seems to be incompatible with general relativity. Its essential feature is that, in contrast to quantum mechanics, the theory considers not only particles but also matter-free fields as quantum objects.
"In this framework, many researchers regard dark energy as an expression of the so-called vacuum energy," says Tkatchenko: a physical quantity that, in a vivid image, is caused by a constant emergence and interaction of pairs of particles and their antiparticles—such as electrons and positrons—in what is actually empty space.
Physicists speak of this coming and going of virtual particles and their quantum fields as vacuum or zero-point fluctuations. While the particle pairs quickly vanish into nothingness again, their existence leaves behind a certain amount of energy. "This vacuum energy also has a meaning in general relativity," the Luxembourg scientist notes. "It manifests itself in the cosmological constant Einstein included into his equations for physical reasons."
Unlike vacuum energy, which can only be deduced from the formulae of quantum field theory, the cosmological constant can be determined directly by astrophysical experiments. Measurements with the Hubble space telescope and the Planck space mission have yielded precise and reliable values for this fundamental physical quantity.
Calculations of dark energy on the basis of quantum field theory, on the other hand, yield results that correspond to a value of the cosmological constant that is up to 10^120 times larger — a colossal discrepancy, although in the world view of physicists prevailing today, both values should be equal. The discrepancy found instead is known as the "cosmological constant enigma." "It is undoubtedly one of the greatest inconsistencies in modern science," says Alexandre Tkatchenko.
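To put rough numbers on that mismatch: the standard back-of-the-envelope estimate (a textbook illustration, not the Luxembourg paper’s own calculation) cuts the zero-point energy off at the Planck scale Λ_Pl ≈ 1.2 × 10^19 GeV, which gives a vacuum energy density vastly larger than the observed value:

```latex
\rho_{\text{vac}}^{\text{QFT}} \;\sim\; \frac{\Lambda_{\text{Pl}}^{4}}{16\pi^{2}}
\;\approx\; 10^{74}\ \text{GeV}^{4},
\qquad
\rho_{\Lambda}^{\text{obs}} \;\approx\; 10^{-47}\ \text{GeV}^{4},
\qquad
\frac{\rho_{\text{vac}}^{\text{QFT}}}{\rho_{\Lambda}^{\text{obs}}} \;\sim\; 10^{120}.
```

The choice of the Planck scale as the cutoff is itself an assumption; lower cutoffs shrink the mismatch but never close it, which is why the discrepancy is regarded as a structural problem rather than a numerical accident.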
Together with his Luxembourg research colleague Dr. Dmitry Fedorov, he has now brought the solution to this puzzle, which has been open for decades, a significant step closer. In a theoretical work, the results of which they recently published in Physical Review Letters, the two Luxembourg researchers propose a new interpretation of dark energy. It assumes that the zero-point fluctuations lead to a polarizability of the vacuum, which can be both measured and calculated.
"In pairs of virtual particles with an opposite electric charge, it arises from electrodynamic forces that these particles exert on each other during their extremely short existence," Tkatchenko explains. The physicists refer to this as a vacuum self-interaction. "It leads to an energy density that can be determined with the help of a new model," says the Luxembourg scientist.
Tkatchenko and Fedorov developed the basic model for atoms a few years ago and presented it for the first time in 2018. The model was originally used to describe atomic properties, in particular the relation between the polarizability of atoms and the equilibrium properties of certain non-covalently bonded molecules and solids. Since the geometric characteristics are quite easy to measure experimentally, polarizability can also be determined via their formula.
"We transferred this procedure to the processes in the vacuum," explains Fedorov. To this end, the two researchers looked at the behavior of quantum fields, in particular representing the "coming and going" of electrons and positrons. The fluctuations of these fields can also be characterized by an equilibrium geometry which is already known from experiments. "We inserted it into the formulas of our model and in this way ultimately obtained the strength of the intrinsic vacuum polarization," Fedorov reports.
The last step was then to quantum mechanically calculate the energy density of the self-interaction between fluctuations of electrons and positrons. The result obtained in this way agrees well with the measured values for the cosmological constant. "[This means] dark energy can be traced back to the energy density of the self-interaction of quantum fields," says Alexandre Tkatchenko.
"Our work thus offers an elegant and unconventional approach to solving the riddle of the cosmological constant," sums up the physicist. "Moreover, it provides a verifiable prediction: namely, that quantum fields such as those of electrons and positrons do indeed possess a small but ever-present intrinsic polarization."
This finding points the way for future experiments to detect this polarization in the laboratory as well, say the two Luxembourg researchers. "Our goal is to derive the cosmological constant from a rigorous quantum theoretical approach," says Dmitry Fedorov. "And our work contains a recipe on how to realize this."
He sees the new results obtained together with Alexandre Tkatchenko as the first step towards a better understanding of dark energy—and its connection to Albert Einstein's cosmological constant. Finally, Tkatchenko is convinced that "in the end, this could also shed light on the way in which quantum field theory and general relativity theory are interwoven as two ways of looking at the universe and its components."
More information: Alexandre Tkatchenko et al, Casimir Self-Interaction Energy Density of Quantum Electrodynamic Fields, Physical Review Letters (2023). DOI: 10.1103/PhysRevLett.130.041601
For brands hunting for customer insights to drive decision-making, strengthen customer experience (CX), and ultimately spur growth, market research has long been part of the toolkit.
Whether it’s actually helpful or not is another question. In a typical market research project, brands invest (often heavily) in conducting research that amounts to a one-time snapshot of existing customer sentiment and, perhaps, competitors’ prevailing differentiators. While this research can yield useful insights, it usually fails to recognize the wants of potential customers or to adequately correlate the data revealing exactly why customers go with competitors.
Brands can easily miss the forest for the trees when relying on traditional market research. They get bogged down in addressing complaints while missing the fundamental reasons why a customer chooses one brand over another. At the same time, market research projects are prohibitively expensive to repeat with regularity, and offer limited insights that begin to go stale from the moment research is completed.
Some marketers instead leverage social listening platforms for more continuous analysis of customer behavior (and customer engagement with specific features or brand offers). This strategy can collect useful customer reviews and feedback, and tends to be much more affordable than commissioning one-off market research studies.
However, this approach still leaves marketers blind to competitive activity and to the adjustments best poised to win over those potential customers. Social listening platforms also require largely manual processes to sift insights from firehose data. Talented data analyst teams doing this time-consuming work may well identify correlations across that data, but that talent doesn’t come cheap. The shortcomings of both traditional market research and social listening platforms mean that rich opportunities to meaningfully and agilely strengthen customer experiences regularly go undiscovered.
The answer to legacy market research and incomplete social listening platforms—as it is across the broader technology landscape — might very well be artificial intelligence (AI) and automation.
With AI deployed to round up continual marketing insights from the right data sources, brands can remove the guesswork from researching and correlating relevant customer experience data. AI-driven automation addresses the biggest limitations of traditional market research head-on: transforming the cost, cadence, and quality of insights collected. Marketers that would otherwise budget out expensive research projects periodically — and adjust their customer-facing practices only that often — should be seeking real-time, always-on insights that show clear correlations.
If traditional market research is like deciphering meaning from a still photograph taken at one moment in time, bringing AI and automation into this marketing practice is like allowing brands to leverage a continuous live video feed of shifts in customer needs and sentiment. Smart use of AI also curbs the need for expensive data teams, enabling marketers and business managers to directly implement insight-based improvements.
Simply put, analyzing customer sentiment data with AI reaches beyond the human capacity for recognizing correlations and customer trends. By collecting continuous marketing intelligence — including customer feedback across social media, review sites, surveys, service interactions, and other touchpoints — a smart, AI-driven approach enables brands to be far more responsive and confident in aligning business practices with what customers actually want. The same analysis can then be applied to competing businesses to discover useful insights, such as identifying the practices that win those competitors’ positive customer sentiment and may be worthwhile to emulate.
For example, a hospitality business that implements AI-based customer sentiment data analysis might find that a direct competitor’s customers make many positive mentions calling out the hotel’s high-quality breakfast options. Automated analysis would then present this actionable insight as an easily digestible key takeaway: by investing in a breakfast menu that matches or exceeds the quality of that competitor’s, the brand has a likely path to a more satisfying customer experience, improved ratings, and long-term customer and revenue growth.
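The hospitality example can be sketched in a few lines of Python. The review data and the +1/-1 sentiment scores are toy assumptions standing in for a real sentiment model and live review feeds; the point is only the aggregate-and-compare step that surfaces a competitor’s winning feature.

```python
# Toy sketch of the analysis described above: aggregate customer mentions
# per (brand, feature) pair, then surface features where a competitor earns
# strictly better net sentiment than we do. Data and scores are illustrative.
from collections import defaultdict

reviews = [
    {"brand": "competitor", "feature": "breakfast", "sentiment": +1},
    {"brand": "competitor", "feature": "breakfast", "sentiment": +1},
    {"brand": "competitor", "feature": "parking",   "sentiment": -1},
    {"brand": "us",         "feature": "breakfast", "sentiment": -1},
    {"brand": "us",         "feature": "wifi",      "sentiment": +1},
]

def net_sentiment(reviews):
    """Sum sentiment scores per (brand, feature) pair."""
    totals = defaultdict(int)
    for r in reviews:
        totals[(r["brand"], r["feature"])] += r["sentiment"]
    return dict(totals)

def opportunities(totals, us="us", them="competitor"):
    """Features where the competitor scores strictly higher than we do."""
    features = {feature for (_, feature) in totals}
    return sorted(f for f in features
                  if totals.get((them, f), 0) > totals.get((us, f), 0))

totals = net_sentiment(reviews)
print(opportunities(totals))  # prints ['breakfast']
```

A production system would replace the hand-scored list with model-scored mentions streaming in from review sites and social feeds, but the actionable output is the same: a short list of competitor strengths worth matching.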
In the same way, a coffee chain might discover that competitors are winning positive customer sentiment for their variety of alternative milk options, and adapt their offerings to capitalize on that clear opportunity.
When harnessed correctly, small findings like these, hidden within noisy data, can transform a brand’s competitiveness in its market.
Stas Tushinksiy is the CEO at Instreamatic.
Thirty percent of Americans are overweight. There are many influences on overeating, including genetic predispositions, overweight friends, and family. The good news is that these influences can't force you to overeat, since your actions come from your thinking, and you can change your thinking.
In most (but by no means all) cases, being overweight results most directly from eating too much no matter what influences surround this problem. The simple (even if only partial) solution involves eating fewer calories. The difficult aspect involves putting this into practice.
Step 1. Apply the Problem Separation Technique (PST). Ask yourself: do I have an emotional problem such as anxiety, stress, guilt, or depression about my practical problem of eating too many calories?
Step 2. If the answer is yes, then determine if you have a secondary disturbance. A secondary disturbance means you're disturbing yourself about disturbing yourself. To put it another way, when your primary disturbance lies in your eating, a secondary disturbance means making yourself disturbed about eating too much and about gaining too much weight. Here's how it may manifest: suppose you notice you're eating more than is desirable. Primary disturbance: you tell yourself, "I must satisfy my craving for food right now and eat; I can't stand feeling deprived." Secondary disturbance: then, as you are compulsively eating, you think, "I must stop eating." You have a must about a must! Disturbing yourself about your problem doesn't help and only makes you feel worse. The demand here is often something like "I must not overeat." Demanding this of yourself is unrealistic, doesn't help, and only tends to make matters worse. A more realistic view would be something like, "I prefer not to overeat, but clearly there's no law of the universe stating I must not. If I do, I do; I'll try to eat less next time."
Step 3. Remind yourself that you can accept yourself and still enjoy life and be productive with the extra pounds. Then why lose the weight? Because your life can be even better, satisfying, and more healthful without carrying around the extra pounds.
Step 4. Recognize that eating too much and eating unhealthy food is a choice. You can choose to eat the ice cream or choose not to. You can choose to stop eating when you've had enough or choose to continue eating. You're not compelled to eat, even though it may feel like you are and have no control. If you're skeptical, imagine this: you are about to feast on your favorite ice cream. Someone shows up with a gun pointed at your head and threatens: "if you eat the ice cream, I'll shoot." Clearly, you would decide to skip the ice cream.
Step 5. Go for unconditional self-acceptance (USA). You accept yourself unconditionally whether you overeat or you stay on your diet. Refuse to rate your total self based on the rating of your behavior. Whether you diet successfully or unsuccessfully you're still the same imperfect human who acts imperfectly. You always have been and you always will be.
Step 6. Go for unconditional life acceptance (ULA). You accept your life unconditionally with its frustrations, discomforts, and difficulties.
Practice implementing the above steps regularly. It's hard, but hard doesn't mean impossible. The good news: overeating is a choice. You can choose to stay on a reasonable eating regimen or choose not to. It's up to you!
Tile has a bunch of tracker sizes and shapes.
Life360's Bluetooth tagging device Tile is launching a new anti-theft mode designed to get around a tricky issue: criminals knowing when an item they've stolen is being tracked.
The new anti-theft mode makes a user's Tile undetectable in Scan and Secure, the company's in-app feature that allows iPhone and Android users to locate nearby Tiles. That feature was initially launched to combat the rise in stalking with Bluetooth tagging devices, but the company now says that approach hasn't stopped stalking and has, if anything, made it easier for thieves to find and remove a tag.
"It seemed like the real problem of stalking wasn't actually being necessarily solved," said Life360 CEO Chris Hulls. "But what was happening was that there's this new vector that's opening up for thieves, where so much of the reason people buy Bluetooth tags to begin with is to protect their items from theft. And now, if you have a Tile or an AirTag, a smart thief will almost certainly be able to find it."
That's because if a thief learns there is a Bluetooth tag on an item that they stole, they're able to easily remove the tag and prevent victims from tracking down valuables.
Bluetooth tagging devices like Life360's Tile, Apple's AirTag and Samsung's Galaxy SmartTag have come under pressure to increase safety features as stalking cases rose. But robberies are also on the rise throughout the U.S., and according to Hulls, there's been a negligible number of stalking cases. During the first half of 2022, there was a 5.5% increase in robberies throughout the country, according to the Council on Criminal Justice.
"Theft is the primary reason people buy these products," Hulls said. "Our new anti-theft mode is a tradeoff. ... It gives consumers choice, they can turn off all the anti-stalking features, so their Tiles become invisible," he added.
Tile is taking a new approach to stalking prevention. If a user chooses to use the anti-theft mode, they must go through an ID verification process, which will register the user with Tile and sync their ID with their account.
"From our research, the real issue with stalkers is how do you remove anonymity?" he said.
Scan and Secure didn't address that issue, he said, and it didn't have any enforcement mechanism. That's changing now too: Life360 says it will pursue a $1 million penalty against any individual convicted in a court of law of using Tile devices to illegally track a person without their knowledge or consent. If someone is convicted of stalking with a Tile, a prosecution the ID verification requirement is meant to make easier, Tile would sue that person for violating its terms of service.
But there is a user privacy issue in the changes as well. Life360 is increasing collaboration with law enforcement so that if anti-theft mode is enabled, users must acknowledge that personal information can and will be shared with law enforcement at the company's discretion, even without a subpoena, to aid in the investigation and prosecution of suspected stalking.
"So much of why people buy these devices to begin with is to protect against theft. So, we think this really threads the needle elegantly," Hulls said.
Technology safety advocates aren't convinced.
Kathleen Moriarty, chief technology officer at the Center for Internet Security, said via email that a closer tie to identity should strengthen the integrity of the solution by adding in a human factor and sense of responsibility, but as is the case with any new release, "limits and boundaries will be tested in unexpected ways."
For example, for a Tile to be discoverable by its authorized owner, so they can track down the item it was placed on, the Tile must emit a signal. Although the Tile app makes that signal visible only to the Tile's owner, the signal may still be discoverable with other tools.
"Without insight into the technical details, the protections to prevent detection are not clear. Typically confidence from technologists is higher for standards-based solutions due to the rigorous review process by a number of experts. The solution holds promise and time will tell as it gets tested in real-world scenarios," she said.
Adam Dodge, CEO of digital safety education company EndTAB and a member of the World Economic Forum's Digital Justice Advisory Committee, was more critical of the approach in an email exchange. "This update denies stalking victims access to the most essential of safety measures, the ability to verify and locate a tracker to stop the abuse," Dodge wrote.
He says Life360's new safety commitments — identity verification, fines and promises to cooperate with law enforcement — do not offset the increased risk created for victims, and he noted that the features and new company policies only come into play if the Tile tracker is found, something that anti-theft mode now makes much less likely.
"We have to remember that bad actors care most about getting caught. From that perspective, this new feature is a home run," Dodge said. "I'd rather see Tile's new commitments to safety implemented on their own, as opposed to a way to blunt the risks created with anti-theft mode."
In his view, if the problem of stalking wasn't being solved, companies like Life360 should solve for it specifically through innovation and new safety features. "The irony is that, by making Tile trackers undetectable, it is now easier for stalkers to stay anonymous," Dodge wrote.
This story has been updated to include comments from Adam Dodge, CEO of EndTAB, and Kathleen Moriarty, chief technology officer at the Center for Internet Security.
The protein STAT5 has long been an appealing target against cancer, but after decades of research it was consigned to the "undruggable" category. Now, University of Michigan Rogel Cancer Center researchers have found success with a new approach.
By tapping into a cellular garbage disposal function, researchers found they could eliminate STAT5 from cell cultures and mice, setting the stage for potential development as a cancer treatment.
STAT5 plays a key role in how some blood cancers develop and progress, but efforts to identify a small molecule inhibitor to block it have been stymied. Previous research found it challenging to design a drug that binds to STAT5 with high affinity, a measure of how well the two fit together. Even when a compound was found to bind with the protein, it might not make its way into cells and tissue. It is also difficult to find a compound that inhibits only STAT5 without affecting any of the other STAT proteins.
Shaomeng Wang, Ph.D., Warner-Lambert/Parke-Davis Professor in Medicine and professor of medicine, pharmacology and medicinal chemistry at the University of Michigan, had another idea.
His lab has been working on a new drug development approach targeting protein degradation. This is a naturally occurring function within cells to get rid of unwanted protein. Think of it as the garbage disposal: When a protein is no longer needed to keep a body healthy, this mechanism removes the unwanted or damaged protein from the cell.
Using this approach, Wang's lab identified a protein degrader, AK-2292, that targets and removes STAT5. The compound was highly specific to STAT5 with no effect on other STAT proteins. It was effectively taken up by both cell lines and mouse models and was found to stop cell growth in cell lines of human chronic myeloid leukemia (CML) and to induce tumor regression in mouse models of CML. Results are published in Nature Chemical Biology.
The protein degrader works by eliminating STAT5 proteins from tumor cells and tissues, unlike a small molecule inhibitor that would traditionally be designed to bind with the protein and interfere with its function.
"We've overcome some of the major issues that were barriers for scientists to target STAT5," Wang said. "People have worked in this field for the last 20 years, and there are no small molecules targeting STAT5 going into clinical development. This study shows us STAT5 can be targeted through a protein degradation approach. It's a new, exciting direction for developing a potential drug molecule targeting STAT5 for the treatment of cancers in which this protein plays a major role."
"This compound gives us a very solid foundation to do further optimization to identify a compound that we eventually can advance into clinical development," Wang added.
Wang's lab has been investigating protein degraders for several years and has a number of degraders in advanced preclinical development studies, which they hope will lead to clinical trials for the treatment of cancer in people.
More information: Atsunori Kaneshige et al, A selective small-molecule STAT5 PROTAC degrader capable of achieving tumor regression in vivo, Nature Chemical Biology (2023). DOI: 10.1038/s41589-022-01248-4
Citation: Researchers use a new approach to hit an 'undruggable' target (2023, February 15) retrieved 19 February 2023 from https://medicalxpress.com/news/2023-02-approach-undruggable.html
A panel of public health experts for the Food and Drug Administration convened for an all-day meeting on Thursday to discuss whether the U.S. is ready to start treating COVID boosters like an annual flu vaccine, a move that would normalize COVID shots as permanent public health upkeep.
Creating a yearly COVID shot schedule would mean updating the shot once a year and hoping that it matches whichever variant is circulating that upcoming winter season, similar to how the flu shot schedule works. No longer would public health agencies and vaccine companies attempt to reactively update the COVID vaccine when variants come on the scene.
It would also eliminate the two-shot primary series for most unvaccinated people in favor of a simplified, single-dose schedule.
"This is a consequential meeting to determine if we've reached a point in the pandemic that allows for simplifying the use of current COVID-19 vaccines," Dr. David Kaslow, director of the FDA's Office of Vaccines Research and Review, told the group of advisors on Thursday.
Dr. Peter Marks, the FDA's vaccine chief, said the virus has evolved so rapidly that the FDA has constantly needed to revise its approach. Now, he said, it was time to determine if they could do that on a schedule rather than at the whim of the virus' evolutions.
"We're now in a reasonable place to reflect on the development of the COVID-19 vaccines to date to see if we can simplify the approach to vaccination in order to facilitate the process of optimally vaccinating and protecting the entire population moving forward," Marks said.
The goal, FDA officials said, would be to deliver simpler messaging in the hope of reaching more Americans.
The way it stands now, people get different numbers of vaccines depending on their age and prior vaccination status. For example, unvaccinated people get their initial series in two shots, a few weeks apart, while vaccinated people have been encouraged to get a booster every few months, depending on how old they are.
The plan proposed by the FDA is to recommend every American get a single shot every year, with a few exceptions.
It's based on the idea that the vast majority of Americans already have some immunity in their system, either because they've been exposed to COVID itself or gotten vaccinated.
"Presumably, at this point in the pandemic, most of the general U.S. population have been sufficiently exposed to spike protein, either to infection, vaccination or both, such that a single dose of a COVID-19 vaccine would induce or restore vaccine effectiveness," Kaslow said.
But older, high-risk and immunocompromised Americans could still be recommended to get two shots a year, as could young children when they reach the age eligible for vaccinations.
"It could be that more than one dose is needed for high-risk older adults, persons with compromised immunity and young children who have not yet been completely immunized. At this time, those risk-based adjustments remain to be determined," Kaslow said.
Responding to the proposal, some experts on the panel emphasized that people who haven't had COVID or been vaccinated might still need to get two shots to shore up their immunity.
But the members were broadly supportive of the move to simplify the COVID vaccine schedule.
"I certainly support this approach. Simpler is better," said Dr. Mark Sawyer, a professor of clinical pediatrics at the University of California, San Diego and a member of the FDA's panel.
"I think this is definitely the way to go as soon as we can figure out how to do it," he said.
More data is necessary to decide exactly who will be protected enough by one dose, experts said.
"My general feeling from the committee is that we need more data to figure out exactly who should get the two-dose schedule, who should get the one," said Dr. Stanley Perlman, an infectious disease expert with the University of Iowa and a member of the panel.
"So all that kind of information will help determine this immunization schedule, but in general principle, the committee was supportive of going forward with this," Perlman said.
The FDA will likely not make any concrete decisions until later in February, after a panel of experts at the Centers for Disease Control and Prevention has a similar public discussion on the newly-proposed schedule.
And just which variants the updated booster would target for next season won't be decided until this summer, likely in early June, for distribution sometime in the fall.
Of course, the reality is that only a small percentage of Americans, around 16%, have opted to get the current booster, despite recent CDC research showing that it can cut the risk of symptomatic infection by around half. And because fewer and fewer people have opted for a booster shot each time one has been recommended, the appetite could be lower still by next fall.
But Marks, the FDA's vaccine chief, was optimistic that aligning the COVID vaccine on a yearly schedule with the flu shot could encourage more people to make it a routine.
"If we can see the influenza vaccine and the COVID-19 vaccines occurring at the same visit, it facilitates a vaccination program that may lead to more people getting vaccinated and being protected, and reducing the amount of disease we see," Marks said.
"So I think that overall, this seems like a reasonable way to go."
ABC News' Nicole Wetsman and Youri Benadjaoud contributed to this report.