Apple Inc. (NASDAQ:AAPL) stock has had a troublesome tryst with technicals as of late. The company's share price has been moving lower in a downward broadening channel (bearish megaphone pattern) throughout 2022. While some investors may see Apple trading ~20% below its all-time highs as a great dip buying opportunity, others may be concerned about further downside risk in the stock. The megaphone pattern often results in significant moves (to the upside or downside depending on the bullish or bearish nature of the megaphone), and hence, investor caution here is warranted.
Despite a challenging macroeconomic backdrop and extremely tough comps, Apple's business performance has remained steadfast so far this year. However, things may be about to change soon, with recent production issues in China (i.e., strikes at Foxconn's iPhone plant and subsequent scale-up problems) estimated to create a shortfall of ~5-20M iPhones, which would, in turn, lead to lower sales and profits. While these fluctuations are temporary in nature, the need for Apple to diversify its supply chain away from China has become obvious, and this transition could lead to lower margins in the future. But I digress.
In this note, we will focus solely on Apple's technical chart to gauge the next big move in Apple's stock.
As I said earlier, Apple's stock chart is showing a bearish megaphone pattern, which could portend outsized moves to the downside - like the one we saw in the S&P 500 (SPY) back in 2008-09. Back in mid-2008, SPY was sitting ~20% off its highs and displayed a similar pattern to what we see currently in Apple's stock. What followed next for the S&P 500 was a ~50% decline!
Now, I am not saying that Apple will slide down by 50% from current levels just because the S&P 500 did so back in 2008-09. All I am saying is that such a megaphone pattern could result in a significant move to the downside. Now, let us look a little deeper into Apple's chart.
Within the bearish megaphone pattern, Apple has formed a symmetrical triangle pattern in recent weeks. A symmetrical triangle forms when a security's price swings higher and lower in a narrowing range while the overall trend moves sideways, as buyers and sellers battle for control. It is a consolidation pattern that usually forms during a downtrend, made up of two converging trendlines - one downward sloping and one rising - that tighten around the price action. Once the price breaks out of the triangle, it typically moves in the direction of the prior trend, which in this case is downward. Theoretically, the next big move in Apple's stock should be to the downside on a breakdown of this symmetrical triangle pattern.
Now, technical analysis is all about probabilities, and Apple could break down from here or break out to the upside and run higher! As investors or traders, it is our responsibility to evaluate the risk/reward of given setups and make informed investment decisions.
In Apple's case, a breakout to the upside from the $150 level could send the stock rallying up to $160-165, i.e., the upper trendline of the downward broadening channel. On the other hand, if Apple's stock suffers a breakdown of the $134 level, it could quickly slide down to the $80-110 zone. Clearly, the downside risk is far greater than the upside potential in the near term.
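To put rough numbers on that claim, here is a quick back-of-the-envelope sketch in Python using the approximate levels quoted in this note (a ~$141 share price, the $160-165 breakout target, and the $80-110 breakdown zone). Taking simple midpoints is my own simplification, not part of the chart analysis:

```python
# Back-of-the-envelope risk/reward from the levels quoted in the text.
# Midpoints are a simplification; real targets depend on the trendlines.
price = 141.0                      # approximate current share price
upside = (160 + 165) / 2 - price   # gain to the midpoint of the breakout target
downside = price - (80 + 110) / 2  # loss to the midpoint of the breakdown zone

print(upside)                        # 21.5
print(downside)                      # 46.0
print(round(downside / upside, 1))   # 2.1 -> roughly 2x more downside than upside
```

On these midpoint figures, the potential loss is a bit over twice the potential gain, which is what "downside risk is far greater than upside potential" cashes out to.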
Alright, it's time to zoom into the chart and see what's going on underneath the surface. As you can see below, all of Apple's recent local peaks have been rejections at the 200-DMA level, which is acting as strong resistance for the stock. This is a bearish signal, indicating that Apple's stock lacks momentum.
In my view, the resolution of the symmetrical triangle pattern is likelier to be a breakdown to the downside than a breakout to the upside. Technically, the near-term (< 12 months) outlook for Apple's stock is bearish.
As an investor, I like to buy and hold companies for the long haul, and Apple is an incredible business to own. That said, I have reservations about its current valuation, as shared in multiple articles in the past. Apple is one of the strongest consumer discretionary businesses on the face of this planet; however, it ain't a recession-proof business or stock and shouldn't be priced like a utility (as it's being priced right now).
Today's technical analysis exercise showed that Apple's stock is ripe for a big near-term move within the bearish megaphone pattern formed on the chart. A breakout to the upside from the symmetrical triangle pattern (~$150 level) could send Apple's stock soaring up to ~$160-165. On the other hand, a breakdown of the $134 level could send the stock tumbling down to the $80-110 zone. Clearly, the risk/reward here is tilted in favor of bears. Also, we saw that Apple's stock has been rejected at the 200-DMA level multiple times in recent weeks, and that's a bearish signal.
If you follow my work, you know that I do not take short positions in individual stocks, and I am not going to start now (especially with Apple). However, I see a precarious technical setup in Apple's chart, and I wouldn't buy here. The conclusion of today's technical analysis is in alignment with my fundamental view on Apple's stock, which can be found in this note: Is Apple Stock A Good Pick If The Economy Enters A Recession?
Key Takeaway: I rate Apple "Avoid/Neutral/Hold" at $141.
Thanks for reading, and happy investing. Please share your thoughts, questions, and/or concerns in the comments section below.
It’s rare to see people get excited about accessibility, but inside Apple’s Developer Academy in Naples, the sense of enthusiasm is palpable.
Sandwiched between weathered buildings in the suburb of San Giovanni a Teduccio, a neighbourhood that’s been nicknamed “the Bronx”, are a fresh cohort of more than 250 future iOS developers, eager and hungry to learn and build apps that make a positive impact on the world.
The decked-out halls in the University of Naples Federico II are disparate from the surrounding locality. Like a little Silicon Valley hidden away inside Napoli, you’ll find giant screens hooked up to Apple TVs everywhere you look, making it easy to AirPlay content from iPhones and MacBooks in a snap.
Tall sofas encase round tables in pods, providing groups of people with a sense of privacy, simultaneously forcing a sense of intimacy between the students. Then there’s the deliberate positioning of seats around a central column in a seminar room, all arranged in a way that’s supposed to make speakers feel uncomfortable.
In a workshop room, several groups of students are stood, chatting animatedly around large tables, space grey MacBooks and iPads in hand. They’ve just been set a challenge: dissect each other’s apps and weed out any potential issues they might find when it comes to usability and accessibility.
The school in Southern Italy has been a springboard for up-and-coming developers ever since it was launched in 2016. The first of its kind in Europe, it aims to give the next generation of programmers the tools they need to code, design, pitch and market apps for the App Store.
Now in its seventh year, the students – or learners, as Apple affectionately calls them – are just over a month into their year-long app development course, and they’ve already started learning about the importance of designing in a universally accessible way.
As a term, accessibility can be a little nebulous, but it boils down to making something usable by everyone, even if they’ve got a disability. Whether it’s something physical in the real-world – tactile pavements, talking ATMs, buildings – or increasingly commonly the digital world, like apps, websites, games or software, it’s about making sure everyone can use a product without any barriers or limitations.
Sometimes seen as boring or an after-thought, the “A” word can strike fear into the hearts of even the most established developers. But at the academy, Apple’s students are being taught to put accessibility first, from the moment they put pen to paper and start writing a line of code.
Just another layer of compliance to some, it's one of Apple's core values and a principle that the company has been championing for years.
Apple opened its first office of disability in 1985, five years before the Americans with Disabilities Act came into place.
But it really changed the game in 2009 when it ported VoiceOver onto the iPhone 3GS – a text-to-speech engine that worked on a touch-screen smartphone. It was revolutionary at the time because it allowed people with low vision to use a phone without a keyboard or dial pad, at a time when many blind people thought that making a touch-screen accessible was frankly impossible.
After the App Store rolled out, Apple also made sure that its burgeoning ecosystem of third-party developers had all the tools they needed to make their apps usable by as many people as possible. Apple’s students are taught how to use these tools right from the beginning of their one-year intensive course, instilling in them an immediate sense of importance around universal design.
Apple’s software development kit, which helps developers programme apps in Swift and Objective-C, does as much of the heavy lifting as possible when it comes to implementing accessibility elements into their apps. As an example, it bakes the accessibility inspector directly into XCode, the company’s software suite.
For Apple, that’s where it starts. “In downloading that software developer kit and understanding all of the components that Apple provides for making a great app, accessibility is built-in,” says Sarah Herrlinger, Apple’s global head of accessibility. “When our developer relations team talk to developers, [accessibility] is a big piece of the discussion of how you take an app from good to great,” she tells The Independent.
Back in the workshop room, students are scrutinising every detail of each other’s apps, interrogating all the accessibility elements: Are images and videos accessible? Do colours properly contrast for those with colour blindness? Why doesn’t an app work correctly with VoiceOver?
The students in the workshop here are all graduates who have gone through the academy process previously, but have been invited back for another year to help develop apps for non-governmental organisations. It's real work, where they'll help to solve real-world problems.
There are now over a dozen Apple developer academies across the globe. The company opened schools in South Korea and Detroit in 2021. At each hub, accessibility takes centre stage in the first few months of the course.
While not every student at the academy goes on to develop an app, several of the apps that have been built focus on making the world more accessible for people with disabilities. Roughly 200 apps have been developed since the academy first launched in Naples in 2016. Today, some of the alumni are demonstrating their apps to The Independent.
There’s an app called Hear Me Well. It uses your iPhone’s microphone to amplify the sounds and voices around you, piping that boosted sound through a pair of headphones or earbuds. It’s potentially useful for those with reduced hearing, whether they’re in a crowded bar or home in front of the TV.
There’s TruSteppy, an app that takes advantage of the iPhone’s TrueDepth front-facing camera to help people with low vision detect obstacles in front of them, vibrating when there’s an obstacle in the way.
Then there’s Dusk. This one’s not an accessibility aid, but an app packed with a series of mini games that are fully accessible for the blind, but can be played by anyone. Apple earlier this year released an open-source accessibility plugin that would make it easier for developers coding in Unity to make their games accessible.
These aren’t always features technically intended to be used for accessibility purposes, but the graduates have broken them apart and thought about them in interesting ways. The TrueDepth camera wasn’t intended to be turned outward and used to detect obstacles, for example, but the developer has repurposed the camera’s ability to build depth maps and capture infrared images to do just that.
Apple implemented Live Listen in 2018, which does essentially the same thing as Hear Me Well, and added a "conversation boost" accessibility feature to the AirPods in 2021, which uses the beam-forming microphones on the AirPods to boost the voice of the person in front of you. But Hear Me Well works with all headphones, and can conduct an audiometric test inside the app.
An iOS feature might not have been intended to have an accessibility component attached to it when the feature was first cooked up, but those boundaries are being pushed by Apple’s grads, and accessibility is being thought about in innovative ways.
VoiceOver and beyond
The integration of VoiceOver in the iPhone in 2009 was just the start. Apple’s list of accessibility features is constantly growing. It’s easy to spend weeks scrolling through the accessibility settings, playing with all the features, discovering all the ways you can use its suite of products, from an iPhone to an Apple Watch, and still only ever scratch the surface.
There are features for people with physical and motor difficulties. Switch controls and voice control features help people use their iPhone with a separate peripheral, or by simply talking out loud. Another feature alerts hard of hearing users through their iPhone if it recognises the sound of a fire alarm, running water or a baby crying, for instance.
Some accessibility features have also been adopted by non-disabled people. Back in the pre-iPhone X days, when iPhones had a home button, it was common for users to turn on Assistive Touch if their home button broke, because it allowed them to jump back to the home screen without having to click the physical button.
In iOS 14, Apple introduced another physical accommodation accessibility feature called Back Tap. Intended to help people with motor difficulties, it was quickly embraced by non-disabled individuals, making it easy to quickly launch complicated shortcuts, open apps or turn on specific settings.
Apple has had a hidden magnifier inside the Settings app for some time, but in 2021 it rolled out a dedicated app, installed by default on all iPhones.
When the iPhone 12 Pro launched with a LiDAR sensor in 2020, the company brought out a new feature called People Detection. It uses the People Occlusion feature in Apple's ARKit in combination with AI, VoiceOver and the iPhone's LiDAR sensor to identify the distance between the user and the person in front, simultaneously offering up a description of what that person looks like.
This year, the company released a Door Detection feature. It helps users detect the distance between them and the door, telling them whether it’s open or closed, and whether it’s a push, pull or a twist of the knob. It’s also recently launched a live captions feature. Still technically in beta, it transcribes spoken and on-device audio in real-time. Plus, there are new features for controlling your Apple Watch using gestures or remotely via a paired iPhone.
When Apple launched the People Detection feature, it also created APIs for developers so that they could make use of the feature inside their own apps, taking advantage of the new LiDAR functionality on the iPhone. AR is a promising area in the tech world, but it has yet to be fully exploited when it comes to accessibility, and right now, people still have to tether themselves to their phones to take advantage of the tech.
“I think we will have to see what AR brings given the form factors we have available today. Being able to take advantage of LIDAR has been a real gamechanger for the blind community, but I think time will tell where LIDAR goes, whether that’s people who would ask to put into other models or other things, so I think we’ll see,” says Herrlinger.
She notes that she has already seen interesting developments in augmented reality for those on the autism spectrum, and imagines virtual reality implementations where wheelchair users, who might never have the chance to scuba dive, can swim with blue whales. “I think the sky’s the limit at the end of the day on technology.”
Nothing about us without us
There is a phrase popular in disability rights activism circles: nothing about us without us. In the context of disability, it means that whenever something is made for disabled people, it should have the input of disabled people. Without it, and still all too often, projects score media coverage, assistive tech devices are reinvented over and over, and are sometimes unmasked as mere vapourware.
Over the years, non-disabled people have attempted to make assistive aids for disabled people, but have been criticised for not consulting disabled people when developing those aids. "One thing that's pretty recurrent to me are apps that facilitate interactions for sign language. Those videos go viral on Instagram and LinkedIn," remarks Olivier Jeannel, who is profoundly deaf and the founder of Rogervoice, an app that helps facilitate phone calls for people who are hard of hearing. Rogervoice provides closed-captioned conversations on the fly, empowering deaf people to have conversations over the phone.
“There are very talented developers working on sign language recognition, but they’re not for deaf people. And it’s pretty obvious that they haven’t consulted the deaf community on the topic,” Jeannel adds.
When he founded Rogervoice through a Kickstarter campaign in 2014, he, of course, was his own target market, but he also contacted his local association of deaf and hard of hearing people, who put him in touch with others to provide feedback on his prototypes. “That was key,” he says. “Apps should always have the end user in mind. It’s not at the start, it’s not at the end, it’s throughout the lifecycle of an app.”
While it’s unclear if there are any students who identify with a disability in this year’s cohort of Apple Academy developers, the students are encouraged to work and engage with people who might become users of their apps when developing their ideas.
The academy has close ties to local NGOs, for example, with students working on apps designed for the blind in collaboration with the Italian Union of the Blind and Partially Sighted.
Ultimately, for Apple, accessibility isn’t something that’s bolted on to the back of a product, hoping it ticks all the usability checkboxes required by law. At the academy, it’s clear that the design principle runs through the company’s veins.
Accessibility isn’t something that Apple has to do; accessibility is something the company believes it should do. “Our philosophy as a company is that accessibility is a basic human right,” Herrlinger affirms. “For me, it’s not so much about it being inspirational as it is something that I believe we should all be championing for – internally and externally – as much as we can.”
Education pundits are a lot like the guy in the "Distracted Boyfriend" meme.
They're walking with teachers but looking around at the first thing possible to replace them.
This weekend it's AI chatbots.
If you've ever had a conversation with Siri from Apple or Alexa from Amazon, you've interacted with a chatbot.
Bill Gates has already invested more than $240 million in personalized learning and called it the future of education.
And many on social media were ready to second his claim when ChatGPT, a chatbot developed by artificial intelligence company OpenAI, responded in seemingly creative ways to users online.
It answered users' requests to rewrite the '90s hit song "Baby Got Back" in the style of "The Canterbury Tales." It wrote a letter to remove a bad account from a credit report (rather than using a credit repair lawyer). It explained nuclear fusion in a limerick.
It even wrote a 5-paragraph essay on the novel "Wuthering Heights" for the AP English exam.
Josh Ong, the Twitter user who asked for the Emily Bronte essay, wrote, "Teachers are in so much trouble with AI."
But are they? Really?
Teachers do a lot more than provide right answers. They ask the right questions.
They get students to think and find the answers on their own.
They get to know students on a personal level and develop lessons individually suited to each child's learning style.
That MIGHT involve explaining a math concept as a limerick or rewriting a '90s rap song in Middle English, but only if that's what students need to help them learn.
It's interpersonal relationships that guide the journey and even the most sophisticated chatbot can't do that yet and probably never will have that capacity.
ChatGPT's responses are entertaining because we know we're not communicating with a human being. But that's exactly what you need to encourage the most complex learning.
Human interaction is an essential part of good teaching. You can't do that with something that is not, in itself, human—something that cannot form relationships but can only mimic what it thinks good communication and good relationships sound like.
Even when it comes to providing right answers, chatbots have an extremely high error rate. People extolling these AIs' virtues are overlooking how often they get things wrong.
Anyone who has used Siri or Alexa knows that—sometimes they reply to your questions with non sequiturs or a bunch of random words that don't even make sense.
ChatGPT is no different.
As more people used it, ChatGPT's answers became so erratic that Stack Overflow—a Q&A platform for coders and programmers— temporarily banned users from sharing information from ChatGPT, noting that it's "substantially harmful to the site and to users who are asking or looking for correct answers."
The answers it provides are not thought-out responses. They are approximations—good approximations—of what it calculates would be a correct answer if asked of a human being.
The chatbot is operating "without a contextual understanding of the language," said Lian Jye Su, a research director at market research firm ABI Research.
"It is very easy for the model to supply plausible-sounding but incorrect or nonsensical answers," she said. "It guessed when it was supposed to clarify and sometimes responded to harmful instructions or exhibited biased behavior. It also lacks regional and country-specific understanding."
Which brings up another major problem with chatbots. They learn to mimic users, including racist and prejudicial assumptions, language and biases.
For example, Microsoft Corp.'s AI bot 'Tay' was taken down in 2016 after Twitter users taught it to say racist, sexist, and offensive remarks. Another developed by Meta Platforms Inc. had similar problems just this year.
Great! Just what we need! Racist, sexist Chatbots!
This kind of technology is not new, and has historically been used with mixed success at best.
ChatGPT may have received increased media coverage because its developer, OpenAI, was co-founded by Tesla Inc. CEO Elon Musk, one of the richest men in the world.
Eager for any headline that didn't center on his disastrous takeover of Twitter, Musk endorsed the new AI even though he left the company in 2018 after disagreements over its direction.
However, AI and even chatbots have been used in some classrooms successfully.
Professor Ashok Goel secretly used a chatbot called Jill Watson as an assistant teacher of online courses at the Georgia Institute of Technology. The AI answered routine questions from students, while professors concentrated on more complicated issues. At the end of the course, when Goel revealed that Jill Watson was a chatbot, many students expressed surprise and said they had thought she was a real person.
This appears to be the primary use of a chatbot in education.
"Students have a lot of the same questions over and over again. They're looking for the answers to easy administrative questions, and they have similar questions regarding their subjects each year. Chatbots help to get rid of some of the noise. Students are able to get to answers as quickly as possible and move on," said Erik Bøylestad Nilsen from BI Norwegian Business School.
However, even in such instances, chatbots are as yet expensive to install, run and maintain, and (as with most EdTech) they almost always collect student data that is often sold to businesses.
Much better to rely on teachers.
You remember us? Warm blooded, fallible, human teachers.
The best innovation is still people.
STOCKHOLM (Reuters) - Ericsson said on Friday it had struck a global patent licence agreement with Apple, ending a row over royalty payments for the use of 5G wireless patents in iPhones.
The Swedish telecoms equipment maker said the multi-year deal included global cross-licences for patented cellular standard-essential technologies, and granted certain other patent rights.
"The settlement ends all ongoing patent-related legal disputes between the parties," it said in a statement.
The deal comes after Ericsson in January filed a second set of patent infringement lawsuits against the U.S. maker of iPhones.
Both companies had already sued each other in the United States as negotiations failed over the renewal of a seven-year licensing contract for telecoms patents first struck in 2015.
Ericsson sued first in October 2021, claiming that Apple was trying to improperly cut down the royalty rates. The iPhone maker then filed a lawsuit in December 2021, accusing the Swedish company of using "strong-arm tactics" to renew patents.
Ericsson on Friday forecast fourth-quarter intellectual property rights (IPR) licensing revenues of 5.5 billion-6.0 billion Swedish crowns ($530.3 million-$578.5 million), including the effects of the settlement and ongoing IPR business with all other licensees.
($1 = 10.3720 Swedish crowns)
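As a quick sanity check, the dollar figures convert cleanly back into crowns at the stated rate; a minimal Python sketch:

```python
# Convert the quoted USD guidance back into Swedish crowns
# at the stated rate of 10.3720 crowns per dollar.
rate = 10.3720
low_sek = 530.3e6 * rate   # lower end of the guidance, in crowns
high_sek = 578.5e6 * rate  # upper end of the guidance, in crowns

print(round(low_sek / 1e9, 1))   # 5.5 (billion crowns)
print(round(high_sek / 1e9, 1))  # 6.0 (billion crowns)
```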
(Reporting by Anna Ringstrom, Editing by Terje Solsvik and Mark Potter)
GoodNotes 5 is a popular iPad note-taking app. It allows you to write and draw on "digital paper" with your fingers or the Apple Pencil.
The app really focuses on creating handwritten, searchable notes and markups, so it's a little surprising there's a Mac version of it. How does an app seemingly designed for a tablet translate to a computer? And do the iPad and Mac versions work well together?
We decided to answer these questions ourselves and had some ultimately disappointing results. How bad was it? Read on to find out!
We easily downloaded GoodNotes 5 for Mac from the App Store, and it opened without issues. We were off to a good start.
Learning that the free version of the app only came with three notebooks wasn't an immediate turn-off. The notebooks seem to have infinite pages, so being restricted to three of them impacts the organization between the notebooks but not the number of notes you can take.
Seeing the page templates available in the notebooks was also kind of exciting. There are many different line types and formats to choose from, and you can even control the paper color.
The user interface is pretty intuitive here too, and easy to figure out without a tutorial.
Download: GoodNotes 5 for Mac (Free, premium version available)
The problems began once we had a GoodNotes notebook open and were ready to use it. The default note-taking tool was set as the pen tool—a tool designed for writing notes out by hand.
How do you write out notes by hand on a MacBook? Your main options are to click and drag on the page with an external mouse or with the MacBook's trackpad. Neither option went very well for us.
Pulling out a graphics tablet helped us utilize the pen tool better; the other writing/drawing features, like the eraser and the highlighter, worked better too. But writing on a graphics tablet is less intuitive than writing directly on an iPad's screen.
It's also a potentially cumbersome accessory to carry around if you want to take notes during a meeting or in a class while using the Mac app.
The GoodNotes Mac app does have a text box tool that lets you type in a notebook instead of attempting to write things out by hand. But the text boxes don't have any preset margins to fit a GoodNotes page, and they have some odd quirks that we found ourselves correcting constantly rather than focusing on note-taking.
One example: if you place a text box and accidentally adjust its size while trying to move it around, the box will not expand beyond the size you set while you type. That size can be a single word wide, but GoodNotes doesn't care.
Meanwhile, if you don't set the text box size, GoodNotes will let you type right off the page, where it becomes difficult to readjust the size.
These issues with typing notes drove home that GoodNotes really wants to be a handwritten notes app, not a typed notes one. And honestly, if we're typing out notes, we'd rather do it in Microsoft Word or the arguably superior word processor Apple Pages. Even Notes or TextEdit offer better typing options!
The main function of GoodNotes 5—being a digital notebook you can write in and more easily edit than paper—just doesn't make sense on Mac's interface. So as a Mac-only note-taking app, it's a hard pass for us.
Annotating PDFs went fairly well in GoodNotes for Mac. It was easy to upload a file, and highlighting text with a mouse or trackpad was as easy to do here as it is in Preview or with the Markup feature on Mac.
Writing notes by hand onto the PDFs came with the same issues as trying to handwrite full notes, but at least the text box options were pretty easy to implement here.
The main downside to annotating a PDF was that the PDF file we uploaded took up space as a notebook in the app. Since we were using the free version of GoodNotes, we only had room for three notebooks, which meant we didn't have much room for annotating more PDFs or for note-taking.
The full version of GoodNotes is a one-time purchase of $11.99, and it comes with unlimited notebooks plus lots of folder options for organizing PDFs you highlight in the future.
But with free PDF editing options like Preview and Markup already built into Macs, it doesn't seem worth paying for GoodNotes just for that. Granted, paying for GoodNotes gets you the full app on your iPad, iPhone, and Mac. But we wouldn't recommend paying if you plan to only use the app on your Mac.
By now, we've reached the firm conclusion that GoodNotes isn't really an app designed for the Mac. It made a lot of sense for the iPad, though—would the Mac version of GoodNotes do better if used in tandem with the iPad version? Let's find out.
GoodNotes synced between our MacBook and iPad via iCloud, with both devices logged into the same Apple ID and on the same Wi-Fi network.
Within these conditions, we found that we could make a mark in a GoodNotes notebook on one device and see that pop up on the other in 10 to 30 seconds. The more writing or drawing, the longer the synchronization took.
This was pretty fast in our book, and it was nice to keep the same notebook open on both devices simultaneously without any issues. The notebook pages also keep the same formatting on both devices, so there aren't any surprises when you go between.
Despite this synchronicity, though, we found using the two apps at once wasn't particularly useful. In the end, it was easier just to use the iPad version for everything.
Since GoodNotes boasts that its software can read handwriting and make it searchable, something we thought could be useful was writing out notes on the iPad and then using the Mac GoodNotes app to find where in the notebook we'd written something.
This ended up not being possible. The Mac version could search typed text, but not handwritten text, whether the writing originated on the iPad or the MacBook.
GoodNotes does state on its site that handwriting recognition comes only with the full version of the app, yet our free iPad version was able to perform handwriting searches. So perhaps only the Mac app is limited.
We also found typing notes in the Mac app and then drawing figures with the iPad was easy enough to do, but it was also possible to type notes on the iPad and leave the Mac out entirely.
Certainly, we could imagine scenarios where we'd type a bunch of text boxes, and arrange them and make drawings and corrections later on the iPad. But for a lot of note-taking, it'd just be easier to use one device.
It also wasn't easy to rearrange drawn elements once they were on a page, no matter which device we used. The selection tool moved drawn and handwritten content together with the typed text, so it wouldn't be easy to correct or reorganize notes made on the iPad in the Mac version either.
Though we can definitely say that GoodNotes 5 is a great iPad app (particularly if you have an Apple Pencil), it is not an app that was in any way designed for use on a Mac. The Mac version of GoodNotes is just the iPad app slapped onto a computer. And it just doesn't work.
We're disappointed in this finding, as it's always great to find apps that work well on several devices and that can be used in tandem on those devices. But we'll comfort ourselves knowing better apps exist and continue using GoodNotes as it was intended to be—on a tablet!
Edjust recently raised its pre-seed investment round and aims to benefit a global audience through its presence.
Amid the reported chaos and slowdown in the $350B global edtech industry for students, the Indian startup Edjust, led by three young founders, Dushyant, Anmol and Siddhesh, looks like a timely solution to bring momentum back to the industry.
Edjust bills itself as the world's first user-centric edtech aggregator, where users can discover just the right education courses for their young champions: coding, robotics, Web3, astronomy, languages and more, from among hundreds of course options offered by edtech platforms around the world.
Parents can compare these edtechs apples to apples on different parameters, read reviews and book free demo classes with multiple edtechs from a single place before making any decision.
This eliminates the hassle, on the parents' and students' end, of visiting several edtech websites to figure out the best course.
While this saves parents significant time, it also helps them identify the right course for their children based on their needs.
With its tagline 'Don't Adjust, Go Edjust', the founders are on a mission to help 100 million parents make the right decision over the next three years, as e-learning is an inevitable part of Gen Z's journey to becoming future leaders.
Asked about the rising negative sentiment and slowdown in the edtech segment, the founders said: “The slowdown is a temporary correction and is majorly because of inflated prices of courses and the way they are pushed to the users. The inflation in prices is majorly because of the high number of edtechs targeting the same audience. With the lack of other channels to acquire new users, the edtechs are bleeding on the cost of acquisition of users. Having said that, Edjust is going to help edtechs lower their cost of acquisition by up to 50%, which in turn will lower course prices, making them affordable, and set the industry in the right orbit towards growth again.”
With these edtech problems persisting across the international community as well, the founders eye an aggressive international expansion throughout 2023.
With its ongoing development of a blockchain-based system to create an unbiased platform and bring trust back to the industry, Edjust seems to have packed just the right recipe for much-needed growth in the global education sector.
Apple has arguably changed our lives more than any other company in the world during the past two decades or so. But aside from its digital devices such as iPhones, laptops, watches and operating systems, is there another direction it could go in?
The somewhat tentative answer to that has been transport, in the form of electric self-driving vehicles.
Is Apple gearing up to challenge electric vehicle market leader Tesla, and what progress has been made so far?
Here's what we know:
An Apple-branded car has been mooted for some years now, with sporadic reports of progress being made.
However, the Cupertino-based company has always remained tight-lipped about how far it has progressed and what exactly it plans to do, beyond stating it is working on self-driving systems.
“We have an incredibly talented team working on autonomous systems … some groups are being moved to projects in other parts of the company, where they will support machine learning and other initiatives across all of Apple,” the company said in 2019.
In 2021, chief executive Tim Cook told The New York Times that self-driving technology was being worked on but was reluctant to share specific details.
“In terms of the work that we are doing [in that field], obviously, I am going to be a little coy on that,” Mr Cook said.
“The autonomy itself is a core technology … If you step back, the car, in a lot of ways, is a robot. An autonomous car is a robot. And so there’s lots of things you can do with autonomy. And we will see what Apple does.”
Apple secretly started its automated and EV development — Project Titan — in 2014, and hired key executives from Tesla to propel its self-driving and EV initiatives.
Reuters reported in December 2020 that Apple was aiming to have a car on the roads in 2024, while Bloomberg last year reported a prospective date of 2025.
However, that target was said to be dependent on the company’s ability to complete the self-driving system.
That timeline no longer stands, according to a new report by Bloomberg, which says Apple has scaled back its ambitious self-driving plans and postponed the car's target launch date to 2026. The iPhone maker's shares fell 2.4 per cent after the report.
There are also going to be some design changes.
There has been no official word from Apple yet, so it is a case of using your imagination to some extent.
What is known is that Apple filed a patent with the US Patent & Trademark Office in 2017 for a “VR system for vehicles that may implement methods that address problems with vehicles in motion that may result in motion sickness for passengers”.
The patent filing said: “The VR system may provide virtual views that match visual cues with the physical motions that a passenger experiences. The VR system may provide immersive VR experiences by replacing the view of the real world with virtual environments.
“Active vehicle systems and/or vehicle control systems may be integrated with the VR system to provide physical effects with the virtual experiences. The virtual environments may be altered to accommodate a passenger upon determining that the passenger is prone to or is exhibiting signs of motion sickness.”
So, if passengers inside the vehicle are immersed in VR, is there any need for windows?
Apple's ideal car would have no steering wheel or pedals, and its interior would be designed around hands-off driving, Bloomberg reported in November.
This sounds much like the Tesla robotaxi, which is currently under development. Tesla chief Elon Musk has confirmed the lack of pedals and steering wheel for the robotaxi, which makes it sound more like a train than a car.
“It is going to be highly optimised for autonomy — meaning it will not have steering wheel or pedals. There are a number of other innovations around it that I think are quite exciting, but it is fundamentally optimised to achieve the lowest fully considered cost per mile or kilometre when counting everything,” Mr Musk said last month.
“I think [the robotaxi] really will be a massive driver of Tesla’s growth.”
Bloomberg reported that one option Apple discussed features an interior similar to the one in the Lifestyle Vehicle from Canoo. In that car, passengers sit along the sides of the vehicle and face each other like they would in a limousine.
However, Apple is now planning a less-ambitious design that will include a steering wheel and pedals and only support full autonomous capabilities on motorways, according to Bloomberg.
The company plans to develop a vehicle that lets drivers conduct other tasks on a motorway and be alerted with ample time to switch over to manual control, the report said.
Apple has reportedly hired long-time Lamborghini executive Luigi Taraborrelli to lead the car's design, Bloomberg reported, citing sources.
He has been part of the Italian luxury sports car maker since 2001 and stepped down from his role as the head of research and development for chassis and vehicle dynamics in May, his LinkedIn profile says.
Apple's car project is currently being led by Kevin Lynch, who is also responsible for the Apple Watch and health software teams, and John Giannandrea, Apple's head of machine learning.
An annual study from Strategic Vision, which polled 200,000 new vehicle owners, found that 26 per cent said they would “definitely consider” buying an Apple car, placing it behind only Toyota and Honda.
Twenty-four per cent ticked the top box (“I love it”) when asked their impression of the quality of the brand, beating all others by a wide margin.
Meanwhile, more than 50 per cent of Tesla owners said they would definitely consider a future Apple vehicle. “Everyone should be prepared,” Strategic Vision president Alexander Edwards said.
Apple has had a fleet of 69 Lexus SUVs testing its technology, according to California's Department of Motor Vehicles.
As Tesla has found out, though, safety is a major hurdle to overcome.
Tesla also says it expects “full self-driving” beta test software to be released to all US customers who purchased the feature by the end of the year.
About 100,000 owners are testing the system now on public roads, Mr Musk said. However, it will have to get past regulators once deemed ready.
“Of any technology development I have ever been involved in, I have never really seen more false dawns, where it seems that we are going to break through, but we don't, as I have seen in full self-driving,” he said.
“To solve full self-driving, you actually have to solve real world artificial intelligence, which nobody has solved.
“I think we will achieve that this year.”
More than 1,400 self-driving cars are estimated to be in use in the US.
This has not been without controversy. Last year, a Tesla car in Houston, Texas, was reported to have driven itself into a tree, killing two people.
Tesla chief executive Elon Musk speaks during the opening day of the Tesla Gigafactory in Gruenheide, Germany. EPA
— This article was first published on May 20, 2022
Apple (AAPL) could have a few other problems on its hands in 2023 besides an uncertain manufacturing situation in China, according to one Apple bull on the Street.
"I think there are two main issues lying ahead of Apple for 2023," Oppenheimer analyst Martin Yang said on Yahoo Finance Live (video above). "Number one is we've seen two years of very strong upgrade and replacement cycle for iPhone, ... and that replacement cycle strength may weaken into '23 by two factors. One is the majority of the installed base is most likely upgraded, and then continuing macroeconomic pressure will cause certain groups of consumers to delay their upgrade. Secondly, there are ongoing pressures on Apple's services software revenues."
Apple's stock is already under pressure amid the COVID situation out of a key manufacturing hub in China.
China's COVID-19 cases are surging toward record highs just as the country moves away from its zero-COVID policy. Violent protests erupted at the flagship plant of iPhone assembler Foxconn last week and have intensified across the country in recent days.
"Apple is struggling to overcome a combination of shutdowns and worker protests at a key production facility in Zhengzhou, China, that resulted in Apple negatively pre-announcing on Nov. 6," EvercoreISI analyst Amit Daryanani wrote in a note on Tuesday. "Since then the situation in Zhengzhou appears to have somewhat improved but not back to normal. We think the site has been operating at ~60-70% utilization for nearly a month. To reflect the continued headwinds we are adjusting our [quarterly] estimates lower as iPhone demand could get affected by 5-8 million units (mostly at high-end) and negatively impact revenues by ~$5-8 billion in [the current quarter, which ends after December]."
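As a rough sanity check on those figures, the 5-8 million-unit shortfall maps to the ~$5-8 billion revenue impact only if each lost unit carries roughly $1,000 in revenue. That average selling price is our assumption for illustration, not something stated in the analyst note:

```python
# Back-of-the-envelope check of the reported iPhone shortfall figures.
# The $1,000 average selling price is a hypothetical, chosen because the
# note says the shortfall is "mostly at high-end"; Daryanani does not
# state an ASP.
ASSUMED_ASP = 1_000  # USD per lost unit (assumption)

def revenue_impact(units_millions: float, asp: float = ASSUMED_ASP) -> float:
    """Return the implied revenue impact in billions of USD."""
    return units_millions * asp / 1_000

low = revenue_impact(5)   # 5M-unit shortfall
high = revenue_impact(8)  # 8M-unit shortfall
print(f"Implied revenue impact: ${low:.0f}-{high:.0f} billion")  # $5-8 billion
```

Under that assumption the units and dollars in the note line up, which is consistent with the shortfall being concentrated in the iPhone 14 Pro range.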
Daryanani isn't alone in his near-term concern over Apple's bottom line because of China.
"It has been a gut punch at the worst time possible for Apple," Wedbush Managing Director Dan Ives said on Yahoo Finance Live. "We're talking shortages in a lot of Apple stores of upwards of 30% of iPhones in terms of the iPhone 14."
Ives estimated that Apple now has "significant" iPhone shortages that could wipe off 5% to 10% of units in the current quarter.
Brian Sozzi is an editor-at-large and anchor at Yahoo Finance. Follow Sozzi on Twitter @BrianSozzi and on LinkedIn.
This week, Apple held a number of Today at Apple sessions across Canada, celebrating Computer Science Education Week (CSEW). Partnering with developers and coders, Apple brought in students to sit in on this free program.
The Today at Apple sessions were dubbed ‘Coding Lab for Kids: Code Your First App.’ At Toronto’s Apple Yorkdale, Micheline Khan of Althea Therapy attended to speak and inspire students from Shoreham Public Sports and Wellness Academy. Over on the west coast, Apple Pacific Centre saw Sarah Bolland of Life Lapse do the same with West Vancouver students.
iPhone in Canada was invited to attend the December 5 session at Apple Yorkdale. It was here that we got a chance to see Khan speak to roughly 25 students ranging from Grade 2 to Grade 5. During her panel, Khan spoke about her own upbringing, her motivation to code as a person of colour, and how she became inspired to tackle problems head-on. This eventually led her to found Althea Therapy, an app centred on mental health and wellness.
Recently, Khan was recognized as the Ryerson DMZ's Women of the Year for Women in Technology. Althea Therapy "helps you find culturally responsive mental health professionals in Canada," Khan tells us. "We focus on black, indigenous and racialized therapists." During the Today at Apple session, students were encouraged to chime in with their knowledge of the importance of mental health. Additionally, the Shoreham Public School student leaders were able to share in the excitement of coding and app development.
Much to our surprise, these students, no older than 10 or 11, have a very good working knowledge of how apps integrate with our daily lives. The students involved clearly had a vested interest in coding and an understanding of how Apple's technology can make their visions a reality.
“A lot of schools, Shoreham being one of them, have taken on sort of STEM learning as part of their education within their school,” says Helen Fisher, Superintendent of Education with the Toronto District School Board, who was in attendance. “They’re looking at how do we move from that paper-pencil type of work to what are real-life problems that we could solve?”
Shoreham, for instance, offers iPads for students to use while in class. Starting from more hands-off lessons in kindergarten and working through a curriculum of coding and app development, students' ambitions can be enriched by technology. More importantly, mentorship has a large impact. Shoreham Vice-Principal Patrick Sefa adds that he believes Apple's ecosystem is primed for students to become familiar with coding platforms. "It's definitely much more user-friendly. It's easier for the students to learn, and it's easier for the staff also to learn. It's made it easy to teach and learn different applications."
Apple’s support of CSEW is a ripe opportunity for students who may not normally gain exposure to app development and mentorship. “Even a trip down to Yorkdale Mall is an exciting adventure,” Sefa says. “Some of them do not leave the borders of their community. Having the opportunity to participate in something like this is gonna last a lifetime.”
“I think this event, in particular, is so important, because you're working with kids, and also youth at such a young age,” Khan says. “You don't give them enough credit for how smart they are, just by the questions that they were asking. The feedback was amazing.”
In addition, we sat down with Bolland to discuss Life Lapse and CSEW following her Today at Apple session. Life Lapse is a stop-motion app, designed to create unique and engaging video content. With a filmmaking and video marketing background, Bolland was determined to utilize the Apple ecosystem to bring “democratized creativity” to anyone with an iPhone.
Her session was held at Apple Pacific Centre with West Vancouver students, offering mentorship. “It is really inspiring,” Bolland told us. “As a business owner, sometimes you just get so focused on your own little problems and business… I loved meeting all the girls; they're way smarter than me.”
When asked how individuals can take the first leap into coding and app development, especially at a younger age, Bolland says, “There are so many free resources online. Even in the App Store, like Swift Playgrounds, anyone can get started and start learning. I'd say just dive in. Don't overanalyze or think too much, which is a blessing and a curse sometimes. Constantly learn from all the mistakes, throw spaghetti at the wall, that sort of thing.”
Apple's iPads and Macs offer support for Swift and Xcode, both of which can be used by teachers to immerse students in the world of coding. Apple also offers its Everyone Can Code and Develop in Swift resources. Beyond students, aspiring entrepreneurs and developers can hone their skills via the Apple Developer Academy. This resource was first established in Brazil in 2013. Currently, the academy operates across 17 global locations, including Detroit; Naples, Italy; and Pohang, South Korea.