Placement Test Practice
Being prepared is the best way to ease the stress of test taking. If you are having difficulty scheduling your Placement Test, please contact the UNG Testing Office.
Practice, practice, practice before your scheduled Placement Test.
If you have a physical or learning disability, contact Disability Services promptly to discuss whether or not you are eligible for accommodations at UNG for your Placement Test.
The night before your Placement Test, get a good night’s sleep. This is not an ideal time to stay out late with friends. Take a rain check on that.
Make certain you eat something nutritious before your test. Your brain cannot work on low fuel. Also make certain you are hydrated. Good choices are fruit, yogurt, almonds, or peanut butter crackers and water.
Make certain you have directions to the test site and that you allow sufficient time to get lost at least once, search for a parking space, use the restroom, check in with the testing administrator, and take some calming breaths before the test begins.
The testing environment tends to be a little cool, so please bring a sweater with you, just in case. Scratch paper and pencils are provided by the testing administrator. You will not be permitted to use an outside calculator or your cell phone in the testing center; lockers are provided to secure your personal items.
If you are required to take more than one subject area test, the order in which they are administered is: Writing 1st (if required); Reading 2nd (if required); and Math 3rd (if required). All of this is completed on a computer in one session.
Please read all test directions carefully. We often see students scoring low on the Placement Test simply because they rush through the directions and then don’t fully understand or follow them. The directions for each test are different, so take the time to read through each set of instructions.
The Placement Tests are not timed (with the exception of the WritePlacer). Please do not be concerned if other students are finishing before you. They may not be required to take the same number of tests as you. They also may have a different set of math problems or different reading passages than you. Nobody is your pace car and this isn't a race to the finish line.
Following University System of Georgia policy, UNG will use your Next Generation Accuplacer scores to determine placement into or out of Learning Support. Students who score below 243 on the reading test (scored on a 200-300 point scale) and/or below 4 on the WritePlacer (scored on a 0-8 point scale) will have a Learning Support English requirement at UNG. Students who score below 258 on the Quantitative Reasoning, Algebra, and Statistics (QRAS) test (scored on a 200-300 point scale) will have a Learning Support math requirement at UNG. Students scoring between 258 and 265 will have a Learning Support math requirement at UNG if their major requires College Algebra, MATH 1111, either as a core requirement or as a pre-requisite for a core math requirement. Your scores do not determine admissibility but, rather, determine placement. For more information about Learning Support you can read about it on the Learning Support Website.
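For readers who follow rules more easily as logic than as prose, here is a minimal Python sketch of the placement cutoffs described above. The function name and the "major requires College Algebra" flag are illustrative assumptions, not part of any UNG system, and the official determination is always made by the university from your actual scores.

```python
def learning_support_requirements(reading, writeplacer, qras, major_requires_college_algebra):
    """Sketch of the placement rules described above (illustrative only).

    reading and qras are Next Generation Accuplacer scores on a 200-300 scale;
    writeplacer is scored 0-8.
    """
    requirements = []

    # English: below 243 on the reading test and/or below 4 on the WritePlacer
    if reading < 243 or writeplacer < 4:
        requirements.append("Learning Support English")

    # Math: below 258 on QRAS always triggers the requirement; 258-265 triggers
    # it only if the major requires College Algebra (MATH 1111) -- the exact
    # boundary handling here is an assumption.
    if qras < 258 or (258 <= qras <= 265 and major_requires_college_algebra):
        requirements.append("Learning Support math")

    return requirements

print(learning_support_requirements(250, 5, 260, True))  # ['Learning Support math']
```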
Upon receiving an Admissions decision on your Check Application Status site, please contact Dede deLaughter (by email or by phone at 706-310-6207) with any questions you may have about your test results and/or any retesting options.
If you do place into Learning Support, please understand that this is not punitive but, rather, in your best interest. UNG strives to support you in achieving your academic goals, and that includes starting you in the appropriate level of instruction that sets you up for academic success.
If you have a red yes in any Placement Test Required row on your Check Application Status page in Banner, read the information below relating to the area in which you have the red yes.
Click on the Register NEW Account button. Look on your Check Application Status page for the School Number and School Key. After you register, you will be issued a username and password. SAVE this information for future log-in access!
Don’t rush. You will have one hour to write your essay. Your essay will be evaluated on organization, focus, development and support, sentence structure, and mechanical conventions.
Use the provided scratch paper to do your brainstorming.
Using your brainstorming ideas, write your essay.
BEFORE YOU SUBMIT your essay, PROOFREAD!
Proofreading tips:
Make sure your essay actually answers the prompt.
If you used examples, make certain they are clear and concise.
Avoid using impressive-sounding vocabulary unless you are absolutely certain you are using those words correctly.
Re-read every paragraph carefully. You know what you meant to say. Will the reader know what you meant to say?
Re-read your entire essay from the bottom to the top, one sentence at a time. This is a good way to catch grammar errors, like sentence fragments, subject-verb agreement errors, punctuation errors, and the like.
Be sure to review the information in the preparation modules before and between practice exams to improve your score!
Practice the Test
Visit the free Longsdale Publishing Accuplacer practice site. Then click on the Register NEW Account button. Look on your Check Application Status page for the School Number and School Key. After you register, you will be issued a username and password. SAVE this information for future log-in access!
Select the Quantitative Reasoning, Algebra, and Statistics (QRAS) test.
Be sure to review the information in the preparation modules before and between practice exams to improve your score!
Take your time. Answering the math problems as accurately as possible could save you time and money in the future if you are not required to take an extra math class.
Check your answers. Remember, in a multiple choice test, there are usually at least two wrong answers that are based on the most common miscalculations or process errors.
Calculators and scrap paper. Certain placement test problems will allow you to use the built-in calculator. If not, use the scratch paper that will be provided. (This scratch paper will be collected before you leave the test, so don’t make your to-do list on it.)
Practice the Test
Visit the free Longsdale Publishing Accuplacer practice site. To set up your free account, first obtain the UNG School Number and School Key by emailing learningsupport@ung.edu - preferably from your UNG email account, providing your student ID (900) number. Then click on the Register New Account button. After you register, you will be issued a username and password. Save this information for future log-in access!
Select the appropriate Eligibility Exam: QRAS if you are attempting eligibility to take College Algebra, MATH 1111; Advanced Algebra and Functions if you are eligible to take MATH 1111 and are attempting eligibility to take Pre-calculus, MATH 1113, Brief Calculus, MATH 2040, or Calculus I, MATH 1450.
Be sure to review the information in the preparation modules before and between practice exams to improve your score!
Check your answers. Remember, in a multiple choice test, there are usually at least two wrong answers that are based on the most common miscalculations or process errors.
Calculators and scrap paper: certain placement test problems will allow you to use the built-in calculator. If not, use the scratch paper that will be provided. (This scratch paper will be collected before you leave the test, so don’t make your to-do list on it.)
Save the Children experiences donor growth with Adobe
When Russia invaded Ukraine last February, there was a race against the clock to get aid to the country. UK charity Save the Children was able to raise £1.5 million for the Disaster Emergency Committee’s Ukraine Appeal in just two weeks – enough to provide around 150,000 food baskets, 220,000 family hygiene packs and 90,000 emergency first aid kits to those most in need.
If the invasion had taken place some years earlier, the charity’s contribution may not have been able to contribute so much. Back in late 2017, Save the Children rolled out Adobe Experience Cloud as the platform for all its digital content creation, marketing and donations. So when the Ukraine invasion happened, the charity was able to provide personalized homepages using Adobe Target - something which previously would not have been possible and which led to extra donations. Linda McBain, Chief Digital Officer at Save the Children, explains:
Emergencies are our bread and butter so whatever technology we're on, we always need to ensure it does that. What we saw with Ukraine was, we had a higher volume of returning donors to the site. We don't usually see that with emergencies, people often just give a one-off donation. Because we saw that was happening, we put out a test and offered a different experience if you'd already donated to show the impact of your donation and tell that story so it's slightly different. That increased donation revenue from second donors.
Save the Children began the project to update its web platform in mid-2016, prompted by a change in supporter behavior. People no longer wanted to just interact with the charity over the telephone and by post, while online donations were increasing. However, the experience for online donors was often sub-par and donations weren’t supported on all devices.
The charity knew it had to improve the online experience, and set out to develop a digital strategy to recruit and engage more supporters with a best-in-class digital experience. McBain says:
Our technology set-up was quite lo-fi, I often say everything was held together by elastic bands and Sellotape. We knew that interest in Save the Children through digital channels was growing. We wanted to better be able to harness data to provide more relevant and personalized experiences so we could increase long-term support. That wouldn't have been possible with the set-up that we had before we made that decision.
Experience
The organization went out to market and did a full assessment of the options available at the time. Adobe stood out in terms of its ability to deliver on Save the Children’s ambitions, but also as it could integrate across multiple products. The charity was already using Adobe Campaign Manager and had a good experience with that. The new technology platform needed to work well with that technology, because of the element of data personalization Save the Children wanted to provide to supporters.
The Experience Cloud implementation took around 18 months from initial business case to being up and running. Previously, Save the Children didn't have an equivalent to the Adobe platform. It was using the free version of Google Analytics, and the open-source Drupal content management system. McBain adds:
I'm not critical of any of those technologies, but the way we had them set up was not right. It was either a rebuild completely of all of that and a consideration of how we better utilize it, or pay for paid versions, or a move to Adobe. All these technologies are very similar in many ways, but it's about what you want to try and harness or how easily they integrate for the customers.
Save the Children decided to go for the latter option, with an Adobe one-stop-shop for all its digital content, marketing and donation needs rather than piecemeal software. The aim was to build a strong support base for Save the Children by bringing in new supporters and further engaging current support, encouraging donations, and being able to take those donations, upgrade them or get people to donate regularly. McBain says:
Ultimately we want someone to take out a monthly gift with us. That's always our ultimate ambition to have that financial knowledge so that we can help children as best we can.
Digital donors
With the new technology platform in place, the charity was ready for the ensuing push to digital as a result of the COVID pandemic. McBain recalls:
What we saw overnight was a shift in demand to have much more digital experiences, even from those supporters who might have previously been uncomfortable about that. Where 45% of our total mass income was coming in digitally pre-pandemic, now it's at 60-66%. That shift is not abating, in fact it's increased.
Save the Children’s website is now fully optimized for whatever device a user has, with support for Apple Pay, PayPal and SMS donations for both regular giving and single cash donations, as well as via sites like Facebook or Instagram.
After the Adobe rollout, the number of website visitors converted into donors jumped 85%, while those signing up to become regular donors increased by 58%. By using Adobe Target more effectively to do more user testing, Save the Children has also seen incremental growth of £500,000.
As part of its move to digital, Save the Children has invested not only in the technology but also the people aspects. Before the Adobe project, there was just one person managing, maintaining and editing its website from a content perspective; it now has an eight-person UX and content team. McBain says:
It's a big shift. We've gone on a journey beyond technology here. This is about shaping new skills across multiple teams. Our marketers are all expected to be digital-first marketers, we don't call them digital marketers.
Delivery of the program required involving a lot of different people. Save the Children has a model of devolved content creation, which meant any team who manages pages on the site needed retraining to understand how they could edit content on the new platform. McBain notes:
Our CEO's office might want to update a report, our policy and advocacy colleagues want to post content or our media team want to put the press releases out. It really did touch so many areas of the organization and even if they weren't content editing, we needed to sense check all the new content before we ported it over onto the new site.
As well as having digital champions for each team to keep people updated on the project, and a change manager to coordinate all of that, McBain’s team worked closely with internal comms to ensure a smooth rollout:
I ran a series of lunchtime talks every month through the program and brought in experts from loads of different tech companies to also spark inspiration, getting Google in to talk about user research, and design or brand marketing expertise from people like Innocent.
It was really trying to get people up front to be thinking about what it would mean when the new technology landed so that it wasn't like, 'These things turned up like a space shuttle, what do we do with it now?'
Creating a buzz
The project team also sat in a central location in the building, so other staff could see the team and their kanban board on the wall, and create a buzz so people would be more interested. However, this has got McBain thinking about how to replicate that in the new era of virtual and hybrid working:
Nothing has that same level of visibility anymore. That's my experience in the change with remote work. It'll be much harder now than it was being in the building and going around talking to people all the time about it.
Save the Children has been using Experience Cloud for around five years now, and has still got the same kit it originally deployed as it works really well for the organization. While the charity is maturing its use of the tech, one thing it would like to see additional support for is building more experts, according to McBain:
With the move to low and no code, how does Adobe get its MarTech stack to be as simple and usable as possible, with ease of use for as many different staff roles as possible? It's all about simplifying and meaning you can empower staff to do their own work easily without loads of training.
The Best Web Conferencing Software for Remote Workers
Sustainability is becoming a more important concern for businesses than ever. A KPMG report from 2020 showed that 80% of top companies have sustainability reports and a growing number of small businesses are following suit. There is a clear economic reason for this since 92% of consumers state that they are more willing to trust environmentally friendly businesses.
But how do you actually run a green business?
Some of the most important steps you can take involve procuring eco-friendly products and materials and investing in renewable energy.
Keep in mind there is going to be a learning curve. One of the most important things that you need to do is make sure that you have the right software to manage it more easily.
If you frequently conduct online meetings or video calls, you know how important it is to have a reliable and user-friendly tool. But with so many options, it can be overwhelming to figure out the best fit for your needs.
Here are some of the essential features you may need to learn about regarding web conferencing software. Whether you're a green business owner, a remote worker, or just someone who needs to stay connected with colleagues and clients, these features can make a huge difference in your online communication experience.
Screen sharing is essential to any good web conferencing software. It allows you to share your screen with others during a meeting or video call, which can be incredibly useful for various purposes.
Also, screen sharing can be helpful for collaboration purposes. If you're working on a project with a team, you can share your screen to show them your work and get their feedback. This can save a lot of time and hassle compared to explaining things over the phone or messaging.
Many web conferencing tools allow multiple users to share their screens simultaneously, which can be especially useful for team brainstorming sessions or group problem-solving. Overall, screen sharing is an essential feature that can significantly enhance the efficiency and effectiveness of your online meetings and video calls.
On-Demand Webcasting
On-demand webcasting is another helpful feature to look for in web conferencing software. This allows you to record your online meetings or video calls and make them available for viewing at a later time.
This can be incredibly helpful if you have team members or clients who cannot attend the meeting in real time. They can watch the recording at a convenient time and catch up on any critical information or discussions that took place.
On-demand webcasting can also be helpful for training purposes. If you have a new employee or team member who needs to learn about a specific topic, you can record a meeting or training session and share the recording with them.
This can be much more efficient than scheduling multiple training sessions or explaining things over the phone or through messaging.
Simulations and games
Simulations and games can be a fun and engaging way to make your online meetings and video calls more interactive and dynamic. Many web conferencing platforms now offer built-in tools for creating and hosting simulations and games during your sessions.
For example, you might use a simulation to role-play a particular scenario or practice a new process. This can be a great way to help your team learn and prepare for real-world situations.
Games can also be a great way to break up the monotony of a long meeting and add a bit of fun and friendly competition. Use a trivia game to test your team's knowledge on a particular topic or a more interactive game like a virtual escape room to challenge your team's problem-solving skills.
Simulations and games can add excitement and engagement to your online meetings and video calls. They can also be a helpful tool for training and development purposes, as they allow your team to practice and learn in a more interactive and immersive way.
Whiteboard
A whiteboard is another helpful feature to look for in web conferencing software. This allows you to create and share a virtual whiteboard during your online meetings or video calls, which can be incredibly useful for various purposes.
For example, use a whiteboard to brainstorm ideas, sketch out diagrams or flowcharts, or jot down notes. This can be especially helpful if you're working with a team and must collaborate on a project or problem-solve together.
Web conferencing software like Adobe Connect offers a variety of whiteboard tools, such as adding text, shapes, and images and saving and sharing your whiteboard with others. This can be a great way to keep your team organized and on the same page, even if you're in a different location.
These are just a few of the essential features you should be aware of when it comes to getting web conferencing software as a green business. A reliable and user-friendly tool can make a big difference in your online communication experience, so finding one with the right features is critical. This will help you have a work from home program that is both better for the planet and more productive.
Microsoft Edge will soon support Adobe Acrobat PDF technology
Adobe Acrobat PDF capabilities are coming to Microsoft Edge.
Microsoft Edge's built-in PDF reader will be powered by Adobe Acrobat tech, which should deliver more accurate colors and improved performance.
Narration, improved text selection, and improved security will also be available within Edge's PDF reader.
Purchasing an Acrobat subscription will provide access to more features within Edge through a browser extension.
Microsoft and Adobe just announced that the Adobe Acrobat PDF engine will come to Microsoft Edge later this year. Beginning in March 2023, Edge's built-in PDF reader will use the Adobe Acrobat PDF engine. Microsoft promised the shift would result in higher fidelity, more accurate colors and graphics, and improved security in its announcement post. The company also highlighted that the switch would bring better security and greater accessibility.
“PDF is essential for modern business, accelerating productivity in a world where automation and collaboration are more critical than ever,” said Adobe SVP and GM Ashley Still. “By bringing the global standard in PDF experience to Microsoft Edge and the billion-plus Windows users worldwide, Adobe and Microsoft are using our joint heritage and expertise in productivity to take an important step forward in making modern, secure, and connected work and life a reality.”
Microsoft promises that the features of the new Adobe-powered PDF experience will be the same as the current offering. Additional capabilities, such as editing text and images, and converting PDFs to other file formats, will be available to those that sign up for an Acrobat subscription.
The new PDF experience will be available in Edge on Windows 11 and Windows 10.
Microsoft will shift to the Adobe Acrobat engine in phases. The move is scheduled to occur in March 2023. The legacy engine will remain available until March 31, 2024. Switching to the Adobe Acrobat engine will be opt-in for organizations at first. General users will not have the option to revert to the legacy PDF engine after the Adobe Acrobat engine launches. Microsoft has an FAQ section plus other information in a Tech Community post.
Microsoft Edge is the default browser on Windows. It's based on Chromium, so it's compatible with the vast majority of the web. There are several Insider versions of the browser, allowing you to test new features and provide feedback to Microsoft.
What's the difference between the M2 Pro and the M2 Max?

I recently published my review of the MacBook Pro 16 with M2 Max, which Apple released at the beginning of 2023. Now I've finally gotten my hands on the less powerful (and less expensive) MacBook Pro 16 powered by Apple's M2 Pro chip. I've run the various benchmarks and done the various tasks, and I've even (finally) run the battery into the ground. Today, I'm here to talk to you about how these chips stack up to each other to help you decide which MacBook Pro model is right for you.
M2 Pro vs. M2 Max: price
Sixteen-inch M2 Pro models start at $2,499, while M2 Max models start at $3,099. The base M2 Pro model has 16GB of memory and 512GB of storage, while the base M2 Max has 32GB of memory. Bumping that M2 Pro model up to 32GB of memory puts it at $2,899, giving the M2 Max about a $200 premium over the M2 Pro for identical specs. That's a consistent differential regardless of how you tweak the specs.
If you want to jump up to the M2 Max with 38 GPU cores (which is the one I tested), however, that’s going to be an additional $200 premium. This is the only CPU that can be paired with 96GB of RAM — the 30-core M2 Max model maxes out at 64GB.
M2 Pro vs. M2 Max: Battery Life
I averaged close to 14 hours of nonstop use on the M2 Max model with my workload. I can't guarantee everyone will get that amount of time, especially since my workload largely involves Chrome tabs, Google docs, and some streaming, but I do think most people will get more out of the M2 Max than they would out of the vast majority of 16-inch laptops on the market.
I averaged closer to 17 nonstop hours on the M2 Pro model. In practice, this means I can use it on battery for two to three days at a time, plugging it in every so often. While the M2 chips don't have as massive of a battery life delta as their M1 predecessors did, the M2 Pro — for my personal workload — had around 20 percent more juice to a charge than the M2 Max.
For me, this battery life differential wouldn’t be a dealbreaker. At a certain point, long battery is long battery. I generally use a MacBook Air as my personal computer, which averages me around 13 hours between charges, and I don’t feel that I need more. That said, if you’re really seeking a device that never dies, the M2 Pro may be slightly more attractive to you. For everyone else, it should be another part of the calculus: in addition to paying extra dollars for the M2 Max over the M2 Pro, you’re paying some hours of battery life.
M2 Pro vs. M2 Max: CPU
The difference in CPU power between the M2 Pro and M2 Max depends on which model you go for. All 16-inch M2 Pro models have 12 CPU cores. If you go for a 14-inch MacBook Pro, you can get an M2 Pro with 10 CPU cores, but the 16-inch models all have 12 CPU cores, regardless of whether you go for the Pro or the Max.
We had the 16-inch model in both cases, so both of these devices had 12-core CPUs. These chips have the same architecture — so if you’re doing primarily CPU-heavy work during your day, you’ll get very little benefit from the Max over the Pro.
Don’t believe me? Check out the Cinebench and Geekbench results in the chart above, which are quite close together (and identical in some cases).
Now, there's one caveat here. The M2 Pro models max out at 32GB of memory, whereas the M2 Max can handle as much as 96GB. Folks who need to spend more on a 64GB or 96GB model probably know who they are: people who work with big datasets, for example, and are consistently handling large amounts of information may have a better time with one of those (very expensive) M2 Max models.
M2 Pro vs. M2 Max: GPU
The more significant difference between these two chips lies in GPU power. You can basically think of the M2 Max as the M2 Pro with some extra graphic chops.
All 16-inch M2 Pro units have 19 GPU cores, while M2 Max buyers have a choice of 30 or 38 cores. (Our test unit has 38.)
In Geekbench Compute, which tests GPU power, we can see over a 55 percent increase in graphics power, which tracks with the close to 50 percent increase in GPU core count. On Shadow of the Tomb Raider, a GPU-heavy game benchmark, the M2 Max did close to 80 percent better than the M2 Pro. That’s a very visible difference on the 120Hz screens these MacBooks have.
There’s one other major difference between these chips that only some of you will care about, which is that the M2 Max has two video encode engines and two ProRes engines, while the M2 Pro only has one of each. This means that video editors specifically can expect faster encoding and playback speeds from the M2 Max. (Exactly how much faster can be inconsistent — more on that in a minute.)
So those are the raw numbers. How does this all shake out during a workday?
M2 Pro vs. M2 Max: real-world tasks
The M2 Max is noticeably faster in video work. It beat the M2 Pro in PugetBench for Premiere Pro. My experience using it to edit was incredibly speedy. It flew through playback and exports. If I were a video editor (assuming my employer was footing the bill), I’d absolutely want one of these things.
The M2 Pro was not quite as consistent. I exported the same five-minute 4K video a number of times on it and could see a time anywhere from around the two-and-a-half minute mark to over six minutes. The process did not get consistently faster or consistently slower as I went on testing; there was no rhyme or reason for the varying times. What I did notice was that the M2 Pro often got hung up on graphics that the M2 Max was able to breeze right through. Becca Farsace, our senior video producer, had the same experience exporting a different video file.
With that said, the M2 Pro is hardly a slow chip. Adobe Premiere Pro work on it was still quite smooth. The M2 Max was both a bit louder and hotter than the M2 Pro during Premiere work (it has a High Power Mode specifically meant to maximize performance during sustained workloads, which the M2 Pro doesn’t have). Becca’s recommendation in our latest guide to the M2 line was that folks who work with 3D, animation, 8K content, and 4K content over an hour long should seriously consider the M2 Max model. The M2 Pro should be fine for everyone else (unless money is really no object).
The difference was not as stark in Xcode performance. The M2 Max completed the Xcode Benchmark just over three seconds faster than the M2 Pro. That’s a difference, sure, but it’s not nearly as wide as, say, the gap between the M2 Pro and the M1 Max. I would not expect the M2 Max to be a necessity for lighter tasks like web design.
I know there are many, many tasks people might want to use these MacBooks for that are not covered here. In general, the best way to approach this decision is to figure out: A. how much of your workload leverages heavy graphics and B. whether you need to get that work done with every ounce of speed you can get.
Oh, and I don’t imagine that any of you were considering this, but just in case you were — no, it does not make a ton of sense to upgrade to these M2 models from the M1 line. The M1 and M2 MacBook Pros were expensive and fast machines, and they should still have quite a bit of life in them. But if you are still on an Intel MacBook Pro and didn’t spring for the M1 series last time around, either the M2 Pro or Max MacBook Pro will provide a considerable jump up in performance, thermals, and battery life.
How We Test SSDs
Upgrading your desktop or laptop to a solid-state storage solution—whether that's a traditional 2.5-inch drive or a cutting-edge M.2 one—is a quick, often inexpensive way of adding some much-needed performance to an aging system. By installing a solid-state drive (SSD) in your desktop or laptop, you can drastically reduce the amount of time files, applications, and even operating systems take to load, install, or copy versus older platter-based hard drives. As long as you have the slots, ports, or bays necessary, the amount of movies, photos, and games you can shuttle onto or off of one machine is almost limitless.
To make sure you always get the best bang for your storage buck, we here at PC Labs have developed an exhaustive testing suite. A mix of industry-standard tests, "trace-based" measures (more on what that means in a moment), and home-cooked trials, it runs each drive we review through a series of real-world and synthetic scenarios to help us determine which drives are the fastest, which are the slowest, and who falls in between.
Mind you, with SSDs, speed isn't everything. We also evaluate drives on the basis of value for money and additional features, such as warranty, durability ratings, and supplementary software. But SSDs have become so good these days that sometimes it's subtle things that separate an average drive from a winner.
The Testbeds: The Systems We Rely On
Depending on the bus architecture (PCI Express vs. SATA) and connection protocol (M.2 or 2.5-inch for internal SSDs; USB or Thunderbolt for external SSDs), we test any drive that comes through the labs on a certain single testbed, or pair of testbeds, among three testbed systems.
PCIe 3.0-Based M.2 Internal SSDs; Serial ATA 2.5-Inch or M.2 Drives
These drives are tested on our main Windows-based storage testbed. This is a resolutely high-end PC circa 2020. It is equipped with an Asus Prime X299 Deluxe motherboard with an Intel Core i9-10980XE processor clocked for a max boost frequency of 4.6GHz. We use 16GB of DDR4 Corsair Dominator RAM clocked to 3,600MHz, and the system is using an Nvidia GeForce RTX discrete graphics card to power video. This PC represents a state-of-the-art high-end desktop configuration, with an SSD boot drive as the primary drive and the drive being tested configured as supplemental storage.
M.2 drives on this system are installed in a secondary M.2 slot below the video card and configured as secondary storage. (The X299 motherboard we use supports both PCI Express M.2 and SATA M.2 drives.) Traditional 2.5-inch SSDs are installed on the first SATA port powered by the motherboard's main SATA controller, and installed in a 2.5-inch bay.
PCIe 4.0-Based Internal SSDs
PCI Express 4.0 M.2 SSDs offer higher potential sequential-throughput speeds than PCI Express 3.0 ones. PCI Express 4.0 support is generally available only on late-model AMD-based systems from the X570/B550/TRX40 chipset period onward (using Ryzen 3000 series CPUs or later), and Intel systems with Z490 or more recent motherboards (using 11th Generation "Rocket Lake" CPUs or later). All PCIe 4.0 SSDs are M.2 drives. You can use a PCI Express 4.0 SSD in a 3.0-only motherboard, but it will bounce down to 3.0 speeds.
As a result, to test the speed potential of these drives, we needed a different testbed from our X299 build. This more recent testbed uses an MSI Godlike X570 motherboard with an AMD Ryzen 9 3950X CPU installed. We use the same 16GB of DDR4 Corsair Dominator RAM clocked to 3,600MHz, and the system employs an Nvidia GeForce RTX discrete graphics card, too.
PCI Express 4.0 speedsters tend to generate a lot of heat. If a drive has its own heatsink, we test it with the sink in place. If it lacks a heatsink, or just has a basic heat spreader, we test it using the testbed motherboard's own heat sink.
As of early 2023, companies are showing off their first consumer-level SSDs that support the latest standard, PCI Express 5.0—which offers maximum theoretical sequential read speeds in excess of 15,000MBps, about double the PCI Express 4.0 max. We will be upgrading our testbed and protocols to support the testing of PCI Express 5.0 SSDs in the near future.
External SSDs
We use two testbeds here. The first is the same system as our PCI Express 3.0 testbed (Asus Prime X299 Deluxe motherboard, Intel Core i9-10980XE processor, 16GB of DDR4 Corsair RAM, Nvidia GeForce RTX card). Depending on whether an external drive meets the near-ubiquitous USB 3.2 Gen 2 standard or supports the high-speed USB 3.2 Gen 2x2 version of USB, it is tested either attached to this motherboard's sole USB 3.2 Gen 2 USB Type-C port (a 10Gbps port) or to the 20Gbps USB-C port on a USB 3.2 Gen 2x2 expansion card, made by Orico, that is attached to our main storage testbed.
After we've run the tests defined below for external drives, we then format the drive to exFAT and run a couple of supplemental tests on a 2016 Apple MacBook Pro, testing over Thunderbolt 3 (if applicable) or USB Type-C. If the drive is a Thunderbolt 3-only drive, we run just the MacBook-based tests.
The Benchmarks: Internal SSDs
Here is a breakdown of the benchmark set we run on internal drives, whether M.2 "gumstick" drives or conventional 2.5-inch SATA internals. The drives are secure-erased between each run of the different tests.
PCMark 10 Storage
The main PCMark 10 Storage test from UL is an invaluable cutting-edge measure, providing a high-level view of how the drive will function under various everyday workloads, such as word processing and videoconferencing.
For internal SSDs, we first run the drives through the PCMark 10 Full System Drive benchmark, which simulates 23 different "traces" (simulated tasks) in the course of the run. The traces flex the drive in ways that approximate launching Adobe-based creative programs, booting up Windows 10, copying files, launching popular games, and more.
The overall score that PCMark 10 reports back represents how well a drive does throughout the entire PCMark 10 run. This score is the official score presented by UL's software at the end of each run. This score reflects a weighted average of the various activities that the PCMark 10 storage test simulates, a general indicator of how consistently a drive can perform through the 23 different usage scenarios.
It's a proprietary number, though, and is meaningful only when compared with scores of other, competing drives. That is where our reviews come in.
Getting Granular: Booting Windows 10 (PCMark 10 Trace)
We also dig into the more granular trace data that PCMark 10 presents. The first part of it we report is culled from the Windows 10 boot trace, which simulates a full operating-system startup procedure. The throughput number we report reflects how quickly the drive is able to feed the data required for that task set.
This and the following three PCMark 10-derived, trace-based tests represent a simulation of how quickly a drive is capable of feeding data when launching a particular program, copying files, or, in this case, booting Windows 10. PCMark 10 records how many megabytes per second the drive is reading what are known as "shallow-queue 4K random" blocks of data (i.e., of the kind in which most applications, games, or operating systems are stored). While UL recommends using the overall "read/write MBps bandwidth" metric in these tests, we dug a bit deeper to include only random 4K bandwidth in order to paint what we believe is a more specific picture of how well a drive can perform in these tasks.
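As a rough illustration of what a shallow-queue 4K random-read measurement looks like, the sketch below issues single 4KB reads at random offsets, one at a time, and reports megabytes per second. It is not the trace engine PCMark 10 uses; the file path is a hypothetical placeholder, and operating-system caching will inflate the figure unless the test file is much larger than RAM.

```python
import os
import random
import time

def random_4k_read_mbps(path, reads=20000, block=4096):
    """Single-threaded, queue-depth-1 random 4K reads; returns MB per second.

    A crude stand-in for the shallow-queue random-read traffic described above;
    real trace-based benchmarks replay recorded I/O patterns instead.
    """
    size = os.path.getsize(path)
    last_block = max(size // block - 1, 0)
    fd = os.open(path, os.O_RDONLY)  # note: OS caching can inflate the result
    try:
        start = time.perf_counter()
        for _ in range(reads):
            os.lseek(fd, random.randint(0, last_block) * block, os.SEEK_SET)
            os.read(fd, block)
        elapsed = time.perf_counter() - start
    finally:
        os.close(fd)
    return (reads * block) / (1024 * 1024) / elapsed

# Usage (hypothetical test file):
# print(random_4k_read_mbps("testfile.bin"))
```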
Game Launching Tests (PCMark 10 Trace)
Next we report data from PCMark 10's traces around game launching. This again reflects how quickly a drive can read shallow-depth small random 4K packages. Note that the "4K" we're talking about here is file-block size, not file size; 4K is one of the more commonly used file-block sizes for game installations, though that composition does depend on the title you're playing.
While the three games tested in PCMark 10 are stored primarily in small random 4K, tests from around the web have shown that MMORPGs can more often use the 16K block size, and some games in other genres may tend to employ larger block sizes, from 32K up to 128K. However, for the sake of these tests, 4K small random read is the most accurate block-size metric relevant to these three popular FPS titles: Battlefield 5, Overwatch, and Call of Duty: Black Ops 4. We again report the read throughput for this kind of file.
Adobe Launching Tests (PCMark 10 Trace)
Next is the set of results based on traces simulating Adobe-application launches. As anyone who works regularly in programs like Adobe Premiere or Photoshop can tell you, a constant pinch point is the time it takes for these programs to launch.
Mind you, our results don't tell the whole story of how a drive will perform for all creative applications. Depending on the complexity of your work and the number of elements in a scene, your software may have to load 3D models, sound files, physics elements, and more; in other words, more than just the program. Still, this is nonetheless interesting fodder for folks who live and breathe these Adobe apps.
Copy Tests (PCMark 10 Trace)
Finally in PCMark 10, we report on PCMark 10's traces that simulate file-copy actions. While at first these numbers might look low compared to the straight sequential-throughput numbers achieved in benchmarks like Crystal DiskMark and AS-SSD (below), that's due to the way this score is calculated and the nature of and differences between the source data. If you're regularly moving files around on your drive from one folder to another, this test is a handy relative throughput measure.
Crystal DiskMark 6
Beyond PCMark 10, we also use the venerable Crystal DiskMark utility for a second opinion on throughput. Crystal DiskMark's sequential-read tests measure read/write activity with data written in a large contiguous block on the drive, which is similar to how manufacturers themselves test drives to advertise their peak performance. These tests represent a "best case," straight-line scenario for file transfers.
We also use Crystal DiskMark's 4K tests to measure random reads/writes, which reflect data activity in which the drive is fetching and writing scattered files and pieces of files across the drive. This is mostly just used as a reality check against the wealth of 4K read data culled from PCMark 10's traces.
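For contrast with the random 4K pattern, a sequential pass simply streams a file from start to finish in large chunks, as in the minimal sketch below. This is only a simplified picture, assuming a pre-made test file at a hypothetical path; Crystal DiskMark's own methodology controls queue depth, thread count, and test-file generation.

```python
import time

def sequential_read_mbps(path, chunk=1024 * 1024):
    """Read a file start to finish in 1MiB chunks and return MB per second.

    A simplified picture of a sequential pass; Crystal DiskMark's own
    methodology (queue depths, threads, generated test files) is more involved.
    """
    total_bytes = 0
    start = time.perf_counter()
    with open(path, "rb", buffering=0) as f:
        while True:
            data = f.read(chunk)
            if not data:
                break
            total_bytes += len(data)
    elapsed = time.perf_counter() - start
    return total_bytes / (1024 * 1024) / elapsed

# Usage (hypothetical test file):
# print(sequential_read_mbps("big_testfile.bin"))
```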
3DMark Storage Gaming Benchmark
Gamers have long relied on 3DMark testing to benchmark their CPUs and GPUs. 3DMark Storage, introduced in late 2021, takes SSDs through their paces in performing a variety of gaming-related functions. It produces an aggregate score, combining traces of tasks from some popular AAA games. These include loading Battlefield V, Call of Duty: Black Ops 4, and Overwatch; recording a 1080p gameplay video at 60fps while playing Overwatch; installing The Outer Worlds from the Epic Games Launcher; saving game progress in The Outer Worlds; and copying the Steam folder for Counter-Strike: Global Offensive.
The Benchmarks: External SSDs
As noted, in testing we attach external SSDs to the native USB 3.2 Gen 2 port on our main Windows 10 testbed, and afterward (if relevant) to a Thunderbolt 3/USB Type-C port on our test MacBook Pro. With the Windows machine, we'll cite if a drive supports Gen 2x2 speeds and is attached instead to the expansion-card USB 3.2 Gen 2x2 port.
PCMark 10 Data Drive Benchmark
We're not done with PCMark 10 quite yet! The Data Drive Benchmark is a solid test to run on any drive you intend to use as a data archive or a backup drive, and typically takes between 10 and 30 minutes to run, depending on the drive type and its connection standard.
Like the PCMark 10 Storage test, it runs through a host of trace-based activities to simulate typical daily drive activities for a secondary drive. The proprietary number it reports back is useful only when compared against other drives' PCMark 10 results.
Crystal DiskMark 6
For external SSDs, we run the Crystal DiskMark 6 test under the same parameters as for internal drives above (sequential read/write, and 4K read/write).
Blackmagic Disk Speed Test
With this and our next test, we move the drive, if compatible, over to our Apple MacBook Pro tester platform and reformat it into exFAT. We use the macOS-only Blackmagic Disk Speed Test app from professional media firm Blackmagic Design (the makers of DaVinci Resolve) to perform this test. It reports back a drive's throughput in megabytes per second. This utility is typically used to discern whether a given drive has enough throughput to play back specific video formats smoothly. But it also returns some useful throughput measurements.
Blackmagic offers both a read score and a write score, which we compare with those of other, similar drives. These scores are useful in discovering the theoretical maximum speed that a drive can achieve.
Folder Transfer Test
The final test for external drives is a drag-and-drop test, also performed on our MacBook Pro. It uses the macOS Finder to copy a 1.23GB test folder full of several different file types from the testbed's internal drive to the external SSD being tested. We hand-time the scores (in seconds).
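A scripted equivalent of that drag-and-drop trial would simply time a recursive folder copy. The sketch below uses Python's shutil in place of the Finder; the source and destination paths are hypothetical.

```python
import shutil
import time
from pathlib import Path

def timed_folder_copy(src, dst):
    """Copy a folder tree and return (seconds, gigabytes copied).

    A scripted stand-in for the hand-timed Finder drag-and-drop described above.
    """
    src, dst = Path(src), Path(dst)
    size_bytes = sum(p.stat().st_size for p in src.rglob("*") if p.is_file())
    start = time.perf_counter()
    shutil.copytree(src, dst)  # dst must not already exist
    elapsed = time.perf_counter() - start
    return elapsed, size_bytes / 1e9

# Usage (hypothetical paths):
# secs, gb = timed_folder_copy("/Users/tester/TestFolder", "/Volumes/ExternalSSD/TestFolder")
# print(f"{gb:.2f}GB copied in {secs:.1f}s")
```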
Best Adobe After Effects alternatives 2023: free and paid VFX software
The best Adobe After Effects alternatives offer powerful visual effects software - whatever your experience and budget.
Long-considered one of the best VFX software solutions on the market, Adobe After Effects delivers Hollywood-grade special effects, motion graphics, and video compositing.
In our review, we found the latest version offers a "wealth of new features designed to make VFX quicker and easier. It’s no wonder After Effects is seen as the go-to video compositor.” For professionals deep within the ecosphere, it also offers seamless integration with Adobe Premiere Pro, our pick for best video editing software on the market. Together, they help add professional-grade post-production polish to any video project.
All that can make it difficult to identify the best alternatives to Adobe After Effects. But the special effects software isn’t without its faults. Some users find the interface tricky to use as a beginner, while others dislike the subscription-only pricing plans.
If you want to swerve the Creative Cloud subscription, or just want to master a new bit of kit, we’ve tested, reviewed, and rated the best Adobe After Effects alternatives for beginners, post-production pros, and creatives on a budget.
The best alternatives to Adobe After Effects
1. HitFilm Pro
Best After Effects alternative for pros
Reasons to buy: editing and VFX in one streamlined app; the biggest and best feature set; over 875 VFX presets
Reasons to avoid: expensive one-off price
HitFilm Pro offers the ability to edit videos and add compositions in a single tool. You won't need to switch programs, like you do between AE and Premiere Pro, leading to smoother, more efficient workflows.
Since you’re working within one app, effects are applied onto a non-linear editing timeline, compared to AE’s layer-based system. This makes it easier to learn for those already used to non-linear editing.
And the two-in-one nature of HitFilm Pro doesn’t mean that the effects features are any less deep. When it comes to VFX toolkits, this is a capable rival to After Effects.
HitFilm Pro’s developers claim it has “the world’s biggest VFX software toolkit”. It lives up to that boast, making it one of the best Adobe After Effects alternatives. Some of the best tools include advanced 2D and 3D motion tracking, world-leading particle effects, an excellent chroma key, and over 875 VFX presets.
Price of admission is admittedly high. The $349 / £308.86 price tag is likely to put off casual and rookie effects artists. That said, this is a one-off fee rather than a subscription. There's also a free demo version available that has all the features and no time limit, but exports are given a watermark.
2. Blackmagic Fusion
Blackmagic Design's Fusion is an in-depth visual effects workspace. And the long list of Hollywood productions it's been used on tells you all you need to know about the strength of its features, which include advanced VR and 3D compositing.
In Fusion’s node-based effects workspace, you can connect various effects and filters together through a visual web of nodes. That pro-level system can appear complex to newcomers. Like After Effects, this free editing software has a steeper learning curve, which takes some time to master. But once you’ve got used to it, it’s a very efficient way to build up combinations of effects.
Fusion is built into the stellar DaVinci Resolve, Blackmagic’s all-in-one post-production program. Like HitFilm Pro, you’re free to edit your project in Resolve’s editing workspace, then click the tabs bar to switch over to Fusion and add effects.
The best thing about this After Effects alternative is that the standard DaVinci Resolve 18 suite is completely free. For some advanced features, such as faster network-based rendering, the premium version, simply called Fusion 18, costs a one-off fee of $295.
3. Apple Motion
Best After Effects alternative for Final Cut Pro editors
Reasons to buy: works great with Final Cut Pro; great title design features; easy to learn
Reasons to avoid: lacks the depth of After Effects; Mac only
Apple Motion, Apple's Mac-only visual effects software, can't match Adobe After Effects in advanced functionality. It's simple, limited, and functional. But it pairs superbly well with Final Cut Pro - you can see how Apple's video editor compares to its top rival in our guide to Adobe Premiere Pro vs Apple Final Cut Pro.
Motion features the silky smooth experience you’d expect from the always aesthetically pleasing Apple. Running on macOS, Motion is optimized for Apple hardware. In operation, the easy-flowing interface makes it one of the best alternatives to After Effects if you want a program that’s by far easier to learn.
Power users will delight in the faster rendering times compared to Adobe’s special effects app.
While it lacks the same depth of functionality, falling short when it comes to advanced compositing, for example, the VFX tool still manages to pack some impressive features. Not to be missed is the very comprehensive text design & animation tools.
Motion is available for an affordable one-off fee of $49.99 / £44.99. This gives you the software plus any future updates for a lower cost than a three-month After Effects subscription.
4. Blender
Best alternative to After Effects for 3D animation
Reasons to buy: open-source; great for 3D animation
Reasons to avoid: less suited to 2D effects
Blender, an open-source, free alternative to After Effects, is tailored to 3D animation. It supports the whole pipeline of 3D modeling, sculpting, animation, compositing, and editing.
Blender’s toolset for designing and animating characters enables a very impressive level of control. However, when it comes to applying visual effects on top of the animations, it’s not the best alternative - After Effects has a depth to it unmatched by Blender.
As a node-based VFX tool, the learning curve is steep and the process slow. It can't handle 2D graphics or elements such as animated text the way AE can. But there is an extremely supportive online community always willing to help with any issues you might face. The official Blender training hub also houses online courses and workshops led by experts.
That’s the only price animators pay for free 3D modeling software that in some ways is a much more powerful alternative to After Effects.
5. Natron
Best open-source Adobe After Effects alternative
Reasons to buy: open-source; good masking and keying tools
Reasons to avoid: poor motion graphics tools; slow development
Natron is a comprehensive open-source VFX program, ideal for innovative creatives on a budget looking for free Adobe After Effects alternatives.
Focusing on 2D and 2.5D effects, Natron delivers professional results through useful features, including powerful keying tools, flexible rotoscoping, and unlimited masks and mattes. Further boosting its toolkit are the wide range of community-created plug-ins available.
However, the node-based visual effects software isn't built to tackle motion graphics or 3D tools. Development has been undeniably sluggish over the past few years, with no recent major updates. It's not as stable as the best Adobe After Effects alternatives like HitFilm Pro and Fusion. Some performance issues have been reported, too.
But it is free. It is powerful. It is a capable open-source alternative to After Effects. If you need a professional-grade, 2D node-based editor, Natron is a great choice.
6. FilmoraPro
Best alternative to After Effects for beginners
Reasons to buy: easy user experience; good for low-end PCs and laptops; ideal for YouTubers and streamers
Reasons to avoid: not industry-grade; arguably too simple
FilmoraPro is one of the best alternatives to After Effects, thanks to a decent set of VFX and motion graphics tools.
Like Wondershare’s consumer-level video editor Filmora - one of the best Adobe Premiere Pro alternatives - FilmoraPro’s drag-and-drop interface is simple and intuitive. You’ll have no problem placing effects on the timeline, and control key A/V elements, from noise reduction to color correction.
That simplicity is deceptive, though. As the name suggests, this is a VFX app designed for professionals. While it’s good for beginners, it’s perfect for intermediate editors and visual effects artists looking to hone their craft.
The quality won’t win any Oscars for professionalism, but it’s a good choice of video editor for YouTubers and indie filmmakers just starting out. Streamers will also get use from FilmoraPro, thanks to the built-in screen recorder.
FilmoraPro is available for a one-time purchase of $149.99 / £159.05 - ideal if you’re looking for After Effects alternatives to avoid paying a subscription. Commercial licenses are also available, starting at $155.88 / £123.97 per user per year.
Hardware requirements are fairly modest for a visual effects tool. Your rig doesn't have to be the best video editing computer to create impressive visual effects, making it one of the best Adobe After Effects alternatives for low-end PCs.
7. Autodesk Smoke
Best After Effects alternative for Mac
Reasons to buy: professional-grade results; visual effects and editing in one app; node-based and timeline-based
Smoke is Mac-only video effects software that blends timeline- and node-based editing for increased productivity and a more well-rounded workflow.
The user-friendly editor lets you add post-production polish. The high-quality kit includes VFX essentials like keying, compositing, color correction, and 3D effects.
Autodesk, the developer behind Smoke, is also responsible for some of the best architecture software. This is a company adept at creating top-quality, professional-grade graphic design tools for SMEs and enterprises.
The cost reflects this. Smoke costs $225 / £246 a month or $1785 / £1968 a year. And where there's Smoke, there's…Flame - with seamless integration with Autodesk's 3D visual effects suite.
As you’d expect from a top-tier VFX app, results are impressive, rivaling any of the best Adobe After Effects alternatives.
Pricey and professional. If it’s in your budget, Smoke could set your world on fire.
Adobe After Effects alternatives: FAQs
Adobe Premiere Pro vs Adobe After Effects: what’s the difference?
Adobe Premiere Pro vs Adobe After Effects isn’t quite the right way to think about these two video post-production tools.
There are some similarities between Adobe’s industry-standard creative programs - with both earning five stars in our hands-on reviews. The tools are incredibly powerful, letting content creators cut clips and add effects with ease, thanks to the intuitive user experience Adobe excels at. And both deliver exceptional professional polish to video projects.
What’s the difference between Premiere Pro and After Effects? Adobe Premiere Pro is video editing software for editors. Adobe After Effects is VFX software, used by designers and artists to add special effects to shot footage.
The two tools are best deployed in different parts of the post-production process. Crucially, though, Adobe has designed Premiere Pro and After Effects to work seamlessly together, creating a more efficient workflow for content creators and film & TV professionals who use multiple Adobe Creative Cloud apps.
How to choose the best Adobe After Effects alternative for you
When choosing which alternative to Adobe After Effects is best, start by determining your VFX skills, and how you’ll use the tool. Professional-grade visual effects software offers industry-standard effects, but beginners and enthusiasts may want to start with a simpler, or less expensive option.
Finding the right tool for the right platform is important - particularly if you favor a particular video editing tool, such as Premiere Pro or Final Cut Pro. You’ll also need to decide if you want the full visual effects suite, or you need high performance in a specific area, such as 3D animation. Don't ignore additional extras and nice-to-haves, like FilmoraPro's screen recording software.
If you’re experienced with the Adobe stack, look for products that mirror After Effects - particularly those with an easy-to-use interface and similar features. It’ll make it easier to get creative with software if you’re already familiar with AE.
Pricing models vary between the best Adobe After Effects alternatives. Creative Cloud subscriptions can be a deal-breaker, so check how you’ll pay. You’ll find similar subscription models, one-off payments, and even free alternatives to After Effects. Select the product that best suits your creative budget.
How we test the best alternatives to Adobe After Effects
When determining the very best Adobe After Effects alternatives, we review each product based on interface, performance on devices, the size of the toolkit, and how users will actually use the VFX software.
After Effects is famously easy and intuitive to use - like most Adobe programs, once you understand the basics, it’s incredibly powerful. Alternatives to AE should feature experiences that are just as simple to use - even with a steep learning curve, we want to make sure users can navigate and create in a way that makes sense.
We also review how well the VFX software runs, and how rendering effects can impact computer performance.
Recognizing that the best After Effects alternatives are often built for different skill levels and purposes, we test the software based on its intended use and audience.
Pricing models are also a factor. A Creative Cloud subscription isn’t for everyone - and where one-off purchases replace subscription models, we’ll note this. When it comes to the best free Adobe After Effects alternatives, the software should be genuinely free, without hidden charges or hamstrung performance.
It's the golden hour on a beach somewhere, and sun-kissed waves are crashing around two surfers as they venture into the ocean. The scene is captured in a captivating aerial photo taken with a drone by 'Jane Eykes', and the image, a cropped version of which you can see above, won a photography contest hosted by Australian photo retailer digiDirect.
But all is not as it seems. It was not Jane Eykes who took the photo, nor is this even a photograph at all, as we know it – it's an entirely AI-generated image created by Australian company Absolutely Ai (under that pseudonym).
Using Midjourney, an image-generation program in the mold of Dall-E, fed with simple prompts such as ‘two surfers, sunrise, beautiful lighting, drone shot, wave crashing’, Absolutely Ai entered the resulting image into the competition as a test of how far AI-generated images have come. Pretty far, it turns out.
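To give a sense of how little is involved, here is a minimal, illustrative sketch of prompt-driven image generation using the open-source diffusers library and a Stable Diffusion checkpoint. This is not how the winning image was produced (Absolutely Ai used Midjourney, which is operated through Discord rather than code), and the model name, step count, and guidance value below are assumptions chosen purely for demonstration.

```python
# Illustrative text-to-image sketch with the open-source diffusers library.
# Not the tool used for the winning image; shown only to demonstrate how far
# a short text prompt can go. Assumes a CUDA-capable GPU and a publicly
# hosted Stable Diffusion checkpoint.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed checkpoint name
    torch_dtype=torch.float16,
).to("cuda")

prompt = "two surfers, sunrise, beautiful lighting, drone shot, wave crashing"
image = pipe(prompt, num_inference_steps=30, guidance_scale=7.5).images[0]
image.save("surfers_sunrise.png")
```

Swapping the prompt string is essentially the entire creative effort; everything else is boilerplate.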
“The surfers in our image never existed. Neither does that particular beach or stretch of ocean,” Absolutely Ai says. “[the image is] made up of an infinite amount of pixels taken from infinite photographs that have been uploaded online over the years by anyone and everyone, and what you’re left with is a new, entirely convincing award winner.”
This story adds a new thread to the snowballing conversation around AI in art. I spoke with Absolutely Ai's founder Jamie Sissons, an award-winning professional photographer – in the documentary genre, ironically – about the significance of AI-generated imagery for the organizers of and entrants to photography competitions, and for the wider photography world.
Just how good are AI-generated images right now?
In the 24 hours after digiDirect shared the image as the winner of its ‘Summer Photo’-themed contest on Instagram, there were plenty of complimentary comments about it. Put simply, no one thought the image was suspect.
Absolutely Ai then publicly confessed its experiment to digiDirect and forwent the prize money, and the story made the news across Australia. Now that it's in the spotlight, the winning image has come under intense scrutiny, especially from photographers. That scrutiny is less about its aesthetic quality – it’s a lovely looking drone shot – but whether it is convincing or not.
“I say it is a convincing image because no one had the reason to think otherwise”, Sissons told me. “There are things that don’t look right with it – I’d say it’s over-saturated, the wave doesn’t quite crash the right way, the run-off, the lines through the waves aren’t quite right. But even when you’re having a good look at it, it’s a convincing image.”
And that's really the point – the vast majority of people have neither the critical eye nor the time and inclination to pore over an image in detail. We swipe our screens, pause for a moment when an image catches our eye, double-tap to like it, then scroll some more.
But this story pushes another button, especially for photographers, because the image should have been spotted as a ‘fake’. After all, this was a photography contest, judged by photography professionals, that awards photographers for their creative endeavors, and the professionals were taken in by an image that took a few text prompts to create (one drawing on a huge pool of photos from almost entirely unknown sources, which is a whole other issue).
And this is only a taste of what's to come. “These are still the very first iterations of what we will see from AI tech,” says Sissons. “A lot of these platforms and apps are still in testing phases, so in a year, two years, five years, who knows what it will look like?”
Should photographers be thinking about AI-generated images?
AI images are not perfect. One known pitfall is hands – how many people do you know with six fingers? And AI can struggle to create a realistic image when the prompts are really specific; Sissons uses 'the queen playing badminton with a polar bear' as an example. Keep the prompts broad, however, and AI is already a frighteningly effective tool.
“For my winning image, the prompts were general and could be portrayed in a million different ways,” Sissons explains. “AI is also great at doing ideas: A lonely person – it will come up with something that really fits the bill. But if you go specific – a lonely child sitting on a bench, it’s raining, the bus is late – the more it will struggle. The wider you keep it, the better the result.”
General ideas presented through images are the bread and butter of marketing and social media for businesses with an insatiable appetite for new content. “There will still be a need for photographers to cover specific ideas, but the work around broader themes in photography is under threat,” adds Sissons. “I’d be thinking if I was one of the big stock libraries.”
Indeed, when it comes to stock libraries, Getty is fighting back, suing Stability AI, the company behind image generator Stable Diffusion, for $1.8 trillion for what it believes to be "brazen" intellectual property theft on a "staggering scale", after its watermark began appearing on Stability AI-generated images.
Man vs machine: the next chapter
Technological advances in photography – think digital photography, Adobe Photoshop and the iPhone – have historically been met with mixed, and often highly emotive, reactions, and it's no different with AI. I contacted the World Photography Organisation for comment about how AI-generated images could impact photography competitions, and received the following statement from Founder and CEO Scott Gray:
“As a medium photography has always been at the forefront: constantly adapting and evolving, it has a singular ability to transform itself and push boundaries. We are interested in photography as an art form, and within the Sony World Photography Awards we have our Creative Categories in the Professional and Open Competitions which welcome photographers to experiment and explore the dynamism of the medium.
“With technological advancements, a wider audience of creators are engaging with lens-based work and we look forward to seeing how this can expand the reach and impact of photography.”
After Absolutely Ai revealed the true nature of its contest-winning image, digiDirect publicly acknowledged that it had indeed mistakenly awarded its prize to an AI-generated image, and chose a new winner.
For future photography contests, the photo retailer will request that winning entrants submit the raw file behind their edited entries, which includes metadata about the camera used, as a guarantee of authenticity – this is already established practice for high-profile contests like the World Photography Organisation's Sony World Photography Awards.
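As a rough illustration of what such a metadata check involves, the sketch below reads basic EXIF fields from a submitted file using the Pillow library. It assumes a JPEG or TIFF export and a hypothetical filename; genuine raw formats (CR2, NEF, ARW) would need a dedicated tool such as exiftool, and EXIF data can be stripped or forged, so this is a deterrent rather than proof of authenticity.

```python
# Rough sketch of an EXIF sanity check on a submitted image.
# Assumes a JPEG/TIFF export and a hypothetical filename; raw files and
# forged metadata are outside the scope of this simple check.
from PIL import Image, ExifTags

def camera_info(path: str) -> dict:
    # Map numeric EXIF tag IDs to human-readable names where possible.
    exif = Image.open(path).getexif()
    tags = {ExifTags.TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}
    return {key: tags.get(key) for key in ("Make", "Model", "DateTime", "Software")}

if __name__ == "__main__":
    info = camera_info("entry.jpg")  # hypothetical submission file
    if not info.get("Make") or not info.get("Model"):
        print("No camera make/model recorded - flag entry for manual review")
    else:
        print(f"Shot on {info['Make']} {info['Model']} at {info['DateTime']}")
```

In practice, organizers pair this kind of automated check with a request for the original raw file, which is far harder to fabricate convincingly.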
Upping the ante, digiDirect announced a new competition that will accept either photographs or AI-generated images. The prize money has been increased, and an expert panel of photographers will judge the entries, without knowing if the submissions have been created by humans using a camera, or artificially. It’s man versus machine – and as a photographer, I know who I’m rooting for.