Tag: Amazon Web Services

  • AWS, Microsoft Azure, Google Cloud Now Dominate 66% Of Global Cloud Spending

    New Delhi: As artificial intelligence (AI) becomes a key demand driver for cloud investment, Amazon Web Services (AWS), Microsoft Azure and Google Cloud collectively grew by 24 per cent, accounting for 66 per cent of total spending in the first quarter (January-March) period.

    Tech giant Microsoft outpaced both AWS and Google Cloud, with sales rising 31 per cent year-on-year, nearly double AWS's growth rate of 17 per cent, while Google Cloud grew 28 per cent YoY, according to global market research firm Canalys.

    The report noted that despite holding the largest market share at 31 per cent, AWS faces increasing pressure from its fast-growing rivals. Microsoft Azure was second with a market share of 25 per cent, and Google Cloud was third with 10 per cent in the first quarter of this year.

    Earlier this month, the Amazon-owned company announced the departure of its CEO Adam Selipsky after three years in the role. Matt Garman will become CEO of AWS, effective June 3. Yi Zhang, an analyst at Canalys, said that Microsoft's end-to-end portfolio is proving to be a strong "competitive moat", while Google's strength in AI is giving it a strong tailwind.

    As enterprises embrace AI-driven initiatives, many will need to move their workloads and data to cloud platforms to access the necessary computing and storage capacity, the report noted. Meanwhile, global cloud infrastructure services spending grew by 21 per cent in the first quarter to reach $79.8 billion, an increase of $13.4 billion YoY.

  • TCS Achieves New Milestone; 3.5 lakh Employees Trained In Generative AI Skills

    In 2023, TCS became the first tech company to establish a dedicated business unit focused solely on AI and cloud services.

  • Chances are you haven't used A.I. to plan a vacation. That's about to change

    According to a global survey of more than 5,700 travelers commissioned by Expedia Group, the average traveler spends more than five hours researching a trip and reviews 141 pages of content; for Americans, it's a whopping 277 pages.

    And that's just in the final 45 days before departure.

    Enter generative artificial intelligence, a technology set to simplify that process and allow companies to better tailor recommendations to travelers' specific interests.

    What could that look like? The hope is that AI won't only plan itineraries, but also communicate with hotels, draft travel budgets and even function as a personal travel assistant, fundamentally changing the way companies approach travelers in the process.

    A typical home search on Airbnb, for example, produces results that don't take past searches into account. You may have a decade of booking upscale, contemporary homes under your belt, but you'll likely still be shown rustic, salt-of-the-earth rentals if they match the filters you've set.

    But that could soon change.

    During an earnings call in May, CEO Brian Chesky discussed how AI could change Airbnb's approach. He said: "Instead of asking you questions like: 'Where are you going, and when are you going?' I want us to build a robust profile about you, learn more about you and ask you two bigger and more fundamental questions: Who are you, and what do you want?"

    While AI that delivers the ever-elusive goal of "personalization at scale" isn't here yet, it's the ability to search vast amounts of data, respond to questions asked in natural language and "remember" earlier queries to build on a conversation, the way humans do, that has the travel industry (and many others) sold.

    Travel companies using A.I.

    In a survey conducted in April by the market research firm National Research Group, 61% of respondents said they are open to using conversational AI to plan trips, but only 6% said they actually had.

    Additionally, more than half of respondents (51%) said they didn't trust the tech to protect their personal data, while 33% said they feared it could produce inaccurate results.

    Yet while travelers are still debating the safety and merits of using AI for trip planning, many major travel companies are already diving headfirst into the technology.

    Just look at the names on this list.

    In February, the Singapore-based travel company Trip.com launched TripGen, an in-app chatbot powered by OpenAI, the maker of ChatGPT.

    In March, Expedia and Kayak were among the first batch of plugins rolled out by ChatGPT.

    In April, Expedia launched a beta version of an in-app AI chatbot powered by ChatGPT.

    In May, the Europe-based travel booking company eDreams Odigeo joined Google Cloud's AI "Trusted Testers" program, and Airbnb announced plans to build GPT-4, OpenAI's newest large language model, into its interface.

    A summer explosion of travel A.I.

    Then the summer of 2023 brought a burst of AI travel tech announcements.

    In June:

    Amazon Web Services announced a $100 million investment in a program to help companies use generative AI, with Ryanair and Lonely Planet among the first four companies involved.

    Booking.com rolled out an in-app "Trip Planner" AI chatbot to select U.S. members of its Genius loyalty program.

    Priceline announced a platform called Trip Intelligence, led by a Google-backed generative AI chatbot named "Penny."

    HomeToGo's new "AI Mode" lets travelers find vacation rental homes using natural-language requests.

    Source: HomeToGo

    In July:

    Tripadvisor launched a web-based, AI-powered travel itinerary maker called Trips.

    Trip.com released an updated chatbot called TripGenie, which responds to text and voice requests, displays images and maps, and provides links for bookings.

    The vacation home rental company HomeToGo beta launched an in-app AI search function called "AI Mode" for users in the United States and the United Kingdom.

    Now, more travel companies have ChatGPT plugins, including GetYourGuide, Klook, Turo and Etihad Airways. And a slew of AI-powered trip planners, from Roam Around (for general travel) to AdventureGenie (for recreational vehicles) and Curiosio (for road trips), have added more options to the growing AI trip-planning market.

    Beyond trip planning

    Trip planning is the most visible use of AI in the travel industry right now, but companies are already working on new features.

    Trip.com's senior product director Amy Wei said the company is considering developing a virtual travel guide for its latest AI product, TripGenie.

    "It can help provide information, such as an introduction to historical buildings and objects in a museum," she told CNBC. "The vision is to create a digital travel companion that can understand and converse with the traveler and provide assistance at every step of the journey."

    The travel news site Skift points out that AI can also be used to predict flight delays and help travel companies respond to negative online reviews.

    The company estimates chatbots could bring $1.9 billion in value to the travel industry by allowing companies to operate with leaner customer service teams, freeing up time for humans to focus on complex issues. Chatbots don't need to be hired or trained, can speak multiple languages, and "have no learning curve," as Skift notes in a report titled "Generative AI's Impact on Travel."

    Overall, Skift's report predicts generative AI could be a $28.5 billion opportunity for the travel industry, an estimate that will "look conservative in hindsight" if the tools are used to "their full potential."

  • How Amazon is racing to catch Microsoft and Google in generative A.I. with custom AWS chips

    In an unmarked office building in Austin, Texas, two small rooms contain a handful of Amazon employees designing two types of microchips for training and accelerating generative AI. These custom chips, Inferentia and Trainium, offer AWS customers an alternative to training their large language models on Nvidia GPUs, which have become difficult and expensive to procure.

    "The entire world would like more chips for doing generative AI, whether that's GPUs or whether that's Amazon's own chips that we're designing," Amazon Web Services CEO Adam Selipsky told CNBC in an interview in June. "I think that we're in a better position than anybody else on Earth to supply the capacity that our customers collectively are going to want."

    Yet others have acted faster, and invested more, to capture business from the generative AI boom. When OpenAI launched ChatGPT in November, Microsoft gained widespread attention for hosting the viral chatbot and investing a reported $13 billion in OpenAI. It was quick to add generative AI models to its own products, incorporating them into Bing in February.

    That same month, Google launched its own large language model, Bard, followed by a $300 million investment in OpenAI rival Anthropic.

    It wasn't until April that Amazon announced its own family of large language models, called Titan, along with a service called Bedrock to help developers build software using generative AI.

    "Amazon is not used to chasing markets. Amazon is used to creating markets. And I think for the first time in a long time, they are finding themselves on the back foot and they are working to play catch-up," said Chirag Dekate, VP analyst at Gartner.

    Meta also recently released its own LLM, Llama 2. The open-source ChatGPT rival is now available for people to test on Microsoft's Azure public cloud.

    Chips as ‘true differentiation’

    Eventually, Dekate said, Amazon's custom silicon could give it an edge in generative AI.

    "I think the true differentiation is the technical capabilities that they're bringing to bear," he said. "Because guess what? Microsoft does not have Trainium or Inferentia."

    AWS quietly began production of custom silicon back in 2013 with a piece of specialized hardware called Nitro. It is now the highest-volume AWS chip. Amazon told CNBC there is at least one in every AWS server, with a total of more than 20 million in use.

    In 2015, Amazon bought Israeli chip startup Annapurna Labs. Then in 2018, Amazon launched its Arm-based server chip, Graviton, a rival to x86 CPUs from giants like AMD and Intel.

    "Probably high single-digit to maybe 10% of total server sales are Arm, and a good chunk of those are going to be Amazon. So on the CPU side, they have done quite well," said Stacy Rasgon, senior analyst at Bernstein Research.

    Also in 2018, Amazon announced its AI-focused chips. That came two years after Google announced its first Tensor Processing Unit, or TPU. Microsoft has yet to announce the Athena AI chip it has been working on, reportedly in partnership with AMD.

    CNBC got a behind-the-scenes tour of Amazon's chip lab in Austin, Texas, where Trainium and Inferentia are developed and tested. VP of product Matt Wood explained what both chips are for.

    "Machine learning breaks down into these two different stages. So you train the machine learning models and then you run inference against those trained models," Wood said. "Trainium provides about 50% improvement in terms of price performance relative to any other way of training machine learning models on AWS."

    Trainium first came on the market in 2021, following the 2019 release of Inferentia, which is now on its second generation.

    Inferentia, meanwhile, allows customers "to deliver very, very low-cost, high-throughput, low-latency machine learning inference, which is all the predictions of when you type in a prompt into your generative AI model, that's where all that gets processed to give you the response," Wood said.
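
    To make the two phases Wood describes concrete, here is a minimal, generic sketch in Python using scikit-learn. It is our illustration rather than AWS code, and the dataset and model are placeholders: training is the compute-heavy step chips like Trainium target, while inference is the per-request prediction step chips like Inferentia target.

    # Generic illustration of the two machine learning phases described above.
    # This toy example uses scikit-learn; it is not AWS- or chip-specific code.
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Phase 1: training -- fit model parameters to the training data (the costly step).
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # Phase 2: inference -- run predictions against the trained model for new inputs.
    predictions = model.predict(X_test)
    print(predictions[:5])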

    For now, however, Nvidia's GPUs are still king when it comes to training models. In July, AWS launched new AI acceleration hardware powered by Nvidia H100s.

    "Nvidia chips have a massive software ecosystem that's been built up around them over the last 15 years that nobody else has," Rasgon said. "The big winner from AI right now is Nvidia."

    Amazon's custom chips, from left to right: Inferentia, Trainium and Graviton, shown at Amazon's Seattle headquarters on July 13, 2023.

    Joseph Huerta

    Leveraging cloud dominance

    AWS' cloud dominance, however, is a big differentiator for Amazon.

    "Amazon does not need to win headlines. Amazon already has a really strong cloud installed base. All they need to do is figure out how to enable their existing customers to expand into value-creation motions using generative AI," Dekate said.

    When choosing between Amazon, Google and Microsoft for generative AI, millions of AWS customers may be drawn to Amazon because they are already familiar with it, running other applications and storing their data there.

    "It's a question of speed. How quickly these companies can move to develop these generative AI applications is driven by starting first with the data they have in AWS and using the compute and machine learning tools that we provide," explained Mai-Lan Tomsen Bukovec, VP of technology at AWS.

    AWS is the world's biggest cloud computing provider, with 40% of the market share in 2022, according to technology industry researcher Gartner. Although operating income has been down year-over-year for three quarters in a row, AWS still accounted for 70% of Amazon's overall $7.7 billion operating profit in the second quarter. AWS' operating margins have historically been far wider than those at Google Cloud.

    AWS also has a growing portfolio of developer tools focused on generative AI.

    "Let's rewind the clock even before ChatGPT. It's not like after that happened, we hurried and came up with a plan, because you can't engineer a chip in that quick a time, let alone build a Bedrock service in a matter of two to three months," said Swami Sivasubramanian, AWS' VP of database, analytics and machine learning.

    Bedrock gives AWS customers access to large language models made by Anthropic, Stability AI, AI21 Labs and Amazon's own Titan.

    "We don't believe that one model is going to rule the world, and we want our customers to have the state-of-the-art models from multiple providers because they are going to pick the right tool for the right job," Sivasubramanian said.
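
    As a rough sense of what calling one of those Bedrock-hosted models looks like for a developer, here is a minimal sketch using the AWS SDK for Python (boto3). The region, model identifier and request fields below are illustrative assumptions; exact model IDs and request schemas vary by provider and should be checked against the Bedrock documentation for your account.

    # Minimal sketch of invoking a Bedrock-hosted model with boto3.
    # Assumes Bedrock access is enabled for the account in the chosen region;
    # the model ID and request body shown here are illustrative.
    import json
    import boto3

    client = boto3.client("bedrock-runtime", region_name="us-east-1")

    # Request format for an Anthropic Claude-style model; other providers
    # (AI21 Labs, Stability AI, Amazon Titan) expect their own schemas.
    body = json.dumps({
        "prompt": "\n\nHuman: Summarize what Amazon Bedrock offers in one sentence.\n\nAssistant:",
        "max_tokens_to_sample": 200,
    })

    response = client.invoke_model(
        modelId="anthropic.claude-v2",  # illustrative model identifier
        contentType="application/json",
        accept="application/json",
        body=body,
    )

    # The response body is a stream; decode the JSON payload and print the text.
    result = json.loads(response["body"].read())
    print(result.get("completion", result))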

    An Amazon employee works on custom AI chips, wearing a jacket branded with AWS' Inferentia chip, at the AWS chip lab in Austin, Texas, on July 25, 2023.

    Katie Tarasov

    One of Amazon's newest AI offerings is AWS HealthScribe, a service unveiled in July to help doctors draft patient visit summaries using generative AI. Amazon also has SageMaker, a machine learning hub that offers algorithms, models and more.

    Another big tool is coding companion CodeWhisperer, which Amazon said has enabled developers to complete tasks 57% faster on average. Last year, Microsoft also reported productivity boosts from its own coding companion, GitHub Copilot.

    In June, AWS announced a $100 million generative AI innovation "center."

    "We have so many customers who are saying, 'I want to do generative AI,' but they don't necessarily know what that means for them in the context of their own businesses. And so we're going to bring in solutions architects and engineers and strategists and data scientists to work with them one on one," AWS CEO Selipsky said.

    Although so far AWS has focused largely on tools instead of building a competitor to ChatGPT, a recently leaked internal email shows Amazon CEO Andy Jassy is directly overseeing a new central team building out expansive large language models, too.

    On the second-quarter earnings call, Jassy said a "very significant amount" of AWS business is now driven by AI and the more than 20 machine learning services it offers. Customers include Philips, 3M, Old Mutual and HSBC.

    The explosive growth in AI has come with a flurry of security concerns from companies worried that employees are putting proprietary information into the training data used by public large language models.

    "I can't tell you how many Fortune 500 companies I've talked to who have banned ChatGPT. So with our approach to generative AI and our Bedrock service, anything you do, any model you use through Bedrock, will be in your own isolated virtual private cloud environment. It'll be encrypted, it'll have the same AWS access controls," Selipsky said.

    For now, Amazon is only accelerating its push into generative AI, telling CNBC that "over 100,000" customers are using machine learning on AWS today. Although that's a small percentage of AWS' millions of customers, analysts say that could change.

    "What we are not seeing is enterprises saying, 'Oh, wait a minute, Microsoft is so far ahead in generative AI, let's just go out and switch our infrastructure strategies, migrate everything to Microsoft,'" Dekate said. "If you're already an Amazon customer, chances are you're likely going to explore Amazon ecosystems quite extensively."

    — CNBC's Jordan Novet contributed to this report.

  • Jeff Bezos keeps a 16-year-old framed magazine as a 'reminder' that Amazon's most profitable service was once just a 'risky bet'

    Some people have framed diplomas. Others have framed photos with celebrities. Jeff Bezos has a framed 16-year-old copy of Businessweek magazine.

    On Wednesday, the Amazon founder tweeted a photo of the November 2006 magazine's cover, which featured a photo of Bezos at age 42 behind the text "Amazon's Risky Bet." The cover story was about why Wall Street executives doubted that Amazon Web Services, then a brand-new on-demand cloud computing service, would ever succeed.

    "I have this old 2006 BusinessWeek framed as a reminder," Bezos, now 58, wrote in the tweet. "The 'risky bet' that Wall Street disliked was AWS, which generated revenue of more than $62 billion last year."

    In 2006, Amazon was worth a mere $10 billion, according to Businessweek, and investors and analysts were "losing confidence in Bezos' promises." The article called out Bezos for going on an ill-timed spending "binge," noting that his investments in new technologies like cloud computing were up 52% since January of that year, while Amazon's stock was down 20%.

    In particular, Businessweek deemed Amazon Web Services "Bezos' biggest bet since he and his wife, MacKenzie, drove west in 1994 to seek fame and fortune on the Web."

    Today, the cloud computing platform is known for helping revolutionize the world of online marketplaces, and it is a major factor behind Amazon's current market capitalization of $1.08 trillion, as of Friday afternoon.

    Last year, Amazon Web Services made $62.2 billion in revenue, according to the company's annual filing. An earnings statement earlier this year shows that the platform has been largely responsible for keeping Amazon profitable so far in 2022: AWS made $6.52 billion in operating income during Q1 of 2022, far outpacing Amazon's total operating income of roughly $3.7 billion.

    Businessweek's analysis wasn't entirely wrong. Amazon has built a reputation over the years for making big bets on new technologies, and for using the profits from its successes to subsidize its failures.

    In 2014, Amazon took a $170 million loss on unsold Fire Phones. In 2019, the company closed 87 pop-up stores and shut down its restaurant delivery service. Last year, it discontinued Dash Buttons, one-click devices meant to be mounted around customers' homes for frequent reorders of products.

    The failures don't seem to faze Bezos, who often says that risks, and defeats, are the price of admission to success.

    "We need big failures if we're going to move the needle, billion-dollar-scale failures," Bezos said at Amazon's re:Mars conference in 2019. "And if we're not, we're not swinging hard enough."
