Category: 3. Business

  • [Interview] The Technologies Bringing Cloud-Level Intelligence to On-Device AI – Samsung Global Newsroom

    In classic science-fiction films, AI was often portrayed as towering computer systems or massive servers. Today, it’s an everyday technology — instantly accessible on the devices people hold in their hands. Samsung Electronics is expanding the use of on-device AI across products such as smartphones and home appliances, enabling AI to run locally without external servers or the cloud for faster, more secure experiences.

    Unlike server-based systems, on-device environments operate under strict memory and computing constraints. As a result, reducing AI model size and maximizing runtime efficiency are essential. To meet this challenge, Samsung Research AI Center is leading work across core technologies — from model compression and runtime software optimization to new architecture development.

    Samsung Newsroom sat down with Dr. MyungJoo Ham, Master at AI Center, Samsung Research, to discuss the future of on-device AI and the optimization technologies that make it possible.

    ▲ Dr. MyungJoo Ham

    The First Step Toward On-Device AI

    At the heart of generative AI — which interprets user language and produces natural responses — are large language models (LLMs). The first step in enabling on-device AI is compressing and optimizing these massive models so they run smoothly on devices such as smartphones.

    “Running a highly advanced model that performs billions of computations directly on a smartphone or laptop would quickly drain the battery, increase heat and slow response times — noticeably degrading the user experience,” said Dr. Ham. “Model compression technology emerged to address these issues.”

    LLMs perform calculations using extremely complex numerical representations. Model compression simplifies these values into more efficient integer formats through a process called quantization. “It’s like compressing a high-resolution photo so the file size shrinks but the visual quality remains nearly the same,” he explained. “For instance, converting 32-bit floating-point calculations to 8-bit or even 4-bit integers significantly reduces memory use and computational load, speeding up response times.”

    ▲ Model compression quantizes model weights to reduce size, increase processing speed and maintain performance.
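    The FP32-to-INT8 conversion Dr. Ham describes can be sketched in a few lines. This is a generic symmetric per-tensor quantizer for illustration only, not Samsung's production pipeline:

```python
import numpy as np

def quantize_int8(weights):
    """Symmetric per-tensor quantization: float32 -> int8 plus one scale factor."""
    scale = float(np.abs(weights).max()) / 127.0  # largest magnitude maps to +/-127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float32 weights from the int8 codes."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((1024, 1024)).astype(np.float32)

q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

print(q.nbytes / w.nbytes)                      # 0.25: int8 is 4x smaller than float32
print(float(np.abs(w - w_hat).max()) <= scale)  # True: rounding error stays within one step
```

    The 4x memory saving is exactly the "compressed photo" effect in the quote: the codes are coarser, but dequantized values stay within half a quantization step of the originals.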

    A drop in numerical precision during quantization can reduce a model’s overall accuracy. To balance speed and model quality, Samsung Research is developing algorithms and tools that closely measure and calibrate performance after compression.

    “The goal of model compression isn’t just to make the model smaller — it’s to keep it fast and accurate,” Dr. Ham said. “Using optimization algorithms, we analyze the model’s loss function during compression and retrain it until its outputs stay close to the original, smoothing out areas with large errors. Because each model weight has a different level of importance, we preserve critical weights with higher precision while compressing less important ones more aggressively. This approach maximizes efficiency without compromising accuracy.”
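    The idea of preserving critical weights at higher precision while compressing the rest more aggressively can be illustrated with a toy mixed-precision pass. Here "importance" is simply weight magnitude, a stand-in for the sensitivity measures a real pipeline would compute from the loss function:

```python
import numpy as np

def mixed_precision_compress(w, keep_frac=0.01):
    """Keep the highest-magnitude weights exact; 4-bit-style quantize the rest.
    Magnitude is only a proxy for the per-weight importance a real pipeline measures."""
    cutoff = np.quantile(np.abs(w), 1.0 - keep_frac)
    important = np.abs(w) >= cutoff                    # top keep_frac of weights

    # int4-style symmetric quantization (-7..7) for the unimportant majority
    scale = float(np.abs(w[~important]).max()) / 7.0
    q = np.clip(np.round(w / scale), -7, 7) * scale

    # exact values where it matters, coarse values everywhere else
    return np.where(important, w, q).astype(np.float32), important

rng = np.random.default_rng(0)
w = rng.standard_normal((512, 512)).astype(np.float32)
w_hat, mask = mixed_precision_compress(w)

print(mask.mean())  # ~0.01: only about 1% of weights kept at full precision
```

    A production scheme would also retrain after compression, as Dr. Ham notes, so the model can absorb the quantization error rather than merely bounding it.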

    Beyond developing model compression technology at the prototype stage, Samsung Research adapts and commercializes it for real-world products such as smartphones and home appliances. “Because every device model has its own memory architecture and computing profile, a general approach can’t deliver cloud-level AI performance,” he said. “Through product-driven research, we’re designing our own compression algorithms to enhance AI experiences users can feel directly in their hands.”

    The Hidden Engine That Drives AI Performance

    Even with a well-compressed model, the user experience ultimately depends on how it runs on the device. Samsung Research is developing an AI runtime engine that optimizes how a device’s memory and computing resources are used during execution.

    “The AI runtime is essentially the model’s engine control unit,” Dr. Ham said. “When a model runs across multiple processors — such as the central processing unit (CPU), graphics processing unit (GPU) and neural processing unit (NPU) — the runtime automatically assigns each operation to the optimal chip and minimizes memory access to boost overall AI performance.”

    The AI runtime also enables larger and more sophisticated models to run at the same speed on the same device. This not only reduces response latency but also improves overall AI quality — delivering more accurate results, smoother conversations and more refined image processing.

    “The biggest bottlenecks in on-device AI are memory bandwidth and storage access speed,” he said. “We’re developing optimization techniques that intelligently balance memory and computation.” For example, loading only the data needed at a given moment, rather than keeping everything in memory, improves efficiency. “Samsung Research now has the capability to run a 30-billion-parameter generative model — typically more than 16 GB in size — on less than 3 GB of memory,” he added.

    ▲ AI runtime software predicts when weight computations occur to minimize memory usage and boost processing speed.
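    Loading only the data needed at a given moment, as described above, is commonly done by memory-mapping the weight file so the operating system pages in each layer on demand. A minimal sketch with toy layer sizes (illustrative, not an actual runtime engine):

```python
import os
import tempfile
import numpy as np

N_LAYERS, D = 8, 256
path = os.path.join(tempfile.mkdtemp(), "weights.bin")

# Write all layer weights to disk once -- this stands in for the model file.
rng = np.random.default_rng(0)
rng.standard_normal((N_LAYERS, D, D)).astype(np.float32).tofile(path)

# Memory-map instead of loading: the OS pages in only the bytes actually
# touched, so resident memory tracks one layer at a time, not the whole model.
weights = np.memmap(path, dtype=np.float32, mode="r", shape=(N_LAYERS, D, D))

def forward(x):
    for i in range(N_LAYERS):
        w = np.asarray(weights[i])   # materialize just this layer's weights
        x = np.maximum(x @ w, 0.0)   # toy ReLU layer
    return x

y = forward(np.ones((1, D), dtype=np.float32))
print(y.shape)  # (1, 256)
```

    Running a 16 GB model in under 3 GB of resident memory, as the article claims, additionally requires predicting which weights are needed next and evicting the rest; this sketch shows only the paging principle.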

    The Next Generation of AI Model Architectures

    Research on AI model architectures — the fundamental blueprints of AI systems — is also well underway.

    “Because on-device environments have limited memory and computing resources, we need to redesign model structures so they run efficiently on the hardware,” said Dr. Ham. “Our architecture research focuses on creating models that maximize hardware efficiency.” In short, the goal is to build device-friendly architectures from the ground up to ensure the model and the device’s hardware work in harmony from the start.

    Training LLMs requires significant time and cost, and a poorly designed model structure can drive those costs even higher. To minimize inefficiencies, Samsung Research evaluates hardware performance in advance and designs optimized architectures before training begins. “In the era of on-device AI, the key competitive edge is how much efficiency you can extract from the same hardware resources,” he said. “Our goal is to achieve the highest level of intelligence within the smallest possible chip — that’s the technical direction we’re pursuing.”

    Today, most LLMs rely on the transformer architecture. Transformers analyze an entire sentence at once to determine relationships between words, a method that excels at understanding context but has a key limitation — computational demands rise sharply as sentences get longer. “We’re exploring a wide range of approaches to overcome these constraints, evaluating each one based on how efficiently it can operate in real device environments,” Dr. Ham explained. “We’re focused not just on improving existing methods but on developing the next generation of architectures built on entirely new methodologies.”

    ▲ Architecture optimization research transfers knowledge from a large model to a smaller one, improving computational efficiency while maintaining performance.
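    One established way to transfer knowledge from a large model to a smaller one, as the caption describes, is knowledge distillation: training the small model to match the large model's softened output distribution. A minimal sketch of the distillation loss (a generic formulation, not Samsung's specific method):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions.
    Minimizing this pulls the small model's outputs toward the large one's."""
    p = softmax(teacher_logits, T)  # soft targets from the large "teacher"
    q = softmax(student_logits, T)
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean())

rng = np.random.default_rng(0)
teacher = rng.standard_normal((32, 100))
near = teacher + 0.05 * rng.standard_normal((32, 100))  # student close to teacher
far = rng.standard_normal((32, 100))                    # unrelated student

print(distillation_loss(near, teacher) < distillation_loss(far, teacher))  # True
```

    The loss is zero when student and teacher agree exactly and grows as their predicted distributions diverge, which is what lets a compact architecture inherit behavior from a much larger one.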

    The Road Ahead for On-Device AI

    What is the most critical challenge for the future of on-device AI? “Achieving cloud-level performance directly on the device,” Dr. Ham said. To make this possible, model optimization and hardware efficiency work closely together to deliver fast, accurate AI — even without a network connection. “Improving speed, accuracy and power efficiency at the same time will become even more important,” he added.

    Advancements in on-device AI are enabling users to enjoy fast, secure and highly personalized AI experiences — anytime, anywhere. “AI will become better at learning in real time on the device and adapting to each user’s environment,” said Dr. Ham. “The future lies in delivering natural, individualized services while safeguarding data privacy.”

    Samsung is pushing the boundaries to deliver more advanced experiences powered by optimized on-device AI. Through these efforts, the company aims to provide even more remarkable and seamless user experiences.

  • Hyundai CRATER Concept Makes Global Debut at Automobility LA 2025

    Design Highlights | Art of Steel

    The Art of Steel exterior design language transforms the strength and flexibility of steel into a language of sculptural beauty. Inspired by Hyundai Motor’s advanced steel technologies, the material’s natural formability reveals flowing volumes and precise lines that evoke the distinctive aesthetic quality of steel — powerful, gentle and timeless.

    Exterior Design Theme: The Impact of Adventure

    CRATER Concept’s exterior design was guided by a clear goal: to shape a rugged and capable form that reflects the landscapes that inspired it. This informed every detail — from the chiseled bodysides to the bold skid plates — resulting in a concept that visually communicates strength, resilience, and purpose.

    Compact Concept’s Proportions

    CRATER Concept’s proportions reflect an adventurous spirit. Built on a compact monocoque architecture, CRATER Concept has been designed to go anywhere.

    Adventurous Silhouette

    CRATER Concept is highlighted by its bold silhouette, complemented by its steep approach and departure angles which support serious off-road exploration.

    Hexagonal Faceted Wheels

    CRATER Concept’s 18-inch wheels were inspired by envisioning a hexagonal asteroid impacting a sheer metal landscape, leaving a fractal crater in its aftermath. The design evokes an off-road spirit, blending ruggedness with precision. The wheels are clad in generous 33-inch off-road tires, enabling superior traction and ground clearance for performance in all environments.

    Wide Skid Plate

    A wide, functional skid plate stretches across CRATER Concept’s underbody, not only for added protection, but to visually anchor the vehicle. Its sheer surface and robust form express protection and capability.


  • Intuit expects quarterly revenue growth above estimates on strong financial tools demand

    Nov 20 (Reuters) – Intuit (INTU.O) forecast second-quarter revenue growth above Wall Street estimates on Thursday, a sign of growing demand for its artificial intelligence-powered financial management tools.

    Shares of the company rose around 3% in extended trading.

    The company, which offers products such as tax-preparation software TurboTax, finance portal Credit Karma and accounting tool QuickBooks, is benefiting as customers increasingly seek personalized financial guidance and automated solutions for tasks such as bookkeeping.

    On Tuesday, Intuit signed a multi-year deal worth more than $100 million with OpenAI to use the ChatGPT maker’s AI models to power the company’s AI agents.

    The integration of Intuit apps within ChatGPT will involve “no revenue share”, and customer data privacy and security principles will remain unchanged, CEO Sasan Goodarzi said on the post-earnings call.

    Earlier in the day, the company named ServiceNow (NOW.N) CEO Bill McDermott and Nasdaq (NDAQ.O) CEO Adena Friedman to its board, effective August 2026, while Goodarzi is set to become board chair on January 22, 2026.

    Intuit forecast revenue growth of about 14% to 15% for the second quarter ending January 31, above analysts’ average estimate of 12.8% growth, according to data compiled by LSEG.

    However, its adjusted earnings per share outlook of $3.63 to $3.68 for the quarter fell short of the estimated $3.83.

    Revenue for the first quarter rose 18% to $3.89 billion, handily beating estimates of $3.76 billion.

    Adjusted EPS of $3.34 also exceeded estimates of $3.09 for the quarter ended October 31.

    “We are confident in delivering double-digit revenue growth and expanding margin this year, and we are reiterating our full-year guidance for fiscal 2026,” finance chief Sandeep Aujla said.

    The board also approved a quarterly dividend of $1.20 per share, a 15% increase from a year ago.

    Reporting by Jaspreet Singh in Bengaluru; Editing by Vijay Kishore

  • CAST Research Fellow Dr. Qi Zhang is transforming pediatric heart care through AI innovation – News

    Dr. Qi Zhang is a 2025–26 CAST Research Fellow and an associate professor in the School of Information Technology, leading transformative research to improve pediatric heart diagnostics using artificial intelligence. His work focuses on an AI-assisted system that supports quicker and more equitable pediatric cardiac care.  

    “Most AI tools in cardiology are built for adults, yet children’s hearts develop differently and deserve their own technologies. I was drawn to the challenge of designing AI that truly serves young patients,” Zhang said. “Seeing how early diagnosis shapes a child’s life makes this work deeply personal for me.” 

    Project overview 

    The project centers on secure, online analysis of heart data: it turns complex ECG readings into clear, shareable insights that support better and more equitable care. 

    Zhang said that partnerships with OSF HealthCare and Illinois State University keep the project grounded: OSF provides pediatric data and medical expertise, while Illinois State supplies high-performance computing, AI expertise, and student researchers who advance the technical work. 

    “Together, we bridge the gap between engineering and medicine, translating laboratory research into practical tools that can improve patient care,” Zhang said. 

    Student involvement 

    Zhang mentioned that Illinois State students are integral to the project’s success. Under his mentorship, they contribute to his research by developing code modules, analyzing cardiac data, and designing visualization tools that make complex medical information easier to interpret. 

    “Students’ participation not only strengthens the project’s outcomes but also provides them with valuable, hands-on experience in applying artificial intelligence to real-world health care challenges,” Zhang noted.

    Challenges and solutions 

    According to Zhang, developing AI and telemedicine tools for children presents unique challenges in terms of data accuracy and secure data sharing. Zhang and his team focus on refining data quality through collaboration with clinicians and implementing encrypted communication systems that ensure patient privacy. 

    Moreover, their work also addresses health care inequality. Many rural families live hours away from pediatric specialists. The AI system will enable local doctors to securely share ECG data with experts, thereby helping to shorten diagnosis times and improve outcomes. “Our goal is to make advanced cardiac care more accessible to every child, regardless of where they live,” Zhang said.

    Zhang’s team plans to expand the system to include real-time heart monitoring and early risk prediction for pediatric heart conditions. Future phases will integrate wearable sensors and imaging data to build a broader platform that supports both diagnosis and clinician training. 

    Support from CAST 

    According to Zhang, the College of Applied Science and Technology research fellowship and internal funding have given him the time and support to innovate while mentoring students who share a passion for applying AI for social good.

    “The support from CAST has allowed me to turn technical concepts into meaningful initiatives that benefit both patients and students,” Zhang said. 

    Through innovation, collaboration, and mentorship, Zhang’s research reflects his commitment to advancing technology and improving lives by transforming the future of pediatric cardiology.

    Learn more about the College of Applied Science and Technology.

  • Smarter AI processing, cleaner air | UCR News

    As artificial intelligence becomes more powerful and widespread, so does the environmental cost of running it.

    Behind every chatbot, image generator, and television streaming recommendation are massive banks of millions of computers housed in an increasing number of data centers that consume staggering amounts of electricity and water to keep their machines cool. Most of that electricity is still produced by fossil fuel-burning power plants, which contribute directly to air pollution and climate change.

    Mihri and Cengiz Ozkan

    A study from UC Riverside’s Marlan and Rosemary Bourns College of Engineering, however, proposes a solution to this growing problem. It outlines a method to dramatically reduce the pollution caused by AI processing in large data centers—while also extending the life of the hardware doing the work. No existing system combines these two goals, say the authors, professors Mihri Ozkan and Cengiz Ozkan. 

    While other strategies focus mainly on scheduling computing tasks when or where electricity is cleaner, the proposed system goes further. Called Federated Carbon Intelligence, or FCI, it integrates environmental awareness with real-time assessments of the condition of the servers in use. The goal is not just to minimize carbon emissions but also to reduce the stress and wear and tear on the machines that generate the pollution.

    The researchers, who are married, backed their proposal with simulations. Their modeling showed that FCI could reduce carbon dioxide emissions by up to 45 percent over a five-year period. The system could also extend the operational life of a server fleet by 1.6 years.

    “Our results show that sustainability in AI cannot be achieved by focusing on clean energy alone,” said Mihri Ozkan, professor of electrical and computer engineering. “AI systems age, they heat up, and their efficiency changes over time—and these shifts have a measurable carbon cost.

    “By integrating real-time hardware health with carbon-intensity data, our framework learns how to route AI workloads in a way that cuts emissions while protecting the long-term reliability of the machines themselves.”

    By constantly monitoring the temperature, age, and physical wear of servers, FCI helps avoid overworking machines that are already stressed or nearing the end of their useful life. In doing so, it prevents costly breakdowns, reduces the need for energy and water-intensive cooling, and keeps servers running longer. 

    This approach recognizes that sustainability isn’t just about cleaner energy. It’s also about getting the most out of the hardware we already have, the authors say.

    Their system further accounts for the complete lifecycle carbon footprint of computing—especially the embodied emissions from manufacturing new servers. By keeping existing machines in service longer and distributing computing tasks in a way that balances performance, wear, and environmental impact, the system addresses both sides of the sustainability equation.

    “We reduce operational emissions in real time, but we also slow down hardware degradation,” said Cengiz Ozkan, professor of mechanical engineering. “By preventing unnecessary wear, we reduce not only the energy used today but also the environmental footprint of tomorrow’s hardware production.”

    FCI dynamically determines where and when to process AI tasks based on constantly updated data. It tracks the condition of the machines, gauges the carbon intensity of electricity at any given time and place, and evaluates the demands of each AI workload. Then, using that information, it makes real-time decisions to send the task to the server best suited to handle it—with the least impact on the planet and the machine.
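    A toy version of such carbon- and health-aware dispatch might score each server on grid carbon intensity plus wear signals and route each task to the lowest score. The scoring weights and normalizations below are invented for illustration; the published FCI framework uses richer telemetry and learned policies:

```python
from dataclasses import dataclass

@dataclass
class Server:
    name: str
    carbon_intensity: float  # gCO2/kWh of the local grid right now
    temperature_c: float     # current operating temperature
    age_years: float
    utilization: float       # 0..1

def dispatch(servers, w_carbon=1.0, w_health=1.0):
    """Route the next workload to the server with the lowest combined
    carbon + wear score (illustrative heuristic, not the FCI algorithm)."""
    def score(s):
        wear = s.utilization + s.temperature_c / 100.0 + s.age_years / 10.0
        return w_carbon * s.carbon_intensity / 500.0 + w_health * wear
    return min(servers, key=score)

fleet = [
    Server("coal-grid", carbon_intensity=700.0, temperature_c=60.0,
           age_years=4.0, utilization=0.8),
    Server("hydro-grid", carbon_intensity=30.0, temperature_c=40.0,
           age_years=1.0, utilization=0.3),
]
print(dispatch(fleet).name)  # hydro-grid
```

    Adjusting the two weights trades off emissions against hardware stress, which mirrors the paper's claim that clean energy and fleet longevity must be optimized together.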

    Deploying such systems—driven by AI models—could represent a major advancement for both environmental sustainability and the cloud computing industry, the researchers said.

    Establishing the adaptive framework would not require new equipment, just smarter coordination across the systems already in place, Mihri Ozkan said.

    Published in the journal MRS Energy and Sustainability, the study is titled “Federated carbon intelligence for sustainable AI: Real-time optimization across heterogeneous hardware fleets.”

    The researchers say the next step is partnering with cloud providers to test FCI in real data centers, a move that could lay the foundation for net-zero-aligned AI infrastructure worldwide. The Ozkans described an urgent need: the growing fleet of data centers already consumes more power than entire countries, including Sweden.

    “AI is expanding faster than the energy systems that support it,” Cengiz Ozkan said. “Frameworks like ours show that climate-aligned computing is achievable—without sacrificing performance.” 

     

  • Dr. Lisa Su, AMD Chair and CEO, Elected Chair of Semiconductor Industry Association

    Thursday, Nov 20, 2025, 5:00pm

    by Semiconductor Industry Association

    WASHINGTON—November 20, 2025—The Semiconductor Industry Association (SIA) today announced AMD Chair and CEO Dr. Lisa Su has been elected Chair of the SIA Board of Directors. SIA represents 99% of the U.S. semiconductor industry by revenue and nearly two-thirds of non-U.S. chip firms.

    “We are delighted to welcome Dr. Lisa Su as SIA Chair during an exciting and consequential time for the semiconductor industry,” said SIA President and CEO John Neuffer. “Lisa has pushed the boundaries of semiconductor innovation for decades and is an extremely strong and influential leader in our industry. We look forward to her leadership in the year ahead as we push for policies that promote growth and innovation in the chip sector and keep America on top in this foundational, transformative technology.”

    Dr. Su brings more than 30 years of experience in the semiconductor industry. As Chair and CEO of AMD, she has led the company’s transformation into a global leader in high performance computing and a key supplier of advanced AI chips. Before assuming her current role, Dr. Su served as chief operating officer at AMD, where she unified AMD’s business units, sales, operations, and infrastructure into a single organization focused on execution and market impact. Prior to her roles at AMD, Dr. Su held leadership roles with Freescale Semiconductor (now NXP Semiconductors), IBM, and Texas Instruments. Dr. Su holds a PhD in electrical engineering from the Massachusetts Institute of Technology, and in 2020 received the Robert N. Noyce Award for her groundbreaking contributions to the semiconductor industry.

    “The semiconductor industry is at the heart of American innovation and essential to our economic growth and national security,” said Dr. Su. “It’s an honor to serve as Chair of SIA at such an important time. I look forward to working alongside my colleagues on the SIA Board of Directors to strengthen U.S. semiconductor competitiveness, extend our foundation for innovation, and build a stronger chip industry for many years to come.”

    # # #

    About SIA
    The Semiconductor Industry Association (SIA) is the voice of the semiconductor industry, one of America’s top export industries and a key driver of America’s economic strength, national security, and global competitiveness. SIA represents 99% of the U.S. semiconductor industry by revenue and nearly two-thirds of non-U.S. chip firms. Through this coalition, SIA seeks to strengthen leadership of semiconductor manufacturing, design, and research by working with Congress, the Administration, and key industry stakeholders around the world to encourage policies that fuel innovation, propel business, and drive international competition. Learn more at www.semiconductors.org.

  • Fundstrat’s Tom Lee says crypto is a ‘leading indicator’ for U.S. stock prices

  • Carrier to Present at Goldman Sachs 2025 Industrials and Materials Conference

    PALM BEACH GARDENS, Fla., Nov. 20, 2025 /PRNewswire/ — Carrier Global Corporation (NYSE: CARR) Chairman & CEO David Gitlin will speak at the Goldman Sachs Industrials and Materials Conference on Thursday, December 4, 2025, at 8:40 a.m. ET.

    The event will be broadcast live at ir.carrier.com. A webcast replay will be available on the website following the event.

    About Carrier  

    Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating innovations that bring comfort, safety and sustainability to life. Through cutting-edge advancements in climate solutions such as temperature control, air quality and transportation, we improve lives, empower critical industries and ensure the safe transport of food, life-saving medicines and more. Since inventing modern air conditioning in 1902, we have led with purpose: enhancing the lives we live and the world we share. We continue to lead because of our world-class, inclusive workforce that puts the customer at the center of everything we do. For more information, visit corporate.carrier.com or follow Carrier on social media at @Carrier.

    Carrier. For the World We Share.

  • Legal AI startup draws new $50 million Blackstone investment, opens law firm – Reuters

    1. Legal AI startup draws new $50 million Blackstone investment, opens law firm  Reuters
    2. Norm Ai Raises $50M From Blackstone, Launches Law Firm  Law360
    3. Norm Ai Announces $50M Investment, Launch of New Law Firm  Law.com
    4. Norm Ai Raises $50M in Funding  FinSMEs
    5. Legal AI Firm Norm Ai Lands $50 Million Blackstone Investment  PYMNTS.com

  • UK initiative to explore using AI to address antimicrobial resistance

    Europe is seeing an increase in bloodstream infections (BSIs) caused by difficult-to-treat drug-resistant bacteria, according to data published this week by the European Centre for Disease Prevention and Control (ECDC).

    The data from the latest EARS-Net (European Antimicrobial Resistance Surveillance Network) report, which covers 30 European Union/European Economic Area (EU/EEA) countries, show that the estimated total incidence of carbapenem-resistant Klebsiella pneumoniae BSIs rose by 61% from 2019 (the baseline year) through 2024, while the incidence of third-generation cephalosporin-resistant Escherichia coli BSIs increased by 5.9%. 

    The EU has set 2030 target reductions of 5% and 10% for the two pathogens, respectively, but ECDC says it appears unlikely those targets will be met.

    BSIs caused by other bug-drug combinations under EARS-Net surveillance also saw increases, including carbapenem-resistant E coli and vancomycin-resistant Enterococcus faecium. But one bright spot was that incidence of BSIs caused by methicillin-resistant Staphylococcus aureus fell by 20.4% from 2019 levels. As with prior EARS-Net reports, higher rates of antimicrobial resistance (AMR) were reported by countries in southern, central, and eastern Europe.

    Not just a medical issue

    The ECDC estimates AMR causes more than 35,000 deaths a year in EU/EEA countries. The organization attributes the rise in difficult-to-treat infections to an aging and vulnerable population with chronic health issues, cross-border transmission of resistant pathogens, persistent high antibiotic use combined with gaps in infection prevention and control, and a shortage of novel antibiotics.

    “Antimicrobial resistance is not just a medical issue—it’s a societal one,” Diamantas Plachouras, MD, PhD, head of the ECDC’s Antimicrobial Resistance and Healthcare-Associated Infections division, said in a press release. “We must ensure that no one in Europe is left without an effective treatment option.”
