Blog

  • Yuki Bhambri progresses into maiden Grand Slam tennis semi-finals

    India’s Yuki Bhambri moved into his maiden Grand Slam semi-final after a hard-fought win in the men’s doubles quarters at the US Open 2025 on Wednesday.

    Bhambri and his partner, Michael Venus of New Zealand, the 14th seeds, beat the 11th-seeded pairing of Croatia’s Nikola Mektić and the USA’s Rajeev Ram 6-3, 6-7(6-8), 6-3 in a marathon match lasting two hours and 39 minutes in New York.

    Both Mektić and Ram have won Grand Slams in the past, albeit with different partners. Mektić is also one half of the men’s doubles champions from Tokyo 2020 while Ram is a two-time Olympic silver medallist – from Rio 2016 and Paris 2024.

    Yuki Bhambri, India’s top-ranked men’s doubles tennis player at world No. 32, and Venus broke Mektić-Ram’s serve in the fourth game and consolidated to close out the opener.

    The second set saw Bhambri-Venus lose serve in the fourth game before breaking back in the seventh to force a tie-break.

    In the tie-break, the Indo-New Zealand duo failed to convert a match point at 6-5, allowing Mektić-Ram the opportunity to win the second set and take the contest into a decider.

    The final set was a tense affair, with the Indian tennis player and his Kiwi partner earning a break point in the ninth game.

    However, they were made to work hard for the win, saving seven break points before finally converting their match point to move into the semi-finals.

    Bhambri’s charge to the US Open semi-finals doesn’t just mark his first appearance in New York’s last four but also the deepest run of his Grand Slam career.

    He and Michael Venus will take on the sixth-seeded duo of Neal Skupski and Joe Salisbury of Great Britain in the semi-finals on Thursday.

    Joe Salisbury won the US Open men’s doubles title three times in a row, from 2021 to 2023, while Neal Skupski reached the final of the event in 2022.

    Yuki Bhambri is the last remaining Indian challenger at this year’s US Open. Anirudh Chandrasekar and Vijay Sundar Prashanth’s run came to an end with a 6-4, 6-3 defeat to Brazil’s Fernando Romboli and Australia’s John-Patrick Smith.

    Earlier, Indian veteran Rohan Bopanna and his partner Romain Arneodo of Monaco crashed out in the opening round on Saturday. Arjun Kadhe and his partner, Diego Hidalgo of Ecuador, also made a first-round exit.

  • Bosch news in China – Gasgoo

    According to Gasgoo Automotive Research Institute’s rankings of ADAS suppliers in H1 2025, China’s passenger vehicle ADAS component market showed a highly concentrated structure across several key domains, including air suspension system, LiDAR, driving-dedicated ADAS, forward-facing camera, APA solution, HD map, and high precision positioning system.

    At the same time, the strong rise of Chinese suppliers such as BYD, Huawei, and Sunny Smartlead is challenging the traditional dominance of international giants like Bosch and Valeo, highlighting the growing competitiveness of China’s supply chain in both technological innovation and cost efficiency.

    Top air suspension system suppliers

    KH Automotive Technologies: 165,139 sets installed, 37.1% market share

    Tuopu Group: 134,524 sets installed, 30.2% market share

    Baolong Automotive: 91,717 sets installed, 20.6% market share

    Vibracoustic: 40,040 sets installed, 9.0% market share

    Continental: 13,139 sets installed, 3.0% market share

    Others: 457 sets installed, 0.1% market share

    The air suspension system market showed a highly concentrated structure. The top 3 players—KH Automotive Technologies (37.1%), Tuopu Group (30.2%), and Baolong Automotive (20.6%)—collectively secured 87.9% of the market share. KH Automotive Technologies led the market with 165,139 sets, while Baolong Automotive ranked third with 91,717 sets. The landscape reflected a clear tilt toward China’s leading local suppliers, highlighting their rise in critical technologies and underscoring the strong competitiveness of the local supply chain. Overall, the cost and technology iteration advantages of Chinese brands are expected to further accelerate market consolidation.
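    The share and concentration figures above follow directly from the installation counts. As a minimal illustration (supplier names and set counts taken from the list above; the CR3 concentration-ratio calculation is our own, not part of the Gasgoo rankings), the arithmetic can be sketched as:

```python
# Installed base (sets, H1 2025) from the air suspension ranking above.
installed = {
    "KH Automotive Technologies": 165_139,
    "Tuopu Group": 134_524,
    "Baolong Automotive": 91_717,
    "Vibracoustic": 40_040,
    "Continental": 13_139,
    "Others": 457,
}

total = sum(installed.values())

# Each supplier's market share as a percentage of total installations.
share = {name: 100 * sets / total for name, sets in installed.items()}

# CR3: combined share of the three largest suppliers, a standard
# concentration ratio.
cr3 = sum(sorted(share.values(), reverse=True)[:3])

print(f"Total sets: {total}")
print(f"KH Automotive Technologies: {share['KH Automotive Technologies']:.1f}%")
print(f"CR3: {cr3:.1f}%")
```

    Running this reproduces the article's figures: a 37.1% share for the leader and a top-3 concentration of 87.9%.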

    Top LiDAR suppliers

    Huawei Technologies: 400,456 units installed, 40.0% market share

    Hesai Technology: 284,399 units installed, 28.4% market share

    RoboSense: 236,501 units installed, 23.6% market share

    Seyond: 80,570 units installed, 8.0% market share

    Others: 61 units installed, 0.01% market share

    Rankings of ADAS component suppliers in China (H1 2025): Market concentration remains high across multiple segments

    Huawei Technologies, Hesai Technology, and RoboSense together captured 92% of China’s LiDAR market, highlighting the sector’s high concentration. With the top 4 suppliers all being Chinese players, the supply chain advanced rapidly—shifting from early breakthroughs to establishing market dominance in core intelligent sensor technologies.

    The ranking also included blind-spot LiDAR. In terms of primary LiDAR installations, Hesai Technology ranked first.

    Top driving-dedicated ADAS suppliers

    Bosch: 1,067,448 sets installed, 15.3% market share

    BYD: 856,933 sets installed, 12.3% market share

    DENSO: 617,441 sets installed, 8.8% market share

    ZF: 491,731 sets installed, 7.0% market share

    Freetech: 410,220 sets installed, 5.9% market share

    Veoneer: 295,069 sets installed, 4.2% market share

    Huawei: 290,548 sets installed, 4.2% market share

    Valeo: 280,866 sets installed, 4.0% market share

    Tesla: 264,907 sets installed, 3.8% market share

    Aptiv: 243,909 sets installed, 3.5% market share

    According to the rankings above, the market showed a clear concentration at the top. Bosch led by a wide margin with 1,067,448 sets (15.3% market share), followed by BYD with 856,933 sets (12.3%), the two together accounting for nearly 30% of the market. Notably, Huawei entered the top 7 with 290,548 sets (4.2%), highlighting the rapid penetration of tech companies into core ADAS hardware. Overall, while global giants still dominate, Chinese players like BYD and Huawei are gradually reshaping the supply chain landscape.

    Top forward-facing camera suppliers

    Bosch: 1,067,824 sets installed, 15.3% market share

    DENSO: 627,297 sets installed, 9.0% market share

    Sunny Smartlead: 609,187 sets installed, 8.7% market share

    BYD Semiconductor: 594,806 sets installed, 8.5% market share

    ZF: 501,398 sets installed, 7.2% market share

    Freetech: 408,786 sets installed, 5.8% market share

    Baolong: 381,329 sets installed, 5.5% market share

    Veoneer: 329,915 sets installed, 4.7% market share

    Valeo: 269,873 sets installed, 3.9% market share

    Tesla: 264,907 sets installed, 3.8% market share

    Bosch remained the leader in forward-facing camera installations with 1,067,824 sets (15.3% share). China’s local players like Sunny Smartlead (609,187 sets, 8.7%) and BYD Semiconductor (594,806 sets, 8.5%) showed strong growth, underlining the rising competitiveness of Chinese suppliers in core technologies. The top 5 suppliers together accounted for 48.7% of the market, indicating a moderately high level of concentration. The rapid ascent of local manufacturers is reshaping the supply chain, driven by technological innovation, cost efficiency, and localized service capabilities, while also pushing forward-facing cameras toward becoming standard rather than optional features in intelligent vehicle systems.

    Top APA solution suppliers

    BYD: 701,767 sets installed, 19.6% market share

    Bosch: 494,825 sets installed, 13.8% market share

    Valeo: 405,750 sets installed, 11.3% market share

    Huawei: 290,548 sets installed, 8.1% market share

    Li Auto: 208,314 sets installed, 5.8% market share

    XPENG: 178,812 sets installed, 5.0% market share

    TungThih Electronic: 167,635 sets installed, 4.7% market share

    Xiaomi: 158,104 sets installed, 4.4% market share

    Momenta: 128,806 sets installed, 3.6% market share

    Leapmotor: 119,739 sets installed, 3.3% market share

    BYD topped the APA solution market with 701,767 sets installed (19.6% share). International giants Bosch (494,825 sets, 13.8%) and Valeo (405,750 sets, 11.3%) followed, yet local players are rapidly gaining ground. Chinese players BYD, Huawei, Li Auto, and XPENG together held 38.5% of the market. The strong performance of Chinese suppliers in core APA technologies is gradually shifting the supply chain landscape and accelerating the move toward making automated parking a standard feature in new energy vehicles (NEVs). Despite mounting competition and potential pricing pressure, growth driven by Chinese players is set to reinforce the strength and competitiveness of China’s automotive industry.

    Top HD map suppliers

    AutoNavi: 522,694 sets installed, 55.2% market share

    Tencent: 116,922 sets installed, 12.3% market share

    Langge Technology: 107,209 sets installed, 11.3% market share

    NavInfo: 72,138 sets installed, 7.6% market share

    Others: 127,996 sets installed, 13.5% market share

    AutoNavi led the HD map market with 522,694 sets installed, holding a 55.2% share, due to its strong data resources, OEM partnerships, and ongoing technology upgrades. Tencent (116,922 sets, 12.3%), Langge Technology (107,209 sets, 11.3%), and NavInfo (72,138 sets, 7.6%) formed the second tier, together accounting for 31.2% of the market. While smaller suppliers still have room to grow in niche areas, the market is now dominated by the top players, with the four largest controlling 86.4% of installations. This concentration is speeding up technology standardization and reinforcing the leading suppliers’ position, turning HD maps into essential infrastructure for intelligent vehicles rather than just optional features.

    Top suppliers of high precision positioning system

    ASENSING: 1,135,021 sets installed, 56.7% market share

    Huawei: 290,263 sets installed, 14.5% market share

    CHC Navigation: 158,388 sets installed, 7.9% market share

    XPENG: 100,220 sets installed, 5.0% market share

    Qianxun SI: 21,434 sets installed, 1.1% market share

    Others: 295,676 sets installed, 14.8% market share

    ASENSING dominated the high precision positioning system market with 1,135,021 sets installed (56.7% share), highlighting its strong capabilities in multi-sensor fusion technologies. Huawei followed with 290,263 sets (14.5% share). Notably, the top 3 suppliers—ASENSING, Huawei, and CHC Navigation—together secured 79.1% of the market, reflecting a highly concentrated landscape. Looking ahead, as automakers demand higher positioning accuracy and cost pressures intensify, suppliers with advanced multi-sensor fusion algorithms and automotive-grade mass production experience are expected to further squeeze smaller competitors.
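    One conventional way to quantify a "highly concentrated landscape" is the Herfindahl-Hirschman Index (HHI), the sum of squared percentage shares; under the widely used U.S. antitrust thresholds, a value above 2,500 counts as highly concentrated. The HHI is not part of the Gasgoo rankings; the sketch below applies it to the five named positioning suppliers above (so the result is a lower bound, since the "Others" share is an aggregate of unnamed firms):

```python
# Market shares (%) of the named high precision positioning suppliers above.
shares = {
    "ASENSING": 56.7,
    "Huawei": 14.5,
    "CHC Navigation": 7.9,
    "XPENG": 5.0,
    "Qianxun SI": 1.1,
}

# CR3: combined share of the three largest named suppliers.
cr3 = sum(sorted(shares.values(), reverse=True)[:3])

# HHI over named suppliers only (10,000 would be a pure monopoly);
# this slightly understates the true index because "Others" is excluded.
hhi = sum(s ** 2 for s in shares.values())

print(f"CR3: {cr3:.1f}%")
print(f"HHI (lower bound): {hhi:.0f}")
```

    This reproduces the 79.1% top-3 figure cited above and yields an HHI of roughly 3,500 from the named suppliers alone, well past the 2,500 "highly concentrated" threshold, which is consistent with the article's characterization.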

  • Neurologist responds to 35-year-old man wanting him to ‘prescribe aspirin to prevent stroke’ after father’s paralysis

    Dr Sudhir Kumar, a neurologist, took to X on Jun 10, 2023, to share his prescription for a patient ‘who wanted advice regarding starting an aspirin pill, as his father had suffered from a stroke (recently) at age 60’. In the accompanying tweet, Dr Kumar shared details of the man’s case and what he actually prescribed him instead of aspirin.

    Aspirin works by inhibiting the production of certain natural substances that cause fever, swelling, and blood clots, which can make it useful for stroke prevention. (Pixabay)

    Instead of one pill, I prescribed ‘6 pills’

    He said, “A 35-year-old consulted me today, as he wanted me to prescribe an aspirin pill to prevent a stroke. His father, aged 60, had recently suffered from stroke (paralysis), and he was concerned about his higher risk of getting a stroke in future. Instead of one pill (aspirin), I prescribed ‘6 pills’ (mentioned in the recommendations section of my prescription).”

    So what did Dr Kumar actually prescribe to the man, who, at the time, ‘weighed 80 kg, had a BMI (body mass index) of 26.2 with mildly elevated total and LDL cholesterol, normal homocysteine and cardiac evaluation’?

    From 7-8 hours of sleep to 9-10K steps a day

    As per the prescription he shared on X, Dr Kumar advised the man to follow these habits and come back for a ‘review after three months’:

    1. Regular sleep: 7-8 hours a night.

    2. Brisk walking or running: 30-40 minutes a day. Aim for 9-10K steps per day.

    3. Healthy diet: Avoid soft drinks, sugar, and ultra-processed packaged foods. Reduce carb intake and increase fruits (within limits), vegetables, and nuts (a handful/day), poultry, fish, and eggs.

    4. Reduce working hours: from the current 13-14 hours a day to 8-9 hours.

    5. Reduce stress.

    6. Complete abstinence from alcohol.

    Can aspirin prevent a stroke?

    Aspirin is in a group of medications called salicylates. It works by stopping the production of certain natural substances that cause fever, pain, swelling, and blood clots. Sharing details of daily aspirin therapy, Mayo Clinic said that taking an aspirin a day can be a lifesaving option and may lower the risk of heart attack and stroke, but it’s not for everyone.

    Per Mayo Clinic, daily aspirin therapy may be used in two ways:

    ⦿ Primary prevention

    This means that you’ve never had a heart attack or stroke. You’ve never had coronary bypass surgery or coronary angioplasty with stent placement. You’ve never had blocked arteries in your neck, legs or other parts of the body. But you take a daily aspirin to prevent such heart events. The benefit of aspirin for this use has been debated.

    ⦿ Secondary prevention

    This means that you had a heart attack or stroke, or you have known heart or blood vessel disease. You’re taking a daily aspirin to prevent a heart attack or stroke. The benefit of daily aspirin therapy in this situation is well established.

    Note to readers: This report is based on user-generated content from social media. HT.com has not independently verified the claims and does not endorse them.

    This article is for informational purposes only and not a substitute for professional medical advice.

  • Punjab braces for “super flood” as Ravi-Chenab converge, threatening Multan, Muzaffargarh

    ANI |
    Updated:
    Sep 04, 2025 13:05 IST

    Multan [Pakistan], September 4 (ANI): Provincial authorities in Punjab on Wednesday braced for a “super flood” as the convergence of the swollen Ravi and Chenab rivers near Khanewal threatened the districts of Multan and Muzaffargarh, Dawn reported. Officials warned of a “dual threat” despite several controlled breaches over the past week.
    The water level at Muhammadwala and Sher Shah was recorded at 412 feet, only five feet below the critical level. Authorities termed the next 12 hours as critical, with pressure at the breaching points increasing after the convergence of the Ravi and Chenab rivers near Khanewal, Dawn added.
    To protect urban centres along the eastern rivers, the Punjab government has been conducting controlled breaches to relieve pressure on barrages and main embankments, safeguarding densely populated cities. A decision on whether to conduct breaches at Head Muhammadwala, Sher Shah Flood Bund, and Rangpur is expected within hours, with 17 points identified to save Multan and Muzaffargarh.
    The situation is compounded by an enormous surge of approximately 550,000 cusecs that crossed the Marala and Khanki Headworks, recorded passing through Qadirabad Headworks at 530,000 cusecs. Officials projected the surge would reach Trimmu Headworks on Thursday and Multan by Friday, Dawn reported.
    “The next 12 hours are extremely critical. We are facing a dual threat: the existing high water from the confluence of the rivers and a new, massive wave heading directly for us. All resources are being mobilised,” a Provincial Disaster Management Authority (PDMA) official said.
    Railway traffic to Karachi and vice versa was suspended from Faisalabad after a bridge on the Chenab River in Abdul Hakim came under water.
    In a press conference on Wednesday evening, PDMA Director General Irfan Ali Kathia said the flood crisis was set to intensify as all three major Indian dams were expected to reach maximum capacity within 72 hours, worsening an already catastrophic situation in the Punjab river system. “The next 72 hours are critical,” he warned, Dawn reported.

    He confirmed that while the Chenab River’s water level was currently stable, previously affected districts were likely to face renewed flooding. The Sutlej River has remained in flood-like conditions for two months, while the Ravi River showed rising levels at the Jassar monitoring point.
    “Thein Dam is already full and will continue to release water into the Ravi for the next two to three weeks. While the situation in Ravi will not be as severe as before, water levels will definitely increase,” DG Kathia said.
    In an alarming development, the DG explained that instead of merging with the Chenab as expected, the Ravi’s waters are flowing backwards, preventing a decrease in water levels. “Until water levels decrease at Ahmadpur Sial, we will not see any reduction at Sidhnai,” he clarified.
    The press conference followed a personal assessment of the critical Head Muhammadwala site by Punjab CM Maryam Nawaz. DG Kathia said only four to five feet of capacity remained before reaching critical levels. “At Sher Shah Bridge in Multan, there’s significant water pressure with only a two-foot margin remaining. Important decisions regarding a controlled breaching in Multan have already been made to prevent uncontrolled overflow,” he said, Dawn reported.
    Over 3,900 villages and a population exceeding 3.7 million have been affected across Punjab. The death toll has risen to 46, while over 1.4 million residents and one million animals have been relocated to safer locations, Dawn added.
    Relief efforts include 409 flood camps providing essential facilities to around 25,000 displaced persons. In Khanewal and Toba Tek Singh, 136 and 75 villages, respectively, have already been affected, with numbers expected to rise due to renewed surges.
    CM Maryam Nawaz visited flood relief camps in Multan, directing district administrations to assess the damage, ensure clean drinking water, and conduct fumigation and dry germicidal sprays in flood relief camps and tent cities.
    As of 11 pm on Wednesday, Marala Headworks on the Chenab reported a flow of 444,754 cusecs, falling, while Khanki and Qadirabad Headworks held steady with flows of 558,683 and 557,440 cusecs, respectively. Other key monitoring points, including Chiniot Bridge, Head Muhammadwala, Rivaz Bridge, and Trimmu Headworks, showed varying rising or steady trends.
    For the Ravi River, upstream at Jassar, water flow was receding at 82,140 cusecs, while downstream points such as Ravi Syphon and Shahdara showed rising levels. The Sutlej River system remained stable across all monitoring stations, including GS Wala, Sulemanki, Islam, Panjnad Headworks, and Malsi Syphon. (ANI)


  • Security Requirements And Penalties Grow For Chipmakers

    Governments and systems companies are fundamentally changing the rules around semiconductor security, forcing chipmakers and their suppliers to comply with tough new regulations that require resiliency in hardware. Unlike in the past, chips and systems deployed in these markets must be able to respond to threats rather than waiting for the next version of a chip or IP to address vulnerabilities.

    Attached to these regulations are costly penalties, which can range from enormous fines to being frozen out of lucrative markets. And while they vary by region and by market segment, collectively they almost certainly will increase the cost of doing business across the semiconductor supply chain. Underlying these new rules is a recognition that data is becoming more valuable, accessible through nearly everything with a battery or a plug, and the chips that enable the processing, storage, and movement of that data are becoming a bigger target for bad actors looking to steal or control it.

    As a result, hardware vendors need to begin thinking about security at every step of the design-through-manufacturing process, both individually and in the context of larger systems. At risk are different processor types and memories, chiplets, the interconnects between those chiplets, soft IP, and packaging. In the past, these components were generally developed according to a specification, with security layered on top or around those components. In the future, that won’t be enough. Companies may be held liable or barred from doing business if they don’t comply with the new regulations, regardless of whether they meet those specifications.

    Four main sets of regulations affect chipmakers and their suppliers, and all of them are becoming more stringent:

    • The new version of the European Union’s Cyber Resilience Act (CRA) will impose fines of up to 2.5% of a company’s total revenue, or up to €15 million — whichever is greater — for failure to comply with regulations involving hardware and software. This applies to manufacturers, distributors, importers, or anyone else in the supply chain, and it will ban any products or services that do not comply, effective Dec. 11, 2027.
    • OCP SAFE, which is being driven by Google and Microsoft, shifts the security burden to makers of processors and peripherals with updatable software. The goal, as defined by the Open Compute Project, is to ensure the “provenance, code quality, and software supply chain” of firmware releases and patches, essentially putting the onus on third parties rather than the systems companies to fix security issues for whatever they sell.
    • The U.S. National Cybersecurity Strategy Implementation Plan (NCSIP) likewise emphasizes resiliency over static security measures in a far-reaching plan that requires companies working with the U.S. government to support a broad framework of requirements, under the aegis of the National Institute of Standards and Technology (NIST).
    • OCP also has developed a tamper-resistant root-of-trust specification called Caliptra for data center hardware. Supported by Microsoft, Google, AMD, and NVIDIA, it’s aimed at bringing security into heterogeneous systems that may include CPUs, GPUs, NPUs, TPUs, DPUs, and various controller chips.

    While not every company or market segment is directly affected, resiliency will be viewed as an advantage in most markets, and companies that do business in these regulated markets will leverage security as a competitive edge.

    “In the past, security could have been something that’s one and done,” said Maarten Bron, managing director at Keysight Technologies. “You’d launch a product into the market, and it had a certain shelf life, which for IoT devices could be very short. But now, all of a sudden, there is a requirement to keep your finger on the pulse because it’s not static. What’s secure today could be insecure tomorrow, and there’s a requirement not just to notify your customers that something may be wrong, but to fix it. That means a firmware update capability needs to be part of the product, and there has to be a secure way of commissioning those firmware updates.”

    Security experts have been warning about a widening attack surface that includes semiconductors since the early 2000s, but until recently most of the attention had been focused on software. “I’ve been railing for years that if companies didn’t provide secure products, or take the need to protect their products seriously, that eventually they would be boxed in with legislation,” said Mike Borza, principal security technologist and scientist at Synopsys. “Some places want to deregulate, but others will insist there’s security in these products so that it’s not possible for adversaries to continue romping through them and taking control of them at will. The European legislation is serious.”

    Unanswered questions
    These new regulations are just the beginning, and it may take years to sort out exactly how and where legislation will be applied to security. But the underlying concepts are well understood. New vulnerabilities in hardware are sprouting up faster than new chips or systems can be developed to close them. As a result, designs need to include some method of isolating attacks or updating firmware to prevent attackers from holding data hostage or controlling the processing, storage, or movement of data.

    The big question is how to implement hardware security in a cost-effective way. “The scope and purview of these requirements is critical infrastructure, data centers, and those kinds of areas,” said Erik Wood, senior director of product security at Infineon. “If we deliver these standards of care that are well-known, well-adopted, and all the standards bodies, certification labs, and industry organizations are saying, ‘Yeah, that’s the right standard of care for these devices,’ that’s important. But remember, some of these are $2 and $3 devices, where 5% is what the market will bear in security costs. As we’ve seen recently in radio equipment, the outcome has turned into a technical requirement. There’s a harmonized standard with EN 18031 (Europe) as the benchmark radio equipment directive, and with CRA, we’re expecting it to be a harmonized standard that allows us to benchmark our hardware and software against something. But as that does not exist yet, we’re still in the standard of care phase.”

    That standard of care is basically a recipe of best practices, similar to how the software industry has addressed security in the past. When vulnerabilities are found, security patches are developed. But patching hardware can be a lot more difficult than patching software, and the results are not always optimal.

    “With Spectre and Meltdown, we had to provide patches, and everyone at that point seemed to be mandated to provide those patches,” said Nandan Nayampally, chief commercial officer at Baya Systems. “Because they were bypassing caching, the only way to undo that was to kind of flush it at times. It took a performance hit to the tune of 20%.”

    Whether these regulations will work for semiconductors as well as patches for software remains to be seen. Chips are becoming increasingly complicated, particularly at the leading edge of designs where a monolithic planar SoC is being decomposed into chiplets. These are predominantly bespoke multi-die assemblies targeted at specific markets, and they are at the bleeding edge of performance and power management. Adding active security — or more accurately, the ability to update security to address known vulnerabilities — will be unique to each design. Moreover, security takes on a whole new challenge as chips begin to age. Electromigration, time-dependent dielectric breakdown, and functional updates can open entirely new vulnerabilities that were not there in the first place, particularly in AI data centers, where utilization of resources is higher than in other markets.

    “On the one hand, I’m glad standards are being established for systems where security critically informs things like user safety in an automobile,” said Scott Best, senior director for silicon security products at Rambus. “Standards groups are documenting and saying, ‘These are the requirements. If you’re going to deliver security into this system, it needs to achieve this level of performance in a security sense.’ That’s a good thing because safety is critical.”

    But exactly what’s required isn’t entirely clear at this point. “It’s a work in progress, because the EU still has to publish the detailed testing requirements,” said Keysight’s Bron. “There’s a possibility that the introduction will be delayed a little bit because of pushback, but we don’t expect it to be canceled. We think CRA is there for a good reason. It’s going to protect everybody. And yes, it will have an effect on prices. But the current situation is not sustainable with so many attacks going on, so much ransomware, and so many state actors abusing stuff. Still, this won’t be an easy step.”

    A long liability chain
    One of the big changes coming to hardware security involves an increase in the number of suppliers on a bill of materials. While liability generally starts with the system vendor, which acts as the general contractor, it ripples down from there. Think about a multi-die assembly, for example, in which there are a variety of chiplets, interconnects, memories, and embedded IPs. Was there a latent defect somewhere in the device? Did a particular wafer degrade faster than another wafer due to a buildup of residue in a chamber or deeper dishing from chemical mechanical polishing? Was the photoresist contaminated by impurities in one of the materials? Was there silent data corruption due to a design or manufacturing error?

    “The whole idea of UCIe is that it leads to a marketplace of chiplets, where you can put together a heterogeneous system and package based on supplies of chiplets from multiple vendors,” said Synopsys’ Borza. “That creates the possibility that people are going to pass the buck. First, the vendor of the system and package ends up being identified as the source of a problem because there’s a vulnerability. Then, they will start trying to isolate that within their system and package. That means they’ll be looking at the individual vulnerable components of that system and package to determine where the vulnerability stems from, what the root cause is, and whether there’s somebody liable in there.”

    This will hit some industries harder than others. “Carmakers currently have functionally safe processes,” said Andy Heinig, head of department for efficient electronics in Fraunhofer IIS’ Engineering of Adaptive Systems Division. “For them, it’s only a small effort. It’s a new feature they have to check, but they have very well-established processes for quality. Companies designing for industry, or for products with a lifetime of 15 years or so, are well-equipped with processes for that. But companies that have never looked into that are really in trouble. These may be companies that develop Bluetooth or Zigbee chips for consumer applications that are very easy to penetrate.”

    Which parties are ultimately liable, and for how much, has yet to be worked out. It’s also not clear how much any of this will cost to implement, or who ultimately will pay for it. How companies tackle resiliency can vary greatly, from a secure system that does firmware updates to simple firewalling of critical data.

    “We’ve always had features for security isolation,” said Charlie Janac, CEO of Arteris. “You can have a secure and non-secure section, and you can poison data that are not allowed to be on that trace. So if we detect packets that are not supposed to be on that particular connection, we can poison them and kill them. Another approach involves functional safety, where if you detect an error and you have duplicated units that are comparing each other right at the interface of the network, then you can identify packet errors and shut down the IP that’s generating the errors. And we can analyze whether one IP is generating too many errors and take it off the network. Right now, the biggest chips are about 500 IPs. We can put firewalls on the interconnects, which are specialized IPs that detect security problems with data flows going through the trace.”
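The trace-level firewalling Janac describes can be pictured as a simple filter. The following Python sketch is purely illustrative — the class and field names are invented, not Arteris' implementation: packets on a trace are checked against an allow-list of connections, "poisoned" and dropped if they don't belong, and per-source error counts determine when an IP should be taken off the network.

```python
from dataclasses import dataclass

@dataclass
class Packet:
    src: str             # initiator IP block (hypothetical field names)
    dst: str             # target IP block
    secure: bool         # secure-world traffic flag
    poisoned: bool = False

class TraceFirewall:
    """Illustrative firewall attached to one interconnect trace."""
    def __init__(self, allowed_routes, secure_only=False):
        self.allowed_routes = set(allowed_routes)   # {(src, dst), ...}
        self.secure_only = secure_only
        self.error_counts = {}                      # per-source error tally

    def filter(self, packet):
        ok = (packet.src, packet.dst) in self.allowed_routes
        if self.secure_only and not packet.secure:
            ok = False
        if not ok:
            packet.poisoned = True                  # poison and drop the packet
            self.error_counts[packet.src] = self.error_counts.get(packet.src, 0) + 1
            return None
        return packet

    def noisy_sources(self, threshold):
        # IPs generating too many errors are candidates for removal from the network
        return [ip for ip, n in self.error_counts.items() if n >= threshold]
```

In this toy model, a DMA engine that was never granted a route to secure RAM has its packets poisoned rather than forwarded, and repeated violations flag it as a source to be shut down.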

    Nevertheless, even fully understanding what is required at any time is a challenge, given the growing number of security organizations and updates.

    “The main challenge people have had is that nothing is designed perfectly, so now we’ve given them a patch,” Baya Systems’ Nayampally said. “So what are they going to do with it? That has implications for most contracts in terms of increasing the level of liability and indemnity. Best practices will come out. But, of course, there will be other standards bodies competing.”

    Others point to similar potential for confusion. “There are updates to the specifications that are difficult to track, and all the overhead in figuring out which one your customers must have,” said Rambus’ Best. “And sometimes the standards groups start moving in a direction where the people who are contributing to the specification are not necessarily from businesses that are security-based. They’re not delivering security components into the automotive space, but they’re contributing to some degree to these automotive certifications and the standards requirements going into them.”

    That tends to favor large players over startups and small companies, because it requires a compliance team to stay on top of these updates, and a legal team to defend against any fines.

    “Only a large organization can afford to put 5 to 10 people on the certification compliance team to track all of these standards, attend all the meetings, and make sure their products are conforming to these rapidly changing standards,” Best said. “Rambus is large enough to pay that tax. We have a compliance certification team of experts that has now gone through the process to say, ‘Here’s where our products are certified now.’ But we have smaller competitors that can’t afford that level of resources.”

    It will take time to sort out exactly what is required and what is an acceptable solution. “It’s not really defined in all these specifications and documents how much you have to do,” said Fraunhofer’s Heinig. “This is similar to the functional safety discussion in automotive that we had with ISO 26262 early on. It’s really hard for all the partners to understand what it means. It took three or four years to understand which document you have to fill out, what is enough for that, what is necessary, or what is missing. This is the critical phase because nobody knows, if it goes into law, if you will need more.”

    Past, present, and future
    Wherever hardware resiliency exists today is where the perceived value of data is high enough to warrant the investment. Set-top boxes, credit cards, and automotive and industrial applications are prime examples.

    “The roots of security certification started in payments,” said Marc Witteman, senior director of device security testing at Keysight. “There’s a global organization called EMVCo. The E stands for Europay, the M for Mastercard, and the V for Visa. They set up a security certification for payment, and they also pioneered the trust models. What’s interesting is they also defined the so-called composition model, which starts with the chip. On top of the chip is an operating system, and on top of that are payment applications. So there could be a lot of combinations of chips and operating systems. It’s a composition model, and you add layers on layers on layers. Now we’re seeing that model repeating in other industries, but it’s a lot more fragmented and diverse. But if you think about the BOM and the SBOM (software bill of materials), these are relatively new concepts. People want to know what hardware and software is in this device. Then you can build a chain of trust, starting with the root of trust in the hardware.”
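The layered composition model Witteman describes — a chain of trust built layer on layer from a hardware root — can be sketched in a few lines. This is a hypothetical illustration, not EMVCo's actual scheme: each packaged layer embeds the hash of the layer above it, and a single hash anchored in hardware authenticates the bottom of the chain.

```python
import hashlib

def sha256(b: bytes) -> str:
    return hashlib.sha256(b).hexdigest()

def build_chain(layers):
    """layers: [chip firmware, operating system, application, ...].
    Returns (root_of_trust, packaged) where each packaged layer carries
    the hash covering the layer above it (hypothetical format)."""
    packaged = []
    next_hash = ""
    for blob in reversed(layers):
        packaged.insert(0, {"code": blob, "next_hash": next_hash})
        next_hash = sha256(blob + next_hash.encode())
    return next_hash, packaged   # next_hash now covers the whole stack

def verify_chain(root_of_trust, packaged):
    """Walk the chain from the hardware-anchored hash upward."""
    expected = root_of_trust
    for pkg in packaged:
        if sha256(pkg["code"] + pkg["next_hash"].encode()) != expected:
            return False          # a tampered layer breaks the chain here
        expected = pkg["next_hash"]
    return expected == ""         # chain terminated cleanly
```

Tampering with any layer — say, replacing the OS image — invalidates the hash the layer below vouched for, so verification fails at exactly that link.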

    That chain of trust is expanding. Chipmakers already have begun compliance efforts and are designing new chips and systems that include more security features and update options.

    “And security engineers who are creating the data center security based on CNSA 2.0, which starts kicking in this year, share office space with engineers at those same companies contributing to the security requirements of medical devices and wearables, thermostats, and stuff like that,” said Infineon’s Wood. “So there’s cross-pollination of the critical infrastructure requirements from CNSA 2.0 that, after a year or two, lead into IoT manufacturing and those requirements.”

    How quickly the rest of the industry will follow remains to be seen. But no matter how rocky the start, changes are coming to hardware security. Something fundamental has changed, and if regulators continue to push, hardware patches soon may become just as prevalent and persistent as software updates.

    Cloud vs. Edge Gaming: Performance Gap Is Shrinking

    Chip designers and gaming companies are scrambling to figure out whether the gaming market will tilt toward the cloud, the edge, or some combination of both. Multi-gigabit internet allows more people to play high-end games in the cloud, but edge-based gaming consoles and devices remain well-rooted, more secure, and private.

    Which one wins? So far, there are more questions than answers. Handheld devices and phones offer basic games online, and they are the most popular way people access games globally today. Edge-based consoles and dedicated devices, meanwhile, provide much more realistic action and better graphics. But there is overlap between these two worlds, as well, blurring what previously were rigid dividing lines.

    “Gaming has historically worked on edge,” said Tyrran Ferguson, director of product and strategic partnerships at Imagination Technologies. “That’s how the graphics are being rendered on the device, in your hand, or in your computer, and the compute is happening there too. Cloud gaming is slowly removing it from the edge and moving it to the cloud. Maybe we see a future where the local GPUs are less powerful, running AI workloads locally or compute workloads, and the graphics are all on the cloud or whatever it may be. But the way it has worked up until recently is only edge gaming if you want to call it that — there’s no real term for it.”

    One key difference is that there is more collaboration in cloud gaming. “This means that the gaming software isn’t just working with one individual,” said Sathishkumar Balasubramanian, head of products for IC verification and EDA AI at Siemens EDA. “You can think about a gaming ecosystem covering millions of users pinging it, and making sure that they all interact with each other, so the problem changes. The edge becomes more of a client interface, where you have a front end and some edge processing — your tactile inputs. But most of the processing gets done in the cloud, because the entire gaming ecosystem resides in the cloud.”

    Yet even though more game processing is happening in the cloud, more intelligence is moving to the edge. “For both these use cases, the chips, OSes, and loads are different,” said Balasubramanian. “On the gaming side you need to provide graphics, because it’s got a display. That means the chip needs to be smart enough, and the driver should be smart enough and powerful enough to drive that resolution. It needs to be more responsive because you’re doing real-time with what you press. It needs to have enough data. Some of the simple things in the gaming system need to be done on the edge, so the processor load is different. Based on that, the chip complexity and performance that’s needed is changing.”

    Above all, there needs to be stability and correct functionality, as well as high performance, especially for the graphics processing. “We need to run fundamental functional verification. When I turn this switch, does the light go on? But the key for gaming is, how long does that take?” said Matthew Graham, senior group director, verification software product management at Cadence. “People don’t want to buy the latest gaming GPU and find their favorite game doesn’t run on it, or it doesn’t run on the right OS, whether it is MacOS, Windows, Android, iOS, or Linux that some consoles run on. The important thing is the combination of the hardware and the software. We can’t yet run the games that they want to run pre-silicon, but we can certainly run the game workloads, analyze full HD, 4K frames, and how they’re processed by the various tools and so on.”

    Because of the need for real-time processing, the edge will always have a role to play. “If you put everything in the cloud, you lose time,” said Balasubramanian. “You want to make sure things are very fast, more redundant, instantaneous in terms of making decisions. With intelligence coming into these edge devices where you’ve got to make decisions, there’s a lot more processing needed at the edge. I’m not talking about an LLM. I’m talking about a domain-specific language model.”

    AI and language models are a key challenge. “Edge computing pushes the limits of latency, bandwidth, and power efficiency — especially as AI workloads move closer to the user,” said Steven Woo, fellow and distinguished inventor at Rambus. “The challenge is delivering real-time responsiveness while managing thermal and energy constraints in compact, distributed environments that may also be battery-powered. AI models need to be smaller than their data center counterparts, but must maintain good accuracy and efficiency to be effective.”

    In the gaming and AI space, the fast pace of change makes chip design even harder. “We’re building hardware today that’s going to be in devices in three to five years,” said Anand Patel, senior director of product management for GPUs in the client line of business at Arm. “On the gaming side, it’s our relentless push for delivering more and more performance year on year. People talk about Moore’s Law and the gains diminishing, but we’re not seeing that play out in reality. There’s a continued push on how much performance and efficiency we can get out of the chip, and double-digit gains every year. Advanced nodes are absolutely a tool. There are things we’re having to do at the system level, as well, beyond the GPU. The CPU has to deliver overall performance, then stitch all this together in the board for the memory system (typically LPDDR), memory bandwidth, and new types of memory technology — with storage, since these models can be quite large, combined with the fact that we have to store them in flash — all of this needs to move along together to ensure that we can keep up with those games.”

    Gamers’ pain is latency
    In the gaming world, latency remains the biggest concern. “What is fascinating is that cloud models are getting so powerful that the latency is acceptable,” said Dave Garrett, vice president of technology and innovation at Synaptics. “I always think it’s going to be a split compute model, with an explosion of very compact data on the edge, where we’ll render the hardest stuff on the edge, see what’s the minimum that you need to communicate between these two nodes, and that’s going to give you the latency and the speed. Lag is the thing that gamers hate the most. If we go to the heart of it, what’s the customer’s experience? They don’t care that you’re using AI. They care if the game is rendering. Is it fast? Is it glitching? That’s why the split will be permanently putting you halfway between the two domains.”

    One common application for split compute is massive multiplayer online games, where the server coordinates and aggregates the different gamers while the graphics are still rendered locally. “Technology is always a chicken-and-egg,” said Garrett. “Let’s say I get to the point where the cloud can render the whole game. Somebody’s going to go and invent a game that needs more or does more. It’s a weapons race, so you’re probably permanently in this edge/cloud split because someone’s going to find a more interesting game in the future that requires a massive next level of compute.”

    Multiplayer games are essentially a physics simulation. “The console manipulates that physical model by putting the users’ inputs into it, and then manipulates the user by firing back whatever is going on in the network around them as they’re busy losing their game,” said Mike Borza, principal security technologist and scientist at Synopsys. “All of that data, all of the storage for it, exists as part of a massive server infrastructure. A lot of times, there are physics simulations for aspects of the game. Other things get rendered differently, and then the model of the world the player is interacting with gets downloaded to their console or their gaming platform. That’s how they get to see their view of the world, which is a space and time subset of the overall game. They get the piece that’s geographically near them in the timeframe in which they’re playing this game.”
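The "space and time subset" Borza describes is commonly implemented as area-of-interest filtering: the server holds the full world state and sends each player only the entities near them. A minimal sketch, with hypothetical entity names and coordinates:

```python
import math

def area_of_interest(world, player_pos, radius):
    """world: {entity_id: (x, y)}. Return only the entities within
    `radius` of the player -- the slice of world state worth sending."""
    px, py = player_pos
    return {
        eid: (x, y)
        for eid, (x, y) in world.items()
        if math.hypot(x - px, y - py) <= radius
    }
```

A player at the origin with a 10-unit view radius receives a nearby opponent at (3, 4) but not a distant entity at (100, 100), keeping per-client bandwidth proportional to local activity rather than total world size.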

    More games on more devices
    Weighing the benefits of cloud versus edge is more than idle speculation for chipmakers and providers of gaming technology. The global games market revenue is projected to reach $522.46 billion in 2025 and grow at a 7% CAGR to $733.22 billion by 2030. Today, there are an estimated 2.2 billion users.[1] In fact, gaming is so big that Netflix’s next competitor for eyeballs on screens wasn’t other TV and movie streaming services.

    “It was the gamers,” said Imagination’s Ferguson. “So, what did they do? They bought a bunch of studios, and now you’ve got Netflix games. They spent who knows how much money getting that up and running, because they know that gaming is such a massive industry.”
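The revenue projection cited above follows from straightforward compound growth, which can be checked directly against the quoted figures:

```python
def project(start, rate, years):
    """Compound a starting value at a fixed annual growth rate."""
    return start * (1 + rate) ** years

# $522.46B in 2025 at 7% CAGR over 5 years comes to roughly $732.8B,
# in line with the quoted $733.22B for 2030 (the small gap is rounding).
projected_2030 = project(522.46, 0.07, 5)
```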

    The general trend is a consolidation of game engines, such as Unity and Epic’s Unreal Engine, across desktop PC, console, and mobile.

    “Developers don’t want to target different devices with different types of games. We’re seeing more desktop-like content running on mobile,” said Arm’s Patel. “Both gaming companies and mobile companies need to solve the problem of how to make desktop games work on mobile. We need to engage developers and the engine vendors to make it easy for them to move this stuff. You can’t just take a game running on desktop and run it on mobile. There are bits you need to change or adapt for it to be mobile or battery-powered device-friendly.”

    While advancements in GPU architecture are expected to enhance performance across all platforms, consoles and PCs will consistently outperform handheld devices in terms of raw capability due to fewer power and area limitations, said Amol Borkar, director of product management and marketing for Tensilica DSPs in the Silicon Solutions Group at Cadence. “Despite this, the latest generation of handheld gaming devices presents a distinctive combination of portability and robust performance, closely resembling the console experience. For instance, the ROG Xbox Ally enables users to play their preferred games on the move while seamlessly integrating with consoles on the same network using an optimized screen mirroring capability. This handheld system utilizes mobile-optimized processors, effective thermal management, and dynamic resolution scaling to deliver smooth gameplay without requiring extensive cooling solutions.”

    Battery-powered mobile gaming is gaining ground, but many gamers still prefer a plug. “Gaming, in many cases, is still wired,” said Michal Siwinski, chief marketing officer at Arteris. “My boys were using mobile devices, but the moment they could afford a plugged-in device with a full NVIDIA GPU rack, they did, because the performance is so much better. Gaming is getting so advanced, and the graphics are getting so sophisticated and so awesome, that you have to plug it in. It’s not a data center. It is an edge device. But it’s a wired edge versus wireless edge.”

    Wired versus wireless gaming
    Wired gaming devices have different challenges compared to wireless. “In a wired edge application, it’s not about the battery, but it’s about energy,” said Siwinski. “The problem there is you want to have the highest bandwidth, highest performance, the most compute you can, to get the best graphics. That means you’re probably going to go to the most advanced nodes to get the best density, such as TSMC 2nm or Intel 18A. The problem in the advanced nodes is that the wires are so small, and as you’re computing all of this massive machine learning stuff, you have thousands or millions of wires conveying one signal. You have billions of connection points, all of them very tiny. It comes out to the wires and becomes heat, and all of a sudden, you have a thermal problem. Networks on chips help reduce wiring. If you can have fewer wires and be much more efficient with how you connect all of these elements to have these wired devices be efficient, that’s huge.”

    Meanwhile, the suppliers of cloud gaming continue to grow rapidly, with the cloud providers becoming system houses. “They want to control the whole stack,” said Graham. “They want to build their own silicon in the places where they see it as a unique advantage.”

    A flexible SoC platform could potentially be used for multiple product segments, such as mobile, PC, gaming, and AR/VR wearables, said Gervais Fong, senior director of product marketing for mobile, automotive, and consumer interfaces at Synopsys. “It’s amortizing the very high cost of doing these designs across multiple product lines.”

    Fig. 1: A portable gaming device SoC with higher-performance ARC multi-core processors and DesignWare Audio Subsystem. Source: Synopsys

    The GPUs in a gaming system can be fully integrated into an SoC or on a separate piece of silicon. “We have companies that will do a specific, fully integrated SoC,” said Kristof Beets, vice president of product management at Imagination Technologies. “That can be an Arm CPU or RISC-V CPU. They put a GPU next to it, along with everything else they need, and it goes into a set-top box or a mobile phone. Increasingly, we’re also seeing more use in the NVIDIA-style desktop market or laptops. There, the GPU could be a completely separate piece of silicon, a separate graphics chip, so it’s graphics and memory controllers and some interfaces. Or it could be a chiplet, allowing a lower cost to scale up performance. You use one chiplet, two, three, four, and scale up the performance from there. These architectures are making their way into cloud systems, as well, with mixed success.”

    Additionally, system-level verification is needed to measure that protocols such as PCI Express, HDMI, and DisplayPort on customers’ devices are functionally correct, as well as measure the latency and throughput of the systems that are employing those protocols. “You need the correct latency, the correct throughput, with an appropriate total bandwidth, so you know that the environment or the use model that the customer is going to require enables that end-to-end,” explained Cadence’s Graham. “This applies to console manufacturers, graphics/GPU manufacturers, graphics card manufacturers, and others.”

    Upheaval expected in gaming console market
    The market most at risk of a shift in demand is the big home console, such as the PlayStation 5 or Xbox.

    “Do I really want to pay $600 or $700 and put that device in my house?” asked Beets. “Or would I prefer to pay a subscription for $25 with a tiny box, while Microsoft or Sony pay for the big servers in the cloud, providing you’ve got a fast enough connection? That’s the space where we see the most happening. Still, if you’re truly on the move, then mobile gaming doesn’t tend to work, even though mobile phones continue to get more powerful. Devices like the Nintendo Switch and other dedicated handheld consoles are a growing market, and you see a lot of PC-like devices, like the Steam Deck, that are essentially portable computers, but very focused on the gaming experience.”

    Another device that could play a bigger role in gaming is the TV operator set-top box. If it takes off, it’s an easy job to add more compute and AI accelerators to the boxes. “The AI is already doing language models, translation, and other things,” said Synaptics’ Garrett. “When I switch to gaming mode, I reallocate the AI engine so the games can be more powerful without increasing the cost.”

    Within gaming controllers, microcontroller chips play a vital role along with capacitive and touch sensors. “You open up a wider set of developers that can access MCUs, because they’re not requiring a special level of knowledge or expertise, for example, to have someone on your team who can lay out a super high-speed DRAM,” said Steve Tateosian, senior vice president of IoT, consumer, and industrial MCUs at Infineon.

    Fig. 2: A game controller featuring a programmable SoC. Source: Infineon 

    Gaming and XR wearables also feature MCUs. For example, Meta recently developed a surface electromyography (sEMG) wristband with an MCU, IMU, ADC, battery, and Bluetooth antenna, allowing people to control computers with hand gestures.

    Cloud democratizes gaming, but security needed
    For people who cannot afford high-end gaming consoles and controllers, there is great appeal in a future where standards such as Wi-Fi 7 and 6G enable even AAA games to be played on mobile devices via the cloud.

    “If you’re playing games in the cloud, then you don’t need local processing power in your device,” said Adam Hepburn, founder and CEO of Elo, a gaming peripherals company. “That comes with other issues, like you don’t own anything. You don’t have any cold storage if a company goes bankrupt or if they decide to sell your information. But in many countries, people don’t have enough resources to buy a PlayStation. They can buy a phone with a screen and a Wi-Fi signal and play the same games we do. Like in the movie ‘Ready Player One,’ users were charged 25 cents to play the ‘best game ever.’ That’s what it is going to be in the future. Gaming will be financially accessible for the entire world, which removes the limitations between the rich and the poor for this type of media.”

    To solve issues around network speeds and latency, Hepburn suggested the world needs an orbiting satellite mesh delivering internet access globally. “Once we get to that point, then cloud gaming is available.”

    However, as gaming shifts further to the cloud, there are going to be trillions of edge devices filling these enormous clouds, and that won’t be sustainable. “The ratio has to be such that we use AI to decide when to fire that up,” said Synaptics’ Garrett. “OpEx is a big deal. If I have the cloud running for millions of customers, you’re going to be drowning in cost, and the edge is a solution to that.”

    At the same time, game providers would like gaming to be on the cloud because then they control where it is and what the experience is like. “One reason gamers don’t want that to be the case is that they want their games to work whether or not they have an internet connection,” noted Synopsys’ Borza. “If they think they bought permanent access to a game, they should have that right.”

    Another reason gamers prefer the edge is privacy. “People don’t generally want everyone to know what they’re doing, but at some point, convenience trumps the knowledge that the cloud knows what I’m working on,” said Garrett. “If I’m playing this particular game, at this time of night, the cloud has awareness of your behaviors. It’s hard to mask all of that.”

    Gaming security
    Closely related to privacy is security. “Gaming is a good example of an early area in which people started to understand their need for security,” said Borza. “Originally, console manufacturers understood that well and ratcheted up their security over time. But now you’ve had this metamorphosis into massive multiplayer games, and those are cloud-based or server-based. There are active defenses built around the servers and the cloud infrastructure that hosts all of those games, and those fall back to the same kinds of techniques. You need to authenticate the users of the system. Users shouldn’t be able to escalate their privileges, even if they get into something available to them through a bug, or something available to them because they managed to steal somebody else’s credentials and break into the system. They should have several layers of security, and there are differences in the ways that operators of the system get access to the system versus the way the regular players do.”

    The operator’s data needs to be stored completely independently of where the games are being played. “That is the way a well-designed gaming platform will have it,” Borza noted. “There are separate systems firewalled from each other. People try to detect intrusions, so there’s a lot of sensor technology in the network to look for signs that somebody is roaming around in the network doing stuff they shouldn’t be doing or gaining access to things they shouldn’t be getting at.”

    Conclusion
    While the cloud will expose more people to gaming with less expensive equipment, for many gamers, the variety of equipment and gaming cards is part of the appeal.

    “It’s nice to have tactile things to touch and play with, and it’s more creative because then we’re not all sharing the same machine that’s running the same specs,” said Imagination’s Ferguson. “From that perspective, cloud gaming can remove a bit of the fun of what kind of hardware you’re running on, because we’re all running the same hardware on a data center somewhere, but it also makes it more accessible for people.”

    The global gaming industry, including games and peripherals, is bigger than the music industry, all of American sports, and all of Hollywood combined, according to Elo’s Hepburn. “It’s massive — over $250 billion. Out of that, the majority of it is mobile gaming, which was $136 billion last year in revenue. Just four years ago, it was $90 billion. A lot of that is going to be legacy games like Candy Crush. But the competitive AAA games are the fastest growing in that industry, because they can now be played on all devices.”

    Like vinyl records, CDs, cassettes, and DVDs, video gaming consoles are becoming less essential, yet it seems unlikely they will disappear. “They will complement other types of devices, including PC and smartphones,” said Arm’s Patel. “But the shape of consoles may change and adapt. It might not be consoles as we know them.”

    Reference

    1. https://www.statista.com/outlook/amo/media/games/worldwide


    ABO blood group and the risk and prognosis of diffuse large B-cell lymphoma

    Introduction

    Diffuse large B-cell lymphoma (DLBCL) is an aggressive B-cell lymphoma and the most common pathological type of non-Hodgkin lymphoma (NHL), accounting for approximately 30% to 40% of all NHL cases across different geographical regions.1,2 The median age at initial diagnosis of DLBCL is over 60 years, and 30% of patients are over 75 years old. The incidence of DLBCL increases with age.3,4 Epidemiological studies indicate that DLBCL has a complex and multifactorial etiology, including genetic characteristics, clinical features, and immune disorders, in addition to risk factors related to viruses, environment, high weight in youth, and occupational exposure.5,6 Although the prognostic significance of the International Prognostic Index (IPI) has been validated in many subtypes of NHL since 1993, its prognostic value in DLBCL remains controversial.

    ABO blood group antigens, which play an important role in the physiology and pathology of cells, are defined by carbohydrate moieties on the extracellular surface of the red blood cell membrane.7,8 Our previous research has elaborated on the relationship between ABO blood group and lymphoma, and summarized the current knowledge of the underlying pathogenic mechanisms of the association.9 It has been observed that ABO blood group is not only associated with the risk and prognosis of lymphoma, but may also be associated with the pathological classification of lymphoma patients.9 However, we did not specifically compare DLBCL with other lymphoma subtypes in our previous research. Given this background, we conducted a retrospective study specifically focusing on a representative pathological type, namely DLBCL, with the aim of investigating whether ABO blood group correlates with the risk of onset and prognosis of this disease. This study provides preliminary and exploratory evidence supporting ABO blood group as a potential biomarker for DLBCL. Its cost-effective and readily accessible nature warrants further validation in larger-scale studies, which may offer novel perspectives for future understanding of DLBCL-specific disease risk stratification and prognostic assessment.

    Materials and Methods

    We retrospectively analyzed 220 patients with newly diagnosed DLBCL at two medical institutions between January 2012 and December 2022. The research was conducted in full compliance with the guidelines set forth in the Declaration of Helsinki and obtained official authorization from the Institutional Review Board of the First Affiliated Hospital of Henan University of Science (No. 2024–1592 Fast). All patients with DLBCL participating in this study met the following inclusion criteria: (1) a diagnosis of DLBCL confirmed by specialized pathologists according to the World Health Organization (WHO) classification; (2) no prior anti-cancer treatment; (3) accessible data on ABO blood group; and (4) sufficient clinical, laboratory, and follow-up records. Exclusion criteria were: (1) DLBCL transformed from another type of lymphoma; (2) other tumors or a history of malignancy; and (3) other severe systemic diseases.

    The baseline clinical data of patients were collected, including gender, age, Eastern Cooperative Oncology Group performance status (ECOG PS), primary tumor location, extranodal invasion details (sites and count), B symptoms, treatment modalities and response, ABO blood group, Ann Arbor stage, serum lactate dehydrogenase (LDH) levels, baseline serum C-reactive protein (CRP) levels, serum β2-microglobulin (β2-MG) levels, cellular origin, and IPI score. Overall survival (OS) was defined as the time from the date of first diagnosis to either death from any cause or the last recorded follow-up date, at which point patient data were censored.

    Additionally, we randomly selected age- and sex-matched hospitalized patients as controls (case-control ratio = 1:1) from the same institutions. Controls were diagnosed with non-malignant, non-hematological, and non-immunological disorders based on surgery or other routine clinical management (eg, hernia, cholelithiasis, osteoarthritis, cataract). Computerized randomization ensured equal numbers of controls per institution relative to DLBCL cases. ABO blood group data for controls were retrieved from hospital information systems (HIS) or laboratory databases using identical procedures as cases.

    Within the DLBCL patient cohort, associations between ABO blood types and baseline clinical/laboratory variables were evaluated using the Chi-square test or Fisher’s exact test for categorical data. When performing multiple pairwise comparisons among blood groups for a given variable, the Bonferroni correction was applied, adjusting the significance level to α’ = α / [k(k-1)/2], where k is the number of blood groups, to account for all possible pairwise comparisons. The Kaplan-Meier method with the Log rank test was applied for univariate survival analysis. Variables associated with OS at P < 0.2 in univariate analysis were entered into multivariate Cox proportional hazards regression models. Hazard ratios (HRs) with 95% CIs were reported for significant predictors. A two-tailed P < 0.05 was considered statistically significant. Statistical calculations were performed with SPSS 26.0 (SPSS Inc., Chicago, IL, USA).
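    As a concrete illustration of the correction described above, the adjusted threshold for four blood groups can be computed directly. This is a minimal Python sketch; the function name is ours, not from the study.

```python
from itertools import combinations

def bonferroni_alpha(alpha: float, k: int) -> float:
    """Adjusted significance level for all k(k-1)/2 pairwise comparisons."""
    return alpha / (k * (k - 1) / 2)

# Four ABO groups (A, B, AB, O) yield six pairwise comparisons
groups = ["A", "B", "AB", "O"]
pairs = list(combinations(groups, 2))
alpha_adj = bonferroni_alpha(0.05, len(groups))
print(len(pairs), round(alpha_adj, 4))  # six tests, each judged at the adjusted alpha
```

    With k = 4, the adjusted level is 0.05/6 ≈ 0.0083, matching the threshold used in the pairwise analyses reported below.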

    Results

    Patient Characteristics

    A total of 220 patients diagnosed with DLBCL, including 101 males and 119 females, with a median age of 60 years, were enrolled in the study. The clinical characteristics of the patients are listed in Table 1. Of the enrolled patients, 166 (75.5%) exhibited an optimal performance status (ECOG PS 0–1). B symptoms were present in 76 patients (34.5%). Involvement of at least two extranodal sites was displayed by 81 patients (36.8%). Elevated LDH levels were observed in 111 (50.5%) patients. Serum CRP levels were available for 108 patients, and serum β2-MG data were available for 158 patients. Localized disease (stage I/II) was observed in 73 patients (33.2%). High-risk disease (IPI ≥ 3) was present in 79 patients (35.9%). Ki-67 antigen levels were available for 195 patients. Among patients with available cell-of-origin data, 115 (73.2%) were of the non-germinal center B-cell-like (non-GCB) subtype. The ABO blood group exhibited no significant association with patient age, gender, ECOG PS, B symptoms, the number of extranodal sites, LDH levels, CRP levels, serum β2-MG levels, Ann Arbor stage, IPI score, Ki-67 levels, or cell of origin (all P > 0.05, Table 1).

    Table 1 Basic Characteristics of DLBCL Patients in Distinct ABO Blood Type Groups

    The Effect of ABO Blood Group on Risk of DLBCL

    In the DLBCL cohort, the distribution of ABO blood types was as follows: blood type A in 66 patients (30.0%), blood type B in 56 patients (25.5%), blood type AB in 24 patients (10.9%), and blood type O in 74 patients (33.6%). A control group of 220 individuals with non-malignant conditions was randomly selected for comparison; its distribution was blood type A in 65 (29.5%), blood type B in 72 (32.7%), blood type AB in 17 (7.7%), and blood type O in 66 (30.0%). No statistically significant difference was observed in the distribution of ABO blood groups between DLBCL patients and the control cohort (P = 0.301, Supplementary Table 1).

    Upon conducting a gender-stratified comparative analysis, we identified a statistically significant disparity among female patients with DLBCL compared to the control group (P = 0.012, Figure 1). Conversely, an analysis of the ABO blood group distribution among male DLBCL patients relative to the control group revealed no statistically significant differences (P = 0.757, Figure 1).

    Figure 1 Distribution of ABO blood types among DLBCL patients and controls by gender. Significant difference observed in females (P = 0.012, chi-square test); no significant difference observed in males (P = 0.757, chi-square test).

    Abbreviation: DLBCL, diffuse large B cell lymphoma.

    In the comparison of female patients with DLBCL against female controls without the disease, the proportions of DLBCL cases were 54.5%, 34.3%, 70.0%, and 54.1% among individuals with blood types A, B, AB, and O, respectively. To account for multiple pairwise comparisons across blood groups, the Bonferroni correction was applied, yielding an adjusted significance threshold of α = 0.05/[4(4−1)/2] = 0.0083. Subsequent pairwise analysis demonstrated a significantly lower DLBCL risk in individuals with blood type B compared to blood type AB (P = 0.005, Table 2). No statistically significant differences in DLBCL risk were observed between other blood group pairs (P > 0.0083, Table 2).
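    The pairwise comparisons above rely on the Chi-square test or, for small expected counts, Fisher’s exact test. A self-contained sketch of the two-sided Fisher exact p-value for a 2×2 table follows; the example counts are purely hypothetical and are not the study’s actual case/control table.

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of every table with the same
    margins that is no more likely than the observed table.
    """
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def pmf(x):
        # P(first cell = x) under fixed margins (hypergeometric distribution)
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = pmf(a)
    lo = max(0, row1 - (n - col1))  # smallest feasible value of the first cell
    hi = min(row1, col1)            # largest feasible value of the first cell
    return sum(pmf(x) for x in range(lo, hi + 1)
               if pmf(x) <= p_obs * (1 + 1e-12))

# Hypothetical 2x2 counts (illustrative only)
p = fisher_exact_2x2(3, 1, 1, 3)
```

    On the classic [[3, 1], [1, 3]] table this returns 34/70 ≈ 0.486, the standard two-sided result.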

    Table 2 DLBCL and the Distribution of ABO Blood Groups in Females

    The Effect of ABO Blood Group on Survival of Patients with DLBCL

    By the end of the final follow-up period, a total of 77 (35.0%) patients had died. The deaths were due to tumor progression (n = 69), severe pulmonary infections (n = 5), cardiovascular disease (n = 1), and other causes (n = 2). The 3-year OS rates for the blood type A, B, AB, and O groups were 51.0%, 58.8%, 74.9%, and 74.0%, respectively (P = 0.458, Figure 2). Upon stratifying by age, we observed that among patients with DLBCL aged over 60 years, the 3-year OS rates for the blood type A, B, AB, and O groups were 32.0%, 23.7%, 87.5%, and 69.0%, respectively, a statistically significant difference (P = 0.043, Figure 3a). Because DLBCL patients with blood type B had the lowest 3-year OS rate, we categorized those aged over 60 into two groups: blood type B and non-B (A, AB, and O). Patients with blood type B demonstrated a significantly reduced 3-year OS rate compared to those with non-B blood types (23.7% vs 53.6%, P = 0.030, Figure 3b). In contrast, among DLBCL patients aged 60 years or younger, no significant difference in survival was observed between blood type B and non-B blood types, with 3-year OS rates of 83.3% and 73.7%, respectively (P = 0.196, Figure 3c). Given that the 3-year OS rates of patients aged over 60 years with A and B blood types were lower than those with AB and O blood types, we further compared blood types AB/O with blood types A/B. The analysis revealed that OS for patients with A/B blood types was significantly shorter than for those with AB/O blood types (P = 0.014, Figure 3d). Notably, the 106 DLBCL patients aged over 60 years shared a similar clinical background (all P > 0.05, Supplementary Table 2).
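    The OS rates above come from Kaplan-Meier product-limit estimates. For readers unfamiliar with the method, a minimal pure-Python sketch follows; the toy follow-up data are hypothetical, not drawn from the cohort.

```python
def kaplan_meier(times, events):
    """Product-limit estimate of S(t) at each distinct event time.

    times  : follow-up duration for each patient (eg, months)
    events : 1 = death observed, 0 = censored at that time
    Returns a list of (time, survival probability) pairs.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        total_at_t = sum(1 for tt, _ in data if tt == t)
        if deaths:
            # survival drops by the factor (1 - deaths / at-risk) at each event time
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= total_at_t  # deaths and censorings both leave the risk set
        i += total_at_t
    return curve

# Toy data: five patients, three deaths, two censored
curve = kaplan_meier([6, 12, 12, 20, 30], [1, 1, 0, 0, 1])
```

    Censored patients (the Log rank test then compares such curves between groups) reduce the number at risk without forcing a drop in the curve, which is what distinguishes this estimator from a naive death-rate calculation.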

    Figure 2 The Kaplan-Meier curves for OS in patients with DLBCL according to ABO blood type (P = 0.458 by Log rank test).

    Abbreviation: OS, overall survival; DLBCL, diffuse large B-cell lymphoma; A, blood type A; B, blood type B; AB, blood type AB; O, blood type O.

    Figure 3 The Kaplan-Meier curves for OS in patients with DLBCL according to ABO blood type. (a): OS in patients aged >60 years stratified by blood types A, B, AB, and O (P = 0.043 by Log rank test). (b): OS in patients aged >60 years comparing blood type B vs non-B (A, O, and AB) (P = 0.030 by Log rank test). (c): OS in patients aged ≤60 years comparing blood type B vs non-B (A, O, and AB) (P = 0.196 by Log rank test). (d): OS in patients aged >60 years comparing blood types A/B vs AB/O (P = 0.014 by Log rank test).

    Abbreviation: OS, overall survival; DLBCL, diffuse large B-cell lymphoma; A, blood type A; B, blood type B; AB, blood type AB; O, blood type O; A/B, blood type A and blood type B; AB/O, blood type AB and blood type O.

    Univariate and Multivariate Cox Regression Analysis

    Table 3 presents the findings from both univariate and multivariate regression analyses of potential predictors of OS in patients with DLBCL aged over 60 years. Univariate analysis indicated that Ann Arbor stage, LDH levels, IPI score, and ABO blood type were significant prognostic factors for OS in these patients (P < 0.05). Blood type B was linked to a significantly shorter OS compared with non-B blood types (HR 2.013, 95% CI 1.056–3.839, P = 0.034). In the multivariate analysis, IPI score ≥ 3 (HR 2.247, 95% CI 1.226–4.120, P = 0.009), elevated LDH levels (HR 1.890, 95% CI 1.015–3.520, P = 0.045), and blood type B (HR 2.050, 95% CI 1.069–3.933, P = 0.031) emerged as independent adverse factors for OS.

    Table 3 Univariate and Multivariate Analysis of Prognostic Factors for OS in DLBCL Patients Aged Over 60 years

    Discussion

    In the present study, we found that females with blood type B might exhibit a reduced risk of DLBCL compared to those with blood type AB. The prognostic implications of ABO blood group distinctions were not apparent across the entire cohort of DLBCL patients. However, our analysis found notable prognostic significance associated with ABO blood group specifically among DLBCL patients aged over 60 years. Among these patients, those with blood type B experienced a significantly shorter OS compared to patients with non-B blood groups.

    The ABO gene is located on chromosome 9q34 and encodes two alleles (ie, A and B) for specific glycosyltransferases that catalyze the covalent linkage of N-acetyl-D-galactosamine or D-galactose to a common precursor side chain (ie, the H antigen), eventually forming A and B antigens respectively.10,11 Unlike the A and B alleles, the O variant encodes a non-functional glycosyltransferase, so the H antigen remains unmodified.12 In recent years, researchers have found a possible association between ABO blood group and the development of cancers. Studies have indicated that individuals with blood type A may be at an increased risk of tumorigenesis, whereas those with blood type B appear to have a reduced risk.13–17 Previous investigations did not observe statistically significant results regarding the correlation between ABO blood group and the risk of DLBCL.18,19 This study provides evidence that, among female patients, individuals with blood type B may exhibit a decreased risk of developing DLBCL compared to those with blood type AB.

    Epidemiological studies have shown that the incidence of DLBCL is significantly higher among males compared to females.20 This disparity may be linked to the presence of estrogen in the female population. Studies propose that estrogen potentially exhibits antitumor properties, capable of inhibiting the proliferation and dissemination of tumor cells through a variety of mechanisms.21 It has been reported that the use of high-dose oral contraceptives for pregnancy prevention or exposure to estrogen via postmenopausal hormone replacement therapy may reduce the risk of aggressive lymphoma.22 Furthermore, treatment with estrogen receptor β agonists was shown to effectively inhibit B-cell lymphoma growth in vivo.23 These findings provide additional evidence that estrogen plays a significant role in the development and progression of lymphoma. Our study suggested that, compared to females with blood type AB, those with blood type B might exhibit a reduced risk of developing DLBCL. We hypothesize that this may be partially mediated by the higher estrogen levels typically found in individuals with blood type B, though this remains speculative in the absence of direct hormonal measurements. Further research is warranted to substantiate this hypothesis.

    Few studies have explored the prognostic relationship between ABO blood groups and DLBCL, and their results have been inconsistent. A study in Turkey revealed no significant correlation between ABO blood groups and the prognosis of patients with DLBCL.19 This is consistent with our finding in the entire DLBCL cohort; however, our subgroup analysis additionally identified blood type B as a negative prognostic factor specifically for patients older than 60 years. Osada et al reported that DLBCL patients with blood type B had a shorter OS than those with non-B blood types, and this trend was more pronounced among male DLBCL patients.18 A large-scale, population-based study of DLBCL showed that male patients had worse prognostic outcomes than female patients.24 Although our study observed similar blood-type results in DLBCL patients aged over 60 years, we did not find any relationship between gender and survival.

    The underlying mechanisms of how the ABO blood group may interact with the development and progression of cancers, including lymphoma, are still poorly understood. Several plausible hypotheses have been formulated to elucidate the link between ABO blood group and cancer risk. It is hypothesized that the absence of blood group antigen expression – particularly A and B antigens – may enhance tumor malignancy by increasing cellular motility and migration, thereby correlating with adverse clinical outcomes and poorer overall prognosis.25–27 Studies have indicated that the reduction or absence of ABO blood group antigen expression might be related to the deletion of ABO allele or relative down-regulation of the glycosyltransferase necessary for blood group antigen synthesis caused by hypermethylation of the ABO promoter region.28–32 The absence of ABO blood group antigens has been observed in hematological malignancies, including Hodgkin’s lymphoma (HL).33,34 We hypothesize that analogous mechanisms may be present in patients aged over 60 years with DLBCL, which could lead to the reduction or absence of B-type antigens, ultimately resulting in unfavorable prognostic outcomes. The glycosylation of ABO blood group antigens can lead to conformational changes in proteins that not only affect intercellular signaling, cell adhesion, and immune surveillance, but also stimulate tumor growth and metastasis.35–40 Some studies have reported that the ABO gene locus is associated with circulating levels of tumor necrosis factor-alpha, soluble intercellular adhesion molecule (ICAM)-1, E-selectin, and P-selectin.41–43 These adhesion molecules play a crucial role in the recruitment processes associated with chronic inflammation. 
Chronic inflammation is linked to tumor growth, invasion, and migration.44–46 It is also associated with lymphatic malignancies.47 For example, the lymphomas that arise in mice deficient in GM-CSF and IFNγ are caused by infections and subside after antibiotic treatment.48 Although this study did not find a significant association between ABO blood group antigens and CRP, other inflammatory cytokines may serve as intermediaries linking ABO blood group antigens to DLBCL. It is possible that ABO blood group antigens influence tumor progression and metastasis by altering the inflammatory state of the host. ABO glycosyltransferases can regulate plasma von Willebrand factor (vWF) levels, affecting the risk of venous thromboembolism.49,50 vWF plays an important role in inhibiting angiogenesis, promoting wound healing, and inducing tumor cell apoptosis; angiogenesis and apoptosis in particular are also involved in tumorigenesis.51–54 Therefore, ABO blood group may contribute to the development of tumors by regulating plasma vWF levels.9 In this study, we observed one DLBCL patient with blood type B who died from a pulmonary embolism, suggesting thromboembolic events as another potential mechanism.

    This study has several limitations. First, the retrospective design inherently restricts causal inference and may introduce unmeasured confounders. Second, the absence of data on estrogen levels precludes validation of the proposed biological hypothesis. Third, the relatively small sample size and regionally constrained recruitment limit population-level generalizability and increase susceptibility to selection bias. Last, reduced statistical power after Bonferroni correction for multiple comparisons may have obscured subtle associations between other blood groups.

    Conclusion

    In summary, our research found that females with blood type B may have a lower risk of developing DLBCL compared to females with blood type AB. Furthermore, blood type B may serve as a poor prognostic factor for patients over the age of 60 with DLBCL. To better understand the role of ABO blood groups in DLBCL, future studies in larger and more diverse populations (Asian, Caucasian, African) and across various regions are recommended.

    Data Sharing Statement

    The original contributions presented in the study are included in the article/supplementary material. Further inquiries can be directed to the corresponding authors.

    Ethics Approval and Consent to Participate

    The studies involving humans were approved by the ethics committee of The First Affiliated Hospital of Henan University of Science and Technology. The studies were conducted in accordance with the local legislation and institutional requirements. All participants confirmed their informed consent by responding to yes/no inquiries. All information collected from this study was treated with utmost confidentiality.

    Author Contributions

    All authors made a significant contribution to the work reported, whether that is in the conception, study design, execution, acquisition of data, analysis and interpretation, or in all these areas; took part in drafting, revising or critically reviewing the article; gave final approval of the version to be published; have agreed on the journal to which the article has been submitted; and agree to be accountable for all aspects of the work.

    Funding

    This work was supported by the Doctoral Research Funds of Henan University of Science and Technology.

    Disclosure

    The authors report no conflicts of interest in this work.

    References

    1. Swerdlow SH, Campo E, Pileri SA, et al. The 2016 revision of the world health organization classification of lymphoid neoplasms. Blood. 2016;127(20):2375–2390. doi:10.1182/blood-2016-01-643569

    2. Armitage JO, Gascoyne RD, Lunning MA, Cavalli F. Non-Hodgkin lymphoma. Lancet. 2017;390(10091):298–310. doi:10.1016/S0140-6736(16)32407-2

    3. Sehn LH, Salles G. Diffuse large B-cell lymphoma. N Engl J Med. 2021;384(9):842–858. doi:10.1056/NEJMra2027612

    4. Liu Y, Barta SK. Diffuse large B-cell lymphoma: 2019 update on diagnosis, risk stratification, and treatment. Am J Hematol. 2019;94(5):604–616. doi:10.1002/ajh.25460

    5. Cerhan JR, Kricker A, Paltiel O, et al. Medical history, lifestyle, family history, and occupational risk factors for diffuse large B-cell lymphoma: the interlymph non-hodgkin lymphoma subtypes project. J Natl Cancer Inst Monogr. 2014;2014(48):15–25. doi:10.1093/jncimonographs/lgu010

    6. De Roos AJ, Schinasi LH, Miligi L, et al. Occupational insecticide exposure and risk of non-hodgkin lymphoma: a pooled case-control study from the interlymph consortium. Int J Cancer. 2021;149(10):1768–1786. doi:10.1002/ijc.33740

    7. Mohandas N, Narla A. Blood group antigens in health and disease. Curr Opin Hematol. 2005;12(2):135–140. doi:10.1097/01.moh.0000153000.09585.79

    8. Storry JR, Olsson ML. The ABO blood group system revisited: a review and update. Immunohematology. 2009;25(2):48–59. doi:10.21307/immunohematology-2019-231

    9. Qin L, Gao D, Wang Q, et al. ABO blood group and the risk and prognosis of lymphoma. J Inflamm Res. 2023;16:769–778. doi:10.2147/JIR.S401818

    10. Yamamoto F. Molecular genetics of ABO. Vox Sang. 2000;78(2):91–103. doi:10.1111/j.1423-0410.2000.tb00045.x

    11. Yamamoto F, Cid E, Yamamoto M, Blancher A. ABO research in the modern era of genomics. Transfus Med Rev. 2012;26(2):103–118. doi:10.1016/j.tmrv.2011.08.002

    12. Lowe JB. The blood group-specific human glycosyltransferases. Baillieres Clin Haematol. 1993;6(2):465–492. doi:10.1016/s0950-3536(05)80155-6

    13. Poole EM, Gates MA, High BA, et al. ABO blood group and risk of epithelial ovarian cancer within the Ovarian Cancer Association Consortium. Cancer Causes Control. 2012;23(11):1805–1810. doi:10.1007/s10552-012-0059-y

    14. Sheng L, Sun X, Zhang L, Su D. ABO blood group and nasopharyngeal carcinoma risk in a population of Southeast China. Int J Cancer. 2013;133(4):893–897. doi:10.1002/ijc.28087

    15. Li X, Xu H, Ding Z, Jin Q, Gao P. Association between ABO blood group and HCV-related hepatocellular carcinoma risk in China. Medicine (Baltimore). 2016;95(49):e5587. doi:10.1097/MD.0000000000005587

    16. Li X, Xu H, Gao P. ABO blood group and diabetes mellitus influence the risk for pancreatic cancer in a population from China. Med Sci Monit. 2018;24:9392–9398. doi:10.12659/MSM.913769

    17. Huang JY, Wang R, Gao YT, Yuan JM. ABO blood type and the risk of cancer – Findings from the Shanghai Cohort Study. PLoS One. 2017;12(9):e0184295. doi:10.1371/journal.pone.0184295

    18. Osada Y, Ito C, Nishiyama-Fujita Y, et al. Prognostic impact of ABO blood group on survival in patients with malignant lymphoma. Clin Lymphoma Myeloma Leuk. 2020;20(2):122–129. doi:10.1016/j.clml.2019.09.607

    19. Ulu BU, Başcı S, Bakırtaş M, et al. Could blood groups have prognostic significance on survival in patients with diffuse large B cell lymphoma. Leuk Res. 2022;115:106810. doi:10.1016/j.leukres.2022.106810

    20. Morton LM, Wang SS, Devesa SS, Hartge P, Weisenburger DD, Linet MS. Lymphoma incidence patterns by WHO subtype in the United States, 1992-2001. Blood. 2006;107(1):265–276. doi:10.1182/blood-2005-06-2508

    21. Pierdominici M, Maselli A, Locatelli SL, et al. Estrogen receptor β ligation inhibits Hodgkin lymphoma growth by inducing autophagy. Oncotarget. 2017;8(5):8522–8535. doi:10.18632/oncotarget.14338

    22. Lee JS, Bracci PM, Holly EA. Non-Hodgkin lymphoma in women: reproductive factors and exogenous hormone use. Am J Epidemiol. 2008;168(3):278–288. doi:10.1093/aje/kwn119

    23. Yakimchuk K, Hasni MS, Guan J, Chao MP, Sander B, Okret S. Inhibition of lymphoma vascularization and dissemination by estrogen receptor β agonists. Blood. 2014;123(13):2054–2061. doi:10.1182/blood-2013-07-517292

    24. Székely E, Hagberg O, Arnljots K, Jerkeman M. Improvement in survival of diffuse large B-cell lymphoma in relation to age, gender, international prognostic index and extranodal presentation: a population based swedish lymphoma registry study. Leuk Lymphoma. 2014;55(8):1838–1843. doi:10.3109/10428194.2013.853297

    25. Hakomori S. Antigen structure and genetic basis of histo-blood groups A, B and O: their changes associated with human cancer. Biochim Biophys Acta. 1999;1473(1):247–266. doi:10.1016/s0304-4165(99)00183-x

    26. Le Pendu J, Marionneau S, Cailleau-Thomas A, Rocher J, Le Moullac-Vaidye B, Clément M. ABH and Lewis histo-blood group antigens in cancer. APMIS. 2001;109(1):9–31. doi:10.1111/j.1600-0463.2001.tb00011.x

    27. Dabelsteen E, Gao S. ABO blood-group antigens in oral cancer. J Dent Res. 2005;84(1):21–28. doi:10.1177/154405910508400103

    28. Stellner K, Hakomori S, Warner GS. Enzymic conversion of “H1-glycolipid” to A or B-glycolipid and deficiency of these enzyme activities in adenocarcinoma. Biochem Biophys Res Commun. 1973;55(2):439–445. doi:10.1016/0006-291x(73)91106-6

    29. Orlow I, Lacombe L, Pellicer I, et al. Genotypic and phenotypic characterization of the histoblood group ABO(H) in primary bladder tumors. Int J Cancer. 1998;75(6):819–824. doi:10.1002/(sici)1097-0215(19980316)75:6<819::aid-ijc1>3.0.co;2-y

    30. Iwamoto S, Withers DA, Handa K, Hakomori S. Deletion of A-antigen in a human cancer cell line is associated with reduced promoter activity of CBF/NF-Y binding region, and possibly with enhanced DNA methylation of A transferase promoter. Glycoconj J. 1999;16(10):659–666. doi:10.1023/a:1007085202379

    31. Kominato Y, Hata Y, Takizawa H, Tsuchiya T, Tsukada J, Yamamoto F. Expression of human histo-blood group ABO genes is dependent upon DNA methylation of the promoter region. J Biol Chem. 1999;274(52):37240–37250. doi:10.1074/jbc.274.52.37240

    32. Gao S, Bennett EP, Reibel J, et al. Histo-blood group ABO antigen in oral potentially malignant lesions and squamous cell carcinoma–genotypic and phenotypic characterization. APMIS. 2004;112(1):11–20. doi:10.1111/j.1600-0463.2004.apm1120103.x

    33. Scott GL, Rasbridge MR. Loss of blood group antigenicity in a patient with Hodgkin’s disease. Vox Sang. 1972;23(5):458–460. doi:10.1111/j.1423-0410.1972.tb03836.x

    34. Bianco T, Farmer BJ, Sage RE, Dobrovic A. Loss of red cell A, B, and H antigens is frequent in myeloid malignancies. Blood. 2001;97(11):3633–3639. doi:10.1182/blood.v97.11.3633

    35. Greenwell P. Blood group antigens: molecules seeking a function. Glycoconj J. 1997;14(2):159–173. doi:10.1023/a:1018581503164

    36. Pinho SS, Reis CA. Glycosylation in cancer: mechanisms and clinical implications. Nat Rev Cancer. 2015;15(9):540–555. doi:10.1038/nrc3982

    37. Stowell SR, Ju T, Cummings RD. Protein glycosylation in cancer. Annu Rev Pathol. 2015;10:473–510. doi:10.1146/annurev-pathol-012414-040438

    38. Xu Y, Chang R, Xu F, et al. N-glycosylation at asn 402 stabilizes n-cadherin and promotes cell-cell adhesion of glioma cells. J Cell Biochem. 2017;118(6):1423–1431. doi:10.1002/jcb.25801

    39. Läubli H, Borsig L. Altered cell adhesion and glycosylation promote cancer immune suppression and metastasis. Front Immunol. 2019;10:2120. doi:10.3389/fimmu.2019.02120

    40. Reily C, Stewart TJ, Renfrow MB, Novak J. Glycosylation in health and disease. Nat Rev Nephrol. 2019;15(6):346–366. doi:10.1038/s41581-019-0129-4

    41. Melzer D, Perry JR, Hernandez D, et al. A genome-wide association study identifies protein quantitative trait loci (pQTLs). PLoS Genet. 2008;4(5):e1000072. doi:10.1371/journal.pgen.1000072

    42. Kiechl S, Paré G, Barbalic M, et al. Association of variation at the ABO locus with circulating levels of soluble intercellular adhesion molecule-1, soluble P-selectin, and soluble E-selectin: a meta-analysis. Circ Cardiovasc Genet. 2011;4(6):681–686. doi:10.1161/CIRCGENETICS.111.960682

    43. Barbalic M, Dupuis J, Dehghan A, et al. Large-scale genomic studies reveal central role of ABO in sP-selectin and sICAM-1 levels. Hum Mol Genet. 2010;19(9):1863–1872. doi:10.1093/hmg/ddq061

    44. Fernandes JV, Cobucci RN, Jatobá CA, et al. The role of the mediators of inflammation in cancer development. Pathol Oncol Res. 2015;21(3):527–534. doi:10.1007/s12253-015-9913-z

    45. Singh R, Mishra MK, Aggarwal H. Inflammation, Immunity, and Cancer. Mediators Inflamm. 2017;2017:6027305. doi:10.1155/2017/6027305

    46. Greten FR, Grivennikov SI. Inflammation and cancer: triggers, mechanisms, and consequences. Immunity. 2019;51(1):27–41. doi:10.1016/j.immuni.2019.06.025

    47. Grivennikov SI, Greten FR, Karin M. Immunity, inflammation, and cancer. Cell. 2010;140(6):883–899. doi:10.1016/j.cell.2010.01.025

    48. Enzler T, Gillessen S, Manis JP, et al. Deficiencies of GM-CSF and interferon gamma link inflammation and cancer. J Exp Med. 2003;197(9):1213–1219. doi:10.1084/jem.20021258

    49. Ibrahim-Kosta M, Bailly P, Silvy M, et al. ABO blood group, glycosyltransferase activity and risk of venous thromboembolism. Thromb Res. 2020;193:31–35. doi:10.1016/j.thromres.2020.05.051

    50. Ward SE, O’Sullivan JM, O’Donnell JS. The relationship between ABO blood group, von Willebrand factor, and primary hemostasis. Blood. 2020;136(25):2864–2874. doi:10.1182/blood.2020005843

    51. Starke RD, Ferraro F, Paschalaki KE, et al. Endothelial von Willebrand factor regulates angiogenesis. Blood. 2011;117(3):1071–1080. doi:10.1182/blood-2010-01-264507

    52. Franchini M, Frattini F, Crestani S, Bonfanti C, Lippi G. von Willebrand factor and cancer: a renewed interest. Thromb Res. 2013;131(4):290–292. doi:10.1016/j.thromres.2013.01.015

    53. O’Sullivan JM, Preston R, Robson T, O’Donnell JS. Emerging roles for von willebrand factor in cancer cell biology. Semin Thromb Hemost. 2018;44(2):159–166. doi:10.1055/s-0037-1607352

    54. Ishihara J, Ishihara A, Starke RD, et al. The heparin binding domain of von Willebrand factor binds to growth factors and promotes angiogenesis in wound healing. Blood. 2019;133(24):2559–2569. doi:10.1182/blood.2019000510
