News
22 Jan 2026, 16:05
XRP Payments Rise to 1.346 Million Within 24 Hours

High-frequency digital payment activity often reveals adoption and trust long before price reflects it. When transaction volumes spike, it signals that a network is actively used, not just speculated upon. XRP now demonstrates this dynamic, showcasing its increasing relevance as both a transactional medium and settlement layer. The surge was highlighted by Amonyx on X, who reported that XRP processed 1.346 million payments within a single 24-hour period. The milestone underscores the XRP Ledger’s scalability and reliability, emphasizing Ripple’s continued ability to support high-volume, real-world financial transactions.

Record Payment Volume Highlights Network Efficiency

XRP’s daily transaction figure demonstrates the network’s operational strength. The XRP Ledger processes payments in under a second and maintains minimal fees, enabling rapid movement of value across corridors. Handling over 1.3 million payments in just 24 hours confirms the ledger’s capacity to sustain high throughput without congestion or delays.

“BULLISH: #XRP payments rise to 1.346 million within 24 hours.” — Amonyx (@amonyx), January 21, 2026

Analysts note that such usage trends often precede further adoption. As more businesses, payment providers, and financial institutions integrate XRP, transaction activity may continue to scale, reflecting confidence in the network’s reliability and efficiency.

Implications for Ripple’s Ecosystem

Ripple’s infrastructure supports the widespread adoption of XRP through liquidity solutions and cross-border payment corridors. Daily transaction surges, like the one highlighted by Amonyx, indicate that participants increasingly use XRP for practical purposes rather than speculation alone. The network’s growing transactional activity reinforces its role as a foundational tool in Ripple’s payment ecosystem.

High payment volumes also carry broader significance for market perception.
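To put the headline figure in context, a quick back-of-the-envelope calculation shows the average throughput implied by 1.346 million payments in a day. This sketch assumes an even spread across the 24 hours; actual peak rates would be higher.

```python
# Average throughput implied by the reported 24-hour payment count.
# Figures are from the article; "payments" means XRPL payment transactions.
payments_24h = 1_346_000
seconds_per_day = 24 * 60 * 60  # 86,400 seconds

avg_tps = payments_24h / seconds_per_day
print(f"Average: {avg_tps:.1f} payments per second")  # ~15.6 payments/s
```

This is an average over the whole day, not the ledger's capacity ceiling, which is considerably higher.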
Increased usage signals trust among participants, attracting both retail and institutional engagement. By demonstrating consistent network performance under heavy activity, XRP positions itself as a viable alternative to slower or costlier payment methods.

Future Outlook and Adoption Trends

Sustained growth in XRP payments suggests continued momentum for the crypto within global financial systems. As Ripple expands its partnerships and liquidity offerings, daily transaction volumes are likely to rise further, reflecting increasing reliance on XRP for low-cost, near-instant transfers.

Amonyx’s report reinforces the narrative that XRP is evolving from a speculative asset into a practical, high-capacity financial tool. The milestone of 1.346 million payments in 24 hours illustrates the network’s growing utility and sets the stage for continued adoption, signaling that XRP’s influence in digital payments may expand rapidly in the months ahead. With robust throughput and expanding real-world use, XRP is proving that network adoption may become one of the clearest indicators of long-term ecosystem growth.

Disclaimer: This content is meant to inform and should not be considered financial advice. The views expressed in this article may include the author’s personal opinions and do not represent Times Tabloid’s opinion. Readers should carry out in-depth research before making any investment decisions. Any action taken by the reader is strictly at their own risk. Times Tabloid is not responsible for any financial losses.
22 Jan 2026, 15:50
Optical processors breakthrough: Neurophos secures $110M to revolutionize AI inferencing with metamaterial technology

AUSTIN, Texas — In a significant development for artificial intelligence infrastructure, photonics startup Neurophos has secured $110 million in Series A funding to commercialize optical processors that could dramatically reduce the energy consumption of AI inferencing. The company’s technology, derived from metamaterials research originally developed for electromagnetic cloaking, promises processing speeds and efficiency gains that challenge Nvidia’s dominance in AI hardware.

From invisibility research to optical computing revolution

The journey from electromagnetic cloaking to optical processors represents a remarkable evolution in materials science. Two decades ago, Duke University professor David R. Smith pioneered metamaterials research that demonstrated limited invisibility effects at microwave frequencies. Today, that same fundamental research has spawned Neurophos, a startup developing what it calls “metasurface modulators” for optical computing.

These modulators function as tensor core processors specifically designed for matrix-vector multiplication, the mathematical operation at the heart of AI inferencing. Unlike traditional silicon-based GPUs and TPUs that use electrical signals, Neurophos’s optical processing units (OPUs) manipulate light to perform calculations. This approach offers several theoretical advantages, including reduced heat generation and faster signal propagation.

The $110 million funding round and strategic investors

Neurophos’s recent funding round attracted significant attention from major technology investors. Gates Frontier, Bill Gates’s venture firm, led the investment, with participation from Microsoft’s M12 fund, Carbon Direct, Aramco Ventures, Bosch Ventures, Tectonic Ventures, and Space Capital.
This diverse investor group reflects broad industry recognition of the potential impact of optical computing on AI infrastructure. Dr. Marc Tremblay, corporate vice president and technical fellow of core AI infrastructure at Microsoft, emphasized the technology’s importance in a statement: “Modern AI inference demands monumental amounts of power and compute. We need a breakthrough in compute on par with the leaps we’ve seen in AI models themselves, which is what Neurophos’s technology and high-talent density team is developing.”

Technical specifications and competitive advantages

Neurophos claims its optical processors achieve remarkable performance metrics compared to current market leaders. According to company data, its chip operates at 56 GHz, delivering 235 peta-operations per second (POPS) while consuming just 675 watts. In contrast, Nvidia’s B200 AI GPU reportedly delivers 9 POPS at 1,000 watts, a significant efficiency gap.

Performance comparison: Neurophos vs. Nvidia

Metric             | Neurophos OPU   | Nvidia B200 GPU
Processing speed   | 56 GHz          | Not specified
Peak performance   | 235 POPS        | 9 POPS
Power consumption  | 675 watts       | 1,000 watts
Energy efficiency  | ~0.35 POPS/watt | ~0.009 POPS/watt

The company’s key innovation lies in the miniaturization of optical components. Traditional optical transistors face significant size limitations, but Neurophos claims its metasurface modulators are approximately 10,000 times smaller than conventional optical components. This miniaturization enables the company to fit thousands of units on a single chip, dramatically increasing computational density.

Solving photonic computing’s traditional challenges

Photonic computing has long promised advantages over silicon-based electronics, but several persistent challenges have limited commercial adoption.
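The per-watt figures in the comparison can be derived directly from the claimed peak performance and power draw. A minimal sketch, using only the numbers Neurophos and press reports have cited:

```python
# Per-watt efficiency implied by the article's comparison figures.
# POPS = peta-operations per second; all values are the companies' claims.
chips = {
    "Neurophos OPU": {"pops": 235, "watts": 675},
    "Nvidia B200":   {"pops": 9,   "watts": 1000},
}

for name, spec in chips.items():
    eff = spec["pops"] / spec["watts"]  # POPS per watt
    print(f"{name}: {eff:.3f} POPS/watt")
```

On these figures the per-watt gap works out to roughly 39x; the 50x improvement quoted elsewhere in the piece presumably reflects a different baseline or metric.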
Optical components typically require:

- Larger physical footprints than silicon transistors
- Complex data conversion between optical and electronic domains
- Specialized manufacturing processes incompatible with existing foundries

Neurophos addresses these issues through its metasurface technology. The company’s chips can reportedly be manufactured using standard silicon foundry materials, tools, and processes. Additionally, the reduced size of its optical components minimizes the need for frequent data conversion between domains, improving overall efficiency.

Dr. Patrick Bowen, CEO and co-founder of Neurophos, explained the technical approach: “When you shrink the optical transistor, you can do way more math in the optics domain before you have to do that conversion back to the electronics domain. If you want to go fast, you have to solve the energy efficiency problem first.”

Market timing and competitive landscape

Neurophos enters a rapidly evolving AI hardware market dominated by Nvidia, which currently supplies the majority of GPUs powering AI training and inference. However, the company positions its technology as complementary rather than directly competitive, focusing specifically on inference workloads where energy efficiency matters most.

The startup acknowledges it faces competition from other photonics companies, though some competitors, like Lightmatter, have shifted focus to optical interconnects rather than processing units. Neurophos expects its first chips to reach the market by mid-2028, giving the company several years to refine its technology while the AI hardware market continues to expand.

Bowen remains confident about the company’s competitive position: “What everyone else is doing, including Nvidia, in terms of the fundamental physics of the silicon, it’s really evolutionary rather than revolutionary.
Even if we chart out Nvidia’s improvement in architecture over the years, by the time we come out in 2028, we still have massive advantages.”

The environmental imperative for efficient AI

The timing of Neurophos’s technology development coincides with growing concerns about AI’s environmental impact. Recent studies indicate that data center energy consumption could double by 2026, driven largely by AI workloads. More efficient processing hardware represents a critical component of sustainable AI development.

Neurophos’s optical processors could significantly reduce the carbon footprint of AI inference, which constitutes the majority of computational workload for deployed AI systems. The company claims its technology offers 50x improvements in both energy efficiency and raw speed compared to Nvidia’s current Blackwell architecture.

Implementation roadmap and commercial strategy

The $110 million funding will support several key initiatives over the coming years. Neurophos plans to develop its first integrated photonic compute system, including datacenter-ready OPU modules, a complete software stack, and early-access developer hardware. The company is also expanding its physical presence with a new engineering site in San Francisco and an expanded headquarters in Austin, Texas.

Neurophos has already signed multiple customers, though it has not disclosed their identities. Microsoft is reportedly “looking very closely” at the startup’s products, suggesting potential integration with Azure’s AI infrastructure.
The company’s technology could eventually benefit various applications, including:

- Large language model inference for chatbots and content generation
- Computer vision systems for autonomous vehicles and surveillance
- Scientific computing requiring massive matrix operations
- Edge AI applications where power constraints are critical

Conclusion

Neurophos’s $110 million funding round represents a significant vote of confidence in optical computing’s potential to transform AI infrastructure. The company’s metamaterial-based optical processors promise unprecedented efficiency gains for AI inferencing, addressing both performance demands and environmental concerns.

While commercial availability remains several years away, the technology could eventually challenge silicon’s dominance in high-performance computing. As AI models grow increasingly complex and energy-intensive, innovations like Neurophos’s optical processors may prove essential for sustainable AI development.

FAQs

Q1: What are optical processors and how do they differ from traditional chips?
Optical processors use light rather than electricity to perform computations. They offer potential advantages in speed and energy efficiency because light generates less heat, travels faster, and is less susceptible to electromagnetic interference than electrical signals.

Q2: How does Neurophos’s technology relate to metamaterials research?
Neurophos’s optical processors use metasurface modulators derived from metamaterials research originally developed for electromagnetic cloaking. These artificial materials manipulate light in ways natural materials cannot, enabling miniaturized optical computing components.

Q3: When will Neurophos’s optical processors be commercially available?
The company expects its first chips to reach the market by mid-2028. The current funding will support development of datacenter-ready modules, software stacks, and early-access hardware over the next several years.
Q4: How do optical processors address AI’s environmental impact?
AI inference consumes substantial energy in data centers. Neurophos claims its optical processors offer 50x improvements in energy efficiency compared to current GPUs, potentially reducing the carbon footprint of AI applications significantly.

Q5: What challenges does photonic computing face compared to traditional silicon?
Traditional photonic components are larger than silicon transistors, require frequent data conversion between optical and electronic domains, and have faced manufacturing challenges. Neurophos addresses these through miniaturized metasurfaces compatible with standard silicon foundry processes.

This post first appeared on BitcoinWorld.
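The matrix-vector multiplication that Neurophos's OPUs are built to accelerate is a simple operation to state, even though running it at scale dominates inference cost. A toy illustration, with made-up weights and inputs (nothing here is from Neurophos):

```python
# Toy matrix-vector multiply: the core operation OPUs target in AI inference.
# Weights and inputs below are illustrative placeholders.
def matvec(matrix, vector):
    """Return matrix @ vector using plain Python lists."""
    return [sum(w * x for w, x in zip(row, vector)) for row in matrix]

weights = [[1, 2], [3, 4], [5, 6]]  # a tiny 3x2 "layer"
inputs = [10, 1]

print(matvec(weights, inputs))  # [12, 34, 56]
```

Every neural-network layer reduces largely to repetitions of this pattern at far larger sizes, which is why hardware that performs it faster per watt matters so much for inference.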
22 Jan 2026, 15:25
Binance Wallet Unveils Three AI Tools to Simplify Web3 Market Research

Binance Wallet has introduced three new AI-powered features designed to help users better navigate the fast-moving Web3 market. The update aims to reduce information overload by transforming large volumes of on-chain and social data into more accessible insights, making crypto research easier to manage.
22 Jan 2026, 15:01
Julie Sweet calls data centers a critical national asset for AI growth at Davos

Accenture CEO Julie Sweet today called on governments worldwide to place data center infrastructure at the core of their national artificial intelligence (AI) strategies, stressing that computational capacity must keep pace with rapid AI innovation. Speaking on the sidelines of the World Economic Forum’s annual Davos meeting, Sweet highlighted that data centers, the physical backbone where AI systems compute and store data, are no longer a technical afterthought but a critical national asset. She told reporters that if countries want to harness the transformative power of AI, they must treat data centers as central infrastructure, like roads or power grids.

“Data centers shouldn’t just be an afterthought in national AI plans,” Sweet told delegates and media. “They are the foundation of secure, scalable, and sovereign AI deployment that will drive future economic growth and protect data privacy.”

Accenture has been training staff in new tech skills for years

Earlier this month, Accenture struck a deal to buy Faculty, a British artificial intelligence company, as it seeks to position itself as a technology frontrunner. The firm has also been retraining staff in new technologies and, since 2019, phasing out employees who fail to adapt. Sweet shared, “Technology is a basic skill now. And once you start making that investment, people get less scared about their jobs because they know you’re invested in making sure they can do the next jobs.”

Sweet also pointed out that designing new entry-level jobs requires educational support and cooperation with schools and universities, and that such efforts are still in the initial stages. Her statements come in the wake of Nvidia CEO Jensen Huang’s Davos comments that the AI surge will allow plumbers, electricians, and construction workers to earn six-figure incomes building data centers.
Sweet also said that the critical error for CEOs is treating AI as a target in its own right rather than aligning it with their company’s goals, noting that business strategy should come first.

Business leaders see AI as more than just a cost-cutting tool

Sweet also told reporters on the sidelines of the World Economic Forum in Davos that an increasing number of executives now view AI as a growth driver rather than merely a cost-reduction tool, adding that she is optimistic about agentic commerce and other AI applications. A Pulse of Change survey by Accenture Research found that most leaders see AI as a tool for growth, with revenue gains outweighing its cost-saving potential.

She asserted, “Companies are led by humans, and they will win by tapping into human creativity,” adding that the best AI future would use technology as a tool rather than relegating people to a supporting role. Sweet argued that being a “human in the loop” isn’t particularly inspiring for people.

In Accenture’s survey, 83% of non-executive employees believe their companies would continue investing in AI in ways that benefit both staff and business results, regardless of an AI downturn. However, the survey found that only one in five respondents feels they are actively helping shape how AI changes work. Even fewer, 17%, said they enjoy using it and exploring new applications. Sweet also acknowledged that while many employees use AI in their personal lives, there is still considerable anxiety about it in the workplace.

Late last year, the Accenture executive had insisted that it would take at least a few years for most companies to move beyond the slow, hard phase of artificial intelligence. At the time, she explained that most business leaders and processes need to change to support the technology.
22 Jan 2026, 12:50
Bitcoin diamond hand BTC selling not ‘repeat of 2017, 2021,’ research warns

Bitcoin long-term holders of two years or more broke records during 2024 and 2025, says a new analysis of the latest bull market.
22 Jan 2026, 12:46
Gen Z Americans Trust Crypto More Than Banks, Seeking ‘Agency and Control’

Consumer research shows trust for crypto is shaped by control and access, with their habits now shaping policy moves on housing.