Technology

Google Doodle Celebrates 103rd Anniversary of ‘Hua Lamphong’

Bangkok Railway Station, better known as Hua Lamphong, opened its doors to travellers 103 years ago. Although Bang Sue Grand Station will eventually take over as the capital’s main rail transport hub, Hua Lamphong holds the title of Bangkok’s oldest train station, with a lavish neoclassical design featuring an iron roof and stained-glass windows, a reminder of an era when trains were the principal means of long-distance travel.

The station is formally referred to by the State Railway of Thailand as Sathani Rotfai Krung Thep in Thai (Krung Thep being the transliteration of the common Thai-language name for Bangkok) and as Bangkok Station in English. Hua Lamphong is the station’s informal name, used by foreign travellers and locals alike, and it is the name that most often appears in travel guidebooks and in the press.

In other parts of Thailand the station is usually referred to as Krungthep Station, and the name Hua Lamphong is not widely known.

During his 1907 tour of Europe, King Rama V was so impressed by Frankfurt’s railway station that he resolved to build a comparable structure for his own country. Italian architects Mario Tamagno and Annibale Rigotti echoed several details of the German station in their design, from the open-air passenger galleries at the front to the large clock on the front gable.

For over a century, Hua Lamphong has been the point of arrival in Bangkok for many visitors. The station connects to the MRT underground system, and both suburban commuter lines and the magnificent Orient Express can be caught from here. The State Railway of Thailand operates approximately 200 trains a day from the station, carrying more than 27,000 passengers, and the complex also houses a Railway History Museum.

Technology

Qualcomm Broadens Snapdragon X Series for AI-Powered PCs

Qualcomm, one of the world’s best-known chip manufacturers, has announced plans to expand its Snapdragon X Series product line. According to the announcement, the expansion introduces a new platform that will bring better performance, longer battery life, and on-device AI capabilities to more Windows PCs.

Beginning in mid-2024, original equipment manufacturers (OEMs) plan to release PCs built on both the Snapdragon X Plus and Snapdragon X Elite platforms.

The company released a statement describing the Snapdragon X Plus’s “Qualcomm Oryon CPU,” a custom integrated processor that is said to offer up to 37% faster performance than rival models while using up to 54% less power.
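
Taken together, those two headline figures imply a sizeable gain in performance per watt. The quick calculation below is only illustrative and assumes that the “up to 37% faster” and “up to 54% less power” peaks apply to the same workload at the same time, which Qualcomm’s statement does not claim.

    # Back-of-the-envelope estimate of the performance-per-watt gain implied
    # by the headline figures, assuming both peaks hold simultaneously
    # (an illustrative simplification, not a measured result).
    speedup = 1.37             # up to 37% faster than rival models
    power_fraction = 1 - 0.54  # up to 54% less power -> 46% of rival power

    implied_perf_per_watt = speedup / power_fraction
    print(f"Implied performance-per-watt advantage: ~{implied_perf_per_watt:.1f}x")
    # Prints roughly 3.0x under these simplifying assumptions.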

“With radical new AI experiences emerging in this period of rapid development and deployment, Snapdragon X Plus will power AI-supercharged PCs that enable even more users to excel,” stated Kedar Kondap, SVP and general manager of Qualcomm Technologies’ compute and gaming division.

Kondap continued, “We are once again pushing the boundaries of what is possible in mobile computing by delivering leading CPU performance, AI capabilities, and power efficiency.”

The chip is powered by the Qualcomm Hexagon NPU, which can process 45 TOPS (tera operations per second) and is said to be the fastest laptop NPU in the world. It is designed to meet the demands of on-device AI-driven applications.

Qualcomm has also released the Snapdragon 7+ Gen 3 chipset, the newest member of the Snapdragon 7-series lineup. With major improvements to both performance and features, it is positioned as the company’s most sophisticated mid-range chipset to date. One noteworthy improvement is its support for on-device generative AI models, including Baichuan-7B, Llama 2, and Gemini Nano. Compared with its predecessor, the chipset improves CPU performance by up to 15% and GPU performance by an impressive 45%.

Photography enthusiasts will find the Snapdragon 7+ Gen 3 ideal, as one of its notable features is support for capturing 200-megapixel high-resolution photos. It also offers advanced connectivity options such as Wi-Fi 7 for faster and more dependable wireless connections, and users can enjoy lightning-fast 5G speeds thanks to support for both sub-6 GHz and mmWave 5G networks.

Technology

Neura AI Blockchain Opens Public Testnet for Mainnet Development

The “Road to Mainnet” campaign by Neura AI Blockchain lays out a detailed roadmap intended to carry the project to a successful mainnet launch. This much-anticipated Layer-1 blockchain offers state-of-the-art Web3 solutions through its smooth integration of AI, Web3, and cloud computing.

Neura has started a new collection on Galxe to commemorate this accomplishment and give users the chance to win a unique Neura NFT.

Neura’s strategic plan outlines how to put the Neura Network in front of development teams eager to explore the potential of blockchain technology. With features such as an Initial Model Offering (IMO) framework and a decentralized GPU marketplace, Neura AI Blockchain addresses issues faced by many Web3 startups.

To demonstrate its capabilities, Neura has launched the AI Innovators campaign, inviting Web3 developers to participate in exchange for tempting prizes.

Rather than being merely a competitive event, the developer competition aims to showcase Neura Blockchain’s AI and platform capabilities and to support its ecosystem on the Road to Mainnet.

Neura Blockchain is at the forefront of combining blockchain and artificial intelligence in a world where both technologies are developing rapidly. With custom features designed to unlock the best AI capabilities in the Web3 space, its 2024 launch is something to look forward to.

The Road to Mainnet public testnet competition, according to Neura, will highlight important Web3 capabilities, such as making it more efficient to deploy and run AI models, encouraging user participation, and creating a positive network effect among these overlapping technologies.

Technology

Microsoft Introduces Phi-3 Mini, its Tiniest AI Model to date

The Phi-3 Mini, the first of three lightweight models from Microsoft, is the company’s smallest AI model to date.

Microsoft is exploring models trained on smaller-than-usual datasets as an increasing number of AI models enter the market. According to The Verge, Phi-3 Mini is now available on Hugging Face, Ollama, and Azure. It has 3.8 billion parameters, a figure that roughly reflects how many complex instructions a model can understand. Two more models are planned for release: Phi-3 Small and Phi-3 Medium, at seven billion and 14 billion parameters, respectively. To put things into perspective, GPT-4 is estimated to contain more than a trillion parameters.
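
Because Phi-3 Mini is distributed on Hugging Face, a minimal sketch of how such a model is typically loaded with the transformers library might look like the following. The checkpoint name microsoft/Phi-3-mini-4k-instruct is an assumption based on Microsoft’s published releases, not something stated in this article.

    # Minimal sketch: load and query a small model from Hugging Face.
    # The checkpoint id below is assumed, not taken from the article;
    # older transformers versions may also need trust_remote_code=True.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed checkpoint name
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    prompt = "In one sentence, why are small language models cheaper to run?"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=80)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))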

Released in December 2023, Microsoft’s Phi-2 model has 2.7 billion parameters and can match the performance of some larger models. According to the company, Phi-3 now performs better than its predecessor, providing responses comparable to those of models ten times its size.

Benefits of the Phi-3 Mini

Generally speaking, smaller AI models are less expensive to develop and operate. Because of their compact design, they work well on personal computers and phones, which facilitates their adaptation and mass market introduction.
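
To illustrate why a 3.8-billion-parameter model fits on consumer hardware, here is a rough back-of-the-envelope estimate of the memory its weights occupy at common precisions; the byte sizes are standard figures rather than numbers from the article, and activations and runtime overhead are ignored.

    # Rough weight-memory estimate for a 3.8B-parameter model.
    # Activations, KV cache, and framework overhead are not counted.
    params = 3.8e9

    for label, bytes_per_param in [("fp16", 2), ("int8", 1), ("int4", 0.5)]:
        gib = params * bytes_per_param / 2**30
        print(f"{label}: ~{gib:.1f} GiB of weights")

    # Roughly 7.1 GiB (fp16), 3.5 GiB (int8), 1.8 GiB (int4): small enough
    # for a laptop GPU or a recent phone, unlike trillion-parameter models.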

Microsoft has a group devoted to creating more manageable AI models, each with a specific focus. For instance, as its name implies, Orca-Math is primarily concerned with solving math problems.

Other companies are focusing on this field as well. Google has Gemma 2B and 7B, which are geared toward language tasks and chatbots; Anthropic has Claude 3 Haiku, which is designed to read and summarize long research papers (much like Microsoft’s Copilot); and Meta has Llama 3 8B, which is aimed at helping with coding.

Although smaller AI models are well suited to personal use, businesses may also find uses for them. Since internal business datasets are typically smaller, these models are ideal for in-house deployment: they can be set up more quickly, cost less, and are easier to work with.
