Mangaluru: Niveus Solutions Pvt. Ltd., an award-winning Google Cloud partner, on Thursday inaugurated its largest office in the country here in Mangaluru.

Sprawling across 16,000 sq. ft with a seating capacity of 210, the new facility is the company's largest office in India. The company has around 300 employees working from various locations in India and Singapore.

Speaking after the inauguration of the new office, actor Rakshith Shetty said he was surprised to see Mangaluru and Udupi becoming the base for a global cloud engineering organization such as Niveus. He said there was a lot of untapped potential in the region, be it in technology, medicine, or even the arts. When talent got access to the right infrastructure, there was no limit to what it could achieve, he said.

Niveus CEO Suyog Shetty said the organization has seen tremendous growth in business and operations in recent times, registering over 300% growth year on year. The Mangaluru office offers opportunities to software professionals in the region, an education hub with an immense pool of talent. The new facility is another key step towards attracting and retaining the best talent in the region, he said.

Niveus has been helping customers, including top private banks and leading asset management companies, leverage cloud technologies and harness the power of cloud services to build resilient infrastructure that scales. It recently expanded into the ASEAN region, setting up a hub in Singapore and onboarding new customers.

Niveus was founded in 2013 by Suyog Shetty, Rashmi George, Roshan Bava, and Mohsin Khan.




Sydney (PTI/The Conversation): In late November, Australia’s federal parliament passed landmark legislation banning under-16s from accessing social media.

Details remain vague: we don’t have a complete list of which platforms will fall under the legislation, or how the ban will look in practice. However, the government has signalled that trials of age assurance technologies will be central to its enforcement approach.

Video games and online game platforms are not currently included in Australia’s social media ban. But we can anticipate how enforcing an online ban might (not) work by looking at China’s large-scale use of age verification technologies to restrict young people’s video game consumption.

In China, strict regulations limit children under 18 to just one hour of online gaming on specified days. This approach highlights significant challenges in scaling and enforcing such rules, from ensuring compliance to safeguarding privacy.


‘Spiritual opium’: video games in China


China is home to a large video game industry. Its tech giants, like Tencent, are increasingly shaping the global gaming landscape. However, the question of young people’s consumption of video games is a much thornier issue in China.

The country has a deep cultural and social history of associating video games with addiction and harm, often referring to them as “spiritual opium”. This narrative frames gaming as a potential threat to the physical, mental and social wellbeing of young people.

For many Chinese parents, this perception shapes how they view their children’s play. They often see video games as a disruptive force that undermines academic success and social development.

Parental anxiety like this has paved the way for China to implement strict regulations on children’s online gaming. This approach has received widespread parental support.

In 2019, China introduced a law limiting gaming for under-18s to 90 minutes per day on weekdays and three hours on weekends. A “curfew” prohibited gameplay from 10pm to 8am.

A 2021 amendment further restricted playtime to just one hour, from 8pm to 9pm, on Fridays, Saturdays, Sundays and public holidays.

In 2023, China expanded this regulatory framework beyond online gaming to include livestreaming platforms, video-sharing sites and social media, requiring these platforms to put “systems for preventing addiction” in place.


How is it enforced?


Leading game companies in China have implemented various compliance mechanisms to adhere to these regulations. Some games have incorporated age-verification systems that require players to provide their real name and ID to confirm their age.

Some have even introduced facial recognition to ensure minors’ compliance, an approach that has sparked privacy concerns.

In parallel, mobile device manufacturers, app stores and app developers have introduced “minor modes”: a feature on mobile games and apps that limits access once a designated time limit has been reached (with an exception for apps pre-approved by parents).

A November 2022 report by the China Game Industry Research Institute – a state-affiliated organisation – declared success. Over 75% of minors reportedly spent fewer than three hours a week gaming, and officials claimed to have curbed “internet addiction”.

Yet these policies still face significant enforcement challenges, and highlight a wider set of ethical issues.


Does it work?


Despite China’s strict rules, many young players find ways around them. A recent study revealed that more than 77% of the minors surveyed evaded real-name verification by registering accounts under the names of older relatives or friends.

Additionally, a growing black market for game accounts has emerged on Chinese commerce platforms, allowing minors to rent or buy accounts to sidestep the restrictions.

Reports of minors successfully outsmarting facial recognition mechanisms – such as by using photos of older individuals – underscore the limits of tech-based enforcement.

The regulation has also introduced unintended risks for minors, including falling victim to scams involving game account sellers. In one reported case, nearly 3,000 minors were collectively scammed out of more than 86,000 yuan (approximately A$18,500) while attempting to bypass the restrictions.


What can Australia learn from China?


The Chinese context shows that a failure to engage meaningfully with young people’s motivations to consume media can end up driving them to circumvent restrictions.

A similar dynamic could easily emerge in Australia, undermining the impact of the government’s social media ban.

In the lead-up to the law being introduced, we and many colleagues argued that outright bans enforced through technological measures of questionable efficacy risk being both invasive and ineffective. They may also increase online risks for young people.

Instead, Australian researchers and policymakers should work with platforms to build safer online environments. This can be done by using tools such as age-appropriate content filters, parental controls and screen time management features, alongside broader safety-by-design approaches.

These measures empower families while enabling young people to maintain digital social connections and engage in play, activities that are increasingly recognised as vital to children’s development.

Crucially, a more nuanced approach fosters healthier online habits without compromising young people’s privacy or freedom.