The Patient Protection and Affordable Care Act (ACA), commonly known as Obamacare, represented the most significant overhaul of the US Healthcare system since the passage of Medicare and Medicaid in 1965.
The ACA’s future may be uncertain at the moment, but the story of its implementation offers essential lessons for ICT officials pursuing digital transformation agendas. Notwithstanding certain unique characteristics, many of the challenges the ACA faced recur in the implementation of massive, complex projects. Some of those challenges are internal, some external.
The centrality of IT to business, whether government or otherwise, means that ICT projects do not happen in a vacuum. They are susceptible to factors beyond the control of ICT executives, such as political headwinds, expanding the range of contingencies to be taken into account.
A central component of the ACA is HealthCare.gov, the website that offers insurance through exchanges operated by the federal government, serving residents of U.S. states without their own state exchanges.
ICT requirements included integration of a dizzying array of disparate data from national and state government entities and private and public sector players. It would be called on to process thousands of complex transactions per second. Hundreds of thousands of users would concurrently visit the website. Enrolling, shopping for plans and applying for tax credits would have to be a seamless experience for the end-user. Later on, applications would have to be reconfigured or replaced, without incurring an increase in database interference, data loss or security risks. All this, while guaranteeing security for highly sensitive data.
OpenGov spoke to Henry Chao, former Deputy CIO & Deputy Director of the Office of Information Services, Centers for Medicare & Medicaid Services (CMS). He was at the centre of it all, building HealthCare.gov from the ground up.
Mr. Chao spent around 21 years at CMS. He served as Chief Technology Officer from December 2007 until July 2010, then moved to the office created to handle many of the critical provisions of the ACA, including the insurance exchange.
The beginning
Transforming healthcare was part of President Obama’s election platform. Work groups were established to look at potential implementation options during the early days of his administration.
The objective was to provide affordable access to healthcare for the uninsured. There were two broad options under consideration. The first one, called the public option, originated in the US House of Representatives.
The second approach, involving state-based exchanges, originated in the US Senate. Ultimately, the House and the Senate reached consensus on the latter, and the ACA was signed into law by President Obama in March 2010. The intent was for each state to have a health insurance marketplace, or health exchange. Within three to four weeks, an office called the Center for Consumer Information & Insurance Oversight (CCIIO) was established.
CCIIO was set up within the United States Department of Health & Human Services (HHS), whose secretary was the lead cabinet executive overseeing the implementation.
Implementation starts – Uncertainties
Number of states and co-operation between federal and state agencies
Even though nearly all the states took federal grant money to work on the project, until early 2013 there was no firm count of how many were committed to starting their own exchange. This made it hugely challenging to understand the scope of the project.
Mr. Chao said, “It was not going to be zero and it was certainly not going to be 50. The fear was that with all the politics involved, the number of states that choose to stand up their own marketplace and be able to do so would be a very small number. It turned out that in 2013, only about 14 states and DC opted to have partially working exchanges.”
Due to political opposition, some states passed laws preventing their administrations from working on Obamacare. Federal agencies operate the exchange on behalf of those states, serving their uninsured residents.
For an applicant applying on behalf of a household, eligibility for family members has to be coordinated across three programmes: the insurance marketplace, the Medicaid programme in that state, and the State Children's Health Insurance Program (SCHIP) in that state.
Even where states are uncooperative or not operating their own exchange, there still has to be a high degree of coordination among the three programmes.
Converting abstract laws into specific regulations
The law was a conceptual rendering of what an insurance program would look like for the uninsured and the macro steps required.
CMS had to write several dozen regulations that specified in detail how the programme would operate. For instance, one regulation set out the parameters for participating in the marketplace as a qualified insurer. Another covered eligibility for premium tax credits and how they would be coordinated with Medicaid and SCHIP. The regulations came out at different times: the earliest in 2011, the latest in the middle of 2013.
"You have to keep track of all those requirements, from the highest to the lowest level, to get your IT scope correct in terms of business and functional capability,” Mr. Chao explained. Political complications hampered the process of publishing the regulations and obtaining public comments.
Budget
The ACA was passed with about a billion dollars appropriated to support the rollout of the insurance marketplace. It did not specify who would get that money, and several departments with expanded responsibilities competed for the funds. For example, the Treasury and the Internal Revenue Service (IRS) had to administer the premium tax credit.
In Mr. Chao’s words, “A billion dollars was a drop in the bucket for something of this magnitude. The IRS and Treasury took more than half of that billion dollars. Funding was the first challenge. We didn’t get all the money at once. Whatever money we were getting for the first two years was in the form of monthly stipends. You can’t award a 100-million-dollar contract when you are only getting a million dollars a month.”
Through the starts and stops, around USD 380 million was directed towards IT development during the first three years. All these variables meant that the team had to build systems that could tolerate that kind of volatility, and it had to be done using commodity models.
CMS selected an infrastructure-as-a-service model, as it could be scaled up over time as required.
Requirements
Acronyms: HIGLAS– Healthcare Integrated General Ledger Accounting System, SSA- Social Security Administration, IRS- Internal Revenue Service, DHS– Department of Homeland Security, DHA– Defense Health Agency, VA– Department of Veteran Affairs, NAIC – National Association of Insurance Commissioners, SERFF– System for Electronic Rate and Form Filings. DOI- Department of Insurance
Estimating user volume – model for verification, determination and enrolment
Users would include a potential 7 million customers buying insurance during the first year, plus healthcare providers and the people who work for providers, labs and hospitals, adding up to millions more. All users would have to be authenticated, information collected from them and the right outputs provided. A summary design and architectural principles for such a large-scale, complex system had to be decided.
After the launch there would be additional labour considerations, such as the call centre staff required when consumers know very little about the product.
Requirements derived from government in general – security and privacy
Key federal government stakeholders for the Insurance Marketplace programme (also known as the Insurance Exchange) were the departments and agencies whose programmes, and the associated data, were needed to perform verifications of status and eligibility for enrolment. These “systems of record” serve as the authoritative sources for essential data and were connected in real time to facilitate the online process of applying for enrolment in a Qualified Health Plan (QHP) offered by the Insurance Marketplace.
This included the US Treasury Department and the Internal Revenue Service. In addition to being the authoritative source for federal tax filings, the Treasury and IRS were assigned the responsibility for administration of the premium tax-credits, which are meant to subsidize or offset the cost of monthly premiums.
The Department of Homeland Security (DHS) was involved for verification of lawful presence in the US under immigration and naturalization laws and the Social Security Administration (SSA) for valid Social Security number, lawful presence in the US by way of marriage or birth, and other sources of income such as disability payments. The list went on to include the Medicare Program, Veterans Administration, Department of Defense, Office of Personnel Management, and the Peace Corps.
Each agency and program implements Federal laws and regulations relating to security and privacy a little bit differently under a Risk-Based Framework. In addition, there are the state agencies and programs that are also part of the overall ACA ecosystem.
CMS took the lead to create a “common controls catalogue” and a companion “harmonised privacy security framework” so that the different agencies could agree on the overall framework within which they could share data and connect to systems that furnish that data, down to the state level. Those umbrella agreements took about 17 months to negotiate, create and get signed by the agencies’ CIOs.
Converting a 90-day underwriting process to real-time
In the pre-ACA days, the premium you were quoted while purchasing insurance online was subject to change during the underwriting process. ACA provides for "guaranteed issuance" of healthcare coverage, regardless of "pre-existing conditions”.
Guaranteed issuance served as the imperative to transform the non-real-time business model for the Individual and small employer insurance marketplace into a real time, online experience in applying for, purchasing, and confirming coverage.
The previous model of shopping and purchasing health insurance that had been in place for decades needed to dramatically and swiftly change (months versus years).
HealthCare.gov, while perceived by many to be “just a website”, really represented one of the most significant transformations of healthcare access and delivery: a business model demanding that the consumer-facing transactions of verifying eligibility, evaluating eligibility for premium tax-credit assistance, and facilitating the comparison and selection of a health plan, with guaranteed issuance at a concrete price, all be conducted in real time.
Mr. Chao said that for properly framing the discussion, we need to look at programmes like Medicare and Medicaid, which have been running for nearly 50 years. Then we can see that implementation of Healthcare.gov is not the final destination, but rather a junction in the journey towards improving the health and wellness of a nation.
Effectively, HealthCare.gov had to become a centralised booking and reservation system, guaranteeing issuance of healthcare coverage at the premium you will be billed once your coverage starts. To do this, the system has to know about all the federal and state insurance availability, conduct tests and checks of eligibility, and move candidates to the right programme. It was a huge business process change.
This was similar to how you would get a quote for an airline seat say on a travel aggregator website. The price you see is what you pay. All costs have to be known at the time of purchase.
Mr. Chao used the analogy of Sabre, the booking system used by 425,000 travel agents and 400 airlines to keep track of seats sold directly or through third parties. The Sabre system represents an inventory of the supply of all airline seats.
The insurance companies were bringing in over 1,600 different types of products, which had to be organised and structured. Insurance companies supply the data to the marketplace programme, and the marketplace renders it with the right premium and the potential premium offsets, if the consumer applies for financial assistance, and then displays the correct output.
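The pricing step described above — take the insurer-supplied plan premium, apply any premium tax-credit offset, and display the net price — can be sketched in a few lines. This is purely illustrative: the function name and figures are assumptions, not HealthCare.gov's actual code.

```python
# Hedged sketch of the marketplace display logic: quoted premium
# minus any applicable monthly tax credit, floored at zero.
# All names and dollar amounts here are illustrative.

def display_premium(plan_premium: float, monthly_tax_credit: float = 0.0) -> float:
    """Net monthly premium shown to the consumer."""
    if plan_premium < 0 or monthly_tax_credit < 0:
        raise ValueError("amounts must be non-negative")
    return round(max(plan_premium - monthly_tax_credit, 0.0), 2)

# A consumer with a $320 plan and a $210/month credit sees $110.
print(display_premium(320.00, 210.00))  # 110.0
```

The point of the real system was that this net figure had to be computed and guaranteed at shopping time, not adjusted later during underwriting.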
Simple and intuitive end-user interface
With all the premiums, co-insurance, co-pays and deductibles, understanding health insurance is not easy under the best of circumstances, even for people who have always had insurance.
Now they were targeting people who had had little to no contact with insurance. It had to be as easy as possible for them to understand the product. For example, understanding that a high-deductible plan might be good if you are young and healthy, because you get a low monthly premium. But if you ride motorcycles, it would be a gamble: if you get into an accident, you might have to pay the first 10,000 out of your own pocket.
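The motorcycle example comes down to simple arithmetic: a year's cost is twelve premium payments plus whatever medical bills the consumer pays before the deductible is met. A minimal sketch with hypothetical figures (cost-sharing above the deductible is ignored here):

```python
# Illustrative trade-off behind the high-deductible example.
# annual cost = 12 * monthly premium + bills paid out of pocket,
# where out-of-pocket spending is capped by the deductible.

def annual_cost(monthly_premium: float, deductible: float,
                medical_bills: float) -> float:
    """Total yearly outlay under simplified assumptions."""
    return 12 * monthly_premium + min(medical_bills, deductible)

# Healthy year: the cheap high-deductible plan wins.
print(annual_cost(150, 10_000, 0))        # 1800
# Accident year: the same plan costs far more than a low-deductible one.
print(annual_cost(150, 10_000, 15_000))   # 11800
print(annual_cost(400, 1_000, 15_000))    # 5800
```

The premiums and deductibles above are invented; the point is only that the "gamble" is quantifiable.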
Acronyms: DSH- Data Services Hub, FFM- Federally-Facilitated Marketplace, QHP- qualified health plan, OCIIO– Office of Consumer Information and Insurance Oversight
The process of IT design
When the requirements are sketchy and are going to take a long time to be finalised, if ever, it becomes critical to understand how much of the system is needed from day one. The team had to stagger and prioritise correctly, and find the optimum sequence in which to develop capabilities.
Mr. Chao talked about how the initial systems were built on open source but gradually licensed support was brought in. He said, “Up the stack, when you are looking at the OS, applications and software, you can start with open source because of its relatively low cost. But when you are putting code into production and that production system is supporting critical programs, the debate about open source becomes nonsensical, because it is not about whether it is open source or not. It is about whether someone will support it. If it has an issue, will there be someone on site to work with you to fix it? Is it secure, or do you have multiple versions of it being shared by many around the world? Security and supportability are the two key aspects of why open source is not a meaningful topic after you are in production.
We chose Red Hat Enterprise Linux as the OS. We started initially with the open source. But as we started building our environments for development and testing, we needed Red Hat’s enterprise licensing.”
Mr. Chao said that they received many requests asking why they didn’t put the website code out for everybody to use, so that each state could build its own. He cited security concerns: it might have led to hundreds of ghost sites mimicking the real one and trying to take people’s personal information.
Managed service type models were negotiated, rather than direct purchase of various units of software.
The initial launch – What went wrong?
Mr. Chao said, “We had 42 months. But we didn’t have the majority of the requirements for eligibility and enrolment till February of 2013, leaving us with 9 months to actually build the system.”
The administration wanted an early launch in order to market the site. In June 2013, a Beta version of the website was launched. It did not have all the application functionality, but users could register and enter their email addresses to receive updates.
The team had to spend a lot of time on this early Beta launch, at the expense of preparing for the actual launch.
At launch, the team could see a lot of application errors. The enterprise identity management system for authentication, including remote identity proofing, did not scale as intended. The estimate was that it would handle around 50,000 concurrent users, but it was showing significantly diminished performance at 4,000-6,000 users during the first evening of the rollout.
On the evening of September 30th and early on October 1st, there was huge traffic coming from around the world.
The first step was to register, create an account and be authenticated. That part was operating at a fraction of its intended capacity. In addition, traffic was one to two hundred times the anticipated volume.
Within the first hour, a sort of waiting room was created, holding people until others moved out of the application. It was like the crowd parked outside a store, waiting for it to open on Black Friday. Not everybody actually wanted to shop: press from around the world wanted to see what the site looked like on Day 1, flooding the network and the website.
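The "waiting room" described above is a classic admission-control pattern: cap the number of users inside the application and queue the overflow in arrival order. A minimal sketch of the idea, not the actual CMS implementation:

```python
from collections import deque

class WaitingRoom:
    """Admit users up to a capacity; queue the rest in arrival order."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.active = set()    # users currently inside the application
        self.queue = deque()   # users waiting for a slot

    def arrive(self, user: str) -> str:
        if len(self.active) < self.capacity:
            self.active.add(user)
            return "admitted"
        self.queue.append(user)
        return "waiting"

    def leave(self, user: str) -> None:
        self.active.discard(user)
        if self.queue:                      # admit the next in line
            self.active.add(self.queue.popleft())

room = WaitingRoom(capacity=2)
print(room.arrive("a"), room.arrive("b"), room.arrive("c"))  # admitted admitted waiting
room.leave("a")
print("c" in room.active)  # True
```

In production such a gate would sit in front of the application tier, with the capacity tuned to what the identity and eligibility systems could actually sustain.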
By the second week, traffic began to die down and it was mostly from the continental US. But a large number of people were frustrated that they could not get through.
The key issue was the lack of adequate testing. Mr. Chao elaborated, “In the face of inflexible launch dates and limited resources, testing suffers, because you are spending all your time validating changing requirements, coding to those requirements and doing low-level tests of those requirements.
You haven’t yet done the integration testing with multiple business partners, or the volume and performance testing to see how many users the system can handle. You have to wait for a stable code base to do those. If you try to run a performance test and you haven’t got the error handling down correctly, you end up chasing multiple problems, which makes it much harder to identify where to fix the code.
In essence, you end up testing in production. The first release of the system begins to show you, in live interaction with the users, exactly what’s wrong with the system.”
That’s what happened. By the third month it started getting better. Problems were narrowed down. Multiple instances of the system were built, which could run in parallel at the application level. The team began to learn how to tune and optimise it.
Lessons – Define the business problem
Mr. Chao said the key lesson was to define the business problem before looking for solutions.
It is important to understand the programme, policies, implementation and expected objectives on Day 1 and Day 1000. Jumping to conclusions about databases or application frameworks without knowing anything about the overall programme, the information it collects, or the criticality and sensitivity of that data is useless.
Mr. Chao said that you can have only two of the troika of Time, Scope and Money. You cannot win on all three. For instance, you can’t expect to have no time, less money and increased scope.
Once you understand the business problem, the business people who are driving the schedule can suddenly see that what they are asking for may not be achievable in the given context.
It is almost prosaic, but this frequently gets thrown out of the window as soon as passionate business or policy people in the room say that it has to be in place by this date and it has to do these things, and how to make it happen becomes somebody else’s problem.
Good leadership in this case means seeing the writing on the wall, admitting the constraints and taking appropriate steps to mitigate them. This involves a pragmatic approach, continuously managing risks and choosing technology solutions appropriate to the problem.
Henry Chao was the keynote speaker at the OpenGov Leadership Breakfast Dialogue on ‘The Big in Big Data – Managing the unmanageable’ in Singapore on the 10th of November, presented by MarkLogic.
The Cyber Security Agency of Singapore (CSA) recently unveiled the pivotal insights gleaned from its inaugural Singapore Cybersecurity Health Report 2023. Conducted between May and August of the previous year, the survey canvassed the opinions of 2,036 organisations spanning various sizes and sectors.
The objective was to gauge the landscape of cybersecurity readiness across local entities and inform CSA’s strategic initiatives. The importance of bolstering cybersecurity resilience within these organisations cannot be overstated, as they play a critical role in shaping the digital experiences of Singaporeans through their services and products.
The findings unveiled a mixed landscape: while the majority of organisations demonstrated an awareness of cybersecurity imperatives, there remains substantial room for improvement in adoption rates. On average, organisations reported implementing around 70% of essential cybersecurity measures across various categories. Additionally, a significant proportion, approximately 75%, acknowledged CSA’s cybersecurity certification programmes, Cyber Essentials and Cyber Trust, which serve as national standards for prioritising cybersecurity measures.
Despite these positive indicators, CSA sounded a cautionary note, emphasising the inadequacy of partial adoption. Without the full spectrum of essential measures, organisations remain vulnerable to unnecessary cyber risks. Alarmingly, only a third of organisations had fully implemented at least three of the five categories outlined in Cyber Essentials. This underscores the urgency for comprehensive adoption to fortify cybersecurity posture effectively.
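The report's two headline figures — average adoption of essential measures, and the share of organisations fully implementing at least three of the five Cyber Essentials categories — are straightforward aggregates over survey responses. A sketch with invented data (the per-category fractions below are made up; the category labels paraphrase Cyber Essentials and are not official names):

```python
# Each organisation reports, per category, the fraction of
# essential measures it has implemented (1.0 = fully implemented).
orgs = [
    {"assets": 1.0, "secure": 0.8, "update": 1.0, "backup": 0.5, "respond": 1.0},
    {"assets": 0.6, "secure": 0.4, "update": 1.0, "backup": 1.0, "respond": 0.2},
    {"assets": 1.0, "secure": 1.0, "update": 1.0, "backup": 0.0, "respond": 0.4},
]

# Average share of essential measures implemented across organisations.
avg_adoption = sum(sum(o.values()) / len(o) for o in orgs) / len(orgs)

# Share of organisations that fully implemented at least 3 of 5 categories.
fully_3_of_5 = sum(
    1 for o in orgs if sum(v == 1.0 for v in o.values()) >= 3
) / len(orgs)

print(f"average adoption: {avg_adoption:.0%}")
print(f"orgs with >=3 categories fully implemented: {fully_3_of_5:.0%}")
```

With real survey data, these are the kinds of computations behind the "around 70%" and "only a third" figures cited above.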
A prevalent challenge cited by organisations hindering full adoption was a lack of knowledge and experience, echoed by 59% of businesses and 56% of non-profits. This is compounded by the rapidly evolving cyber threat landscape, exacerbated by a shortage of skilled cyber professionals. Moreover, a prevailing perception of being unlikely targets of cyber-attacks and resource constraints further impedes progress in bolstering defences.
The consequences of inadequate cybersecurity measures were starkly evident, with over 80% of organisations reporting encountering cybersecurity incidents annually, including prevalent threats like ransomware and social engineering scams. These incidents invariably inflicted a negative business impact, with disruptions, data loss, and reputational damage among the most commonly cited consequences.
While the cost of implementing cyber hygiene measures may seem daunting, particularly for small and medium-sized enterprises (SMEs), it pales in comparison to the potential financial ramifications of cyber incidents. CSA emphasises the importance of viewing cybersecurity investment as essential insurance against potentially catastrophic losses.
In response to these challenges, CSA has rolled out a comprehensive suite of initiatives aimed at bolstering organisational cybersecurity resilience. These include cybersecurity resources to raise awareness, tailored health plans delivered by cybersecurity consultants, and certification programmes such as Cyber Essentials and Cyber Trust. Additionally, the collaboration with the Infocomm Media Development Authority has led to the introduction of the Cybersecurity Health Check, providing organisations with a self-assessment tool to benchmark their cyber hygiene and access remedial resources.
Mr. David Koh, Chief Executive of CSA, stressed the imperative for organisations to prioritise cybersecurity and leverage available resources and funding support. Delaying proactive measures until after an incident occurs, he cautioned, would prove significantly more costly in the long run.
The release of the Singapore Cybersecurity Health Report underscores the urgent need for organisations to fortify their cybersecurity posture comprehensively. By embracing a holistic approach to cybersecurity and leveraging available resources and support, organisations can mitigate risks and safeguard against the increasingly sophisticated cyber threats of the digital age.
The Singapore Cybersecurity Health Report 2023 is available at www.csa.gov.sg/cyberhealthreport and the Cybersecurity Health Check can be accessed at https://www.csa.gov.sg/cyberhealthchecktool.
In a remarkable leap forward in the field of neuroscience, researchers at the Indian Institute of Technology, Guwahati (IIT Guwahati) have introduced a groundbreaking algorithm known as the Unique Brain Network Identification Number (UBNIN). This innovative algorithm is poised to revolutionise the analysis of brain connectivity patterns, offering profound insights into both healthy brain function and neurological disorders such as Parkinson’s disease (PD).
The human brain, with its intricate network of neural connections, is a marvel of complexity. Each individual possesses a unique pattern of brain connectivity, akin to a fingerprint of the mind. Recognising the significance of these individualised brain networks, the researchers at IIT Guwahati set out to develop a method capable of decoding and quantifying these intricate patterns.
The UBNIN algorithm represents a paradigm shift in how we understand and analyse brain connectivity. Drawing upon data from structural MRI scans, the algorithm constructs a network model of the brain, with each region of the brain represented as a node. These nodes are interconnected by edges, reflecting the strength of connectivity between different brain regions.
What sets UBNIN apart is its ability to distil this complex network into a single numerical identifier. This unique identifier, akin to a digital signature for the brain, encapsulates the individualised connectivity patterns of each person. By quantifying these patterns into numerical values, UBNIN offers a powerful tool for understanding the structural organisation of the brain.
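The core idea — reducing a subject's connectivity matrix to one number that differs whenever the connectivity pattern differs — can be illustrated with a toy encoding. The sketch below simply thresholds the upper triangle of a symmetric connectivity matrix and reads the bits as an integer; the published UBNIN encoding differs, and this only conveys the concept.

```python
# Toy stand-in for a network-to-number encoding. Each matrix entry
# is an edge weight between two brain regions; we binarise the
# upper triangle and pack it into a single integer identifier.

def toy_network_id(conn, threshold=0.5):
    """Pack the thresholded upper-triangular connectivity into one integer."""
    bits = ""
    n = len(conn)
    for i in range(n):
        for j in range(i + 1, n):
            bits += "1" if conn[i][j] > threshold else "0"
    return int(bits, 2)

# Two hypothetical 3-region "brains" with different connectivity.
brain_a = [[0.0, 0.9, 0.1],
           [0.9, 0.0, 0.7],
           [0.1, 0.7, 0.0]]
brain_b = [[0.0, 0.2, 0.8],
           [0.2, 0.0, 0.7],
           [0.8, 0.7, 0.0]]
print(toy_network_id(brain_a), toy_network_id(brain_b))  # 5 3
```

Distinct connectivity patterns yield distinct identifiers, which is the property that makes a "brain fingerprint" possible; the real algorithm preserves far more information than this binary toy.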
The implications of UBNIN are far-reaching. One potential application lies in the realm of brainprinting, where individual brain signatures could be used for identification purposes. Much like a fingerprint uniquely identifies an individual, UBNIN could serve as a digital identifier for the brain, with applications in personalised medicine, biometrics, and cognitive neuroscience.
Moreover, UBNIN holds promise as a biomarker for neurological disorders such as Parkinson’s disease. Parkinson’s is a progressive neurodegenerative disorder characterised by the loss of dopaminergic neurons in the brain. Early detection of Parkinson’s is crucial for initiating timely interventions and improving patient outcomes. By analysing changes in UBNIN values over time, researchers may be able to identify subtle alterations in brain connectivity associated with the onset and progression of Parkinson’s disease.
To validate the utility of UBNIN as a biomarker for Parkinson’s disease, researchers conducted a comprehensive study involving structural MRI scans from both PD patients and healthy individuals. The results were promising, with UBNIN values exhibiting distinct patterns in PD patients compared to healthy controls. This suggests that UBNIN has the potential to serve as a sensitive and specific biomarker for Parkinson’s disease, offering new avenues for early diagnosis and disease monitoring.
Furthermore, the researchers explored the impact of age on brain connectivity patterns. Aging is associated with changes in brain structure and function, which may contribute to the development of neurological disorders. By analysing structural MRI data from individuals across different age groups, the researchers found that brain connectivity patterns indeed change with age. Specifically, they observed a decrease in the clustering coefficient—a measure of network connectivity—with increasing age. These findings provide valuable insights into the dynamic nature of brain plasticity and aging.
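The clustering coefficient mentioned above measures how often a node's neighbours are themselves connected: C = 2T / (k(k-1)), where k is the node's degree and T the number of linked neighbour pairs. A minimal sketch on an unweighted graph, for illustration only (not the study's code):

```python
def clustering(adj: dict, node) -> float:
    """Local clustering coefficient: fraction of neighbour pairs
    that are directly linked, C = 2T / (k(k-1))."""
    nbrs = list(adj[node])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(
        1
        for i in range(k)
        for j in range(i + 1, k)
        if nbrs[j] in adj[nbrs[i]]
    )
    return 2 * links / (k * (k - 1))

# Node "a" has neighbours b, c, d; only the pair (b, c) is linked,
# so C(a) = 2 * 1 / (3 * 2) = 1/3.
graph = {
    "a": {"b", "c", "d"},
    "b": {"a", "c"},
    "c": {"a", "b"},
    "d": {"a"},
}
print(clustering(graph, "a"))
```

A lower average of this quantity across nodes is what the researchers report observing in older subjects' connectivity networks.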
Dr. Cota Navin Gupta, Assistant Professor at the Neural Engineering Lab, Department of Biosciences and Bioengineering, IIT Guwahati, commented on the significance of these findings. “UBNIN offers a unique window into the structural organisation of the brain,” he remarked. “By quantifying individualised brain connectivity patterns, UBNIN has the potential to transform our understanding of brain function and dysfunction.”
Looking ahead, the researchers envision further applications of UBNIN in diverse fields, ranging from personalised medicine to cognitive neuroscience. By harnessing the power of UBNIN, researchers may unlock the mysteries of the human brain, paving the way for new insights into neurological disorders and brain health.
Bushfires represent one of the most formidable challenges faced by firefighters worldwide. With their unpredictable behaviour and rapid spread, combating these blazes demands innovative solutions to ensure the safety of both responders and communities at risk. In a groundbreaking initiative, researchers are harnessing the power of robotics to revolutionise bushfire response, paving the way for more effective firefighting strategies and enhanced situational awareness.
At the heart of this endeavour lies the Silvanus Project, an ambitious international collaboration aimed at developing ground robots capable of navigating fire fronts and gathering crucial data in real-time. Led by researchers from Data61’s Queensland Centre for Advanced Technologies, this project represents a pioneering effort to address the inherent dangers associated with traditional firefighting methods.
Bushfires, fueled by factors such as vegetation density and weather conditions, can escalate rapidly, outpacing conventional firefighting techniques. To stay ahead of the inferno, firefighters require accurate information about fire location, direction of spread, and potential hazards. However, obtaining such data often entails placing personnel in hazardous environments, risking their safety in the process.
Drones have emerged as a promising tool for aerial reconnaissance, offering valuable insights into fire behaviour from above. However, their effectiveness is limited by factors such as smoke interference, high winds, and restricted flight times. Recognising these limitations, researchers turned their focus to ground-based solutions, envisioning robots capable of operating in the most challenging of conditions.
The ground robots developed as part of the Silvanus Project are equipped with advanced sensors and navigation systems, allowing them to traverse rugged terrain and navigate through smoke and debris. Some robots are designed to move on legs, mimicking the mobility of insects, while others utilise tracks for increased stability and manoeuvrability. These robots venture into the heart of the fire, gathering critical data such as fire intensity, fuel availability, and environmental conditions.
During a demonstration conducted for fire service representatives and researchers, the capabilities of these ground robots were showcased, highlighting their potential to transform firefighting operations. With the ability to transmit data in real-time to a cloud-based platform, these robots provide firefighters with unprecedented situational awareness, enabling more informed decision-making and proactive firefighting strategies.
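The real-time feed described above implies some telemetry record that each robot publishes to the cloud platform. The field names and units below are assumptions chosen for illustration, not the Silvanus Project's actual schema:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class FireTelemetry:
    """One hypothetical reading a ground robot might stream upstream."""
    robot_id: str
    lat: float
    lon: float
    fire_intensity_kw_m: float   # fireline intensity, kW per metre (assumed unit)
    fuel_load_t_ha: float        # estimated fuel load, tonnes per hectare (assumed unit)
    temperature_c: float
    timestamp: float             # seconds since epoch

    def to_json(self) -> str:
        """Serialise one reading for upload to the cloud platform."""
        return json.dumps(asdict(self))

reading = FireTelemetry("tracked-01", -27.47, 152.97, 3500.0, 12.4, 48.0, 1700000000.0)
print(reading.to_json())
```

Streaming structured records like this is what lets a cloud platform fuse readings from many robots into the shared situational picture the article describes.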
Senior experimental scientist Tom Lowe emphasises the significance of these ground robots in enhancing firefighter safety and operational effectiveness. By deploying robots into areas deemed too hazardous for human intervention, firefighters can access vital information without exposing themselves to unnecessary risks. Furthermore, the integration of remote sensing technologies allows robots to assess vegetation density and predict fuel availability, further aiding in fire suppression efforts.
While the technology is still in the developmental stage, researchers are optimistic about its potential impact on future firefighting practices. Navinda Kottege, Cyber-Physical Systems Research Director, underscores the life-saving potential of ground robots, particularly in high-risk firefighting scenarios where human intervention may be impractical or unsafe.
The Silvanus Project represents a collaborative effort involving researchers from across Europe, Australia, Indonesia, and Brazil, united in their mission to develop innovative solutions for forest management and fire prevention. By harnessing the power of robotics and cutting-edge technologies, this initiative aims to bolster preparedness and response capabilities, ultimately saving lives and protecting communities from the devastating impact of bushfires.
As ground robots continue to evolve and mature, fueled by ongoing research and international collaboration, the vision of leveraging technology to mitigate the impact of bushfires grows ever closer to reality. With each technological advancement, firefighters gain new tools and capabilities to confront one of nature’s most formidable adversaries, ensuring a safer and more resilient future for all.
Prime Minister Pham Minh Chinh has called upon the youth to take the lead in propelling Vietnam’s digital transformation, emphasising their crucial role in shaping the nation’s future amidst the burgeoning digital economy. The Prime Minister made these remarks during a dialogue held in Hanoi on March 26, where he engaged with 300 outstanding young individuals from various regions of the country.
With the digital economy projected to contribute 30% to the gross domestic product (GDP) by 2030, PM Chinh underscored the urgency for Vietnam to embrace digitalisation as an indispensable global trend. He highlighted digital transformation as a cornerstone alongside the green economy, circular economy, sharing economy, and intellectual economy, essential for realising the nation’s development objectives of transitioning into a modern, upper-middle-income country by 2030 and achieving developed, high-income status by 2045.
In his address, PM Chinh urged the youth to champion digitalisation by raising public awareness, advocating for policy reforms, driving administrative modernisation through digital technologies, and advancing research and development efforts in the digital domain. Emphasising their role as pioneers, he expressed confidence in the youth’s ability to contribute, innovate, integrate into society, and pursue personal growth.
Moreover, PM Chinh called upon the youth to collaborate with the government in establishing a digital government, digital economy, digital society, and nurturing digital citizenship. This collaborative approach is deemed essential for harnessing the full potential of digital technologies to address societal challenges and drive inclusive growth.
Responding to the Prime Minister’s call, the youth presented recommendations to the government, emphasising the need to refine the legal framework governing digital platforms and establish effective communication channels to engage citizens in the digital transformation process. They stressed the importance of inclusivity and transparency in policymaking to ensure the successful implementation of digital initiatives.
During the dialogue, PM Chinh addressed queries from the youth regarding data protection, cybersecurity measures, integration of public services with the national population database, and strategies to preserve cultural identity in the digital age. Acknowledging these concerns, the Prime Minister reaffirmed the government’s commitment to safeguarding data privacy, enhancing cybersecurity, and promoting cultural heritage preservation in the digital era.
Furthermore, PM Chinh instructed relevant ministries, authorities, and localities to expedite the implementation of the National Digital Transformation Programme, refine existing mechanisms and policies, and create an enabling environment for youth-led initiatives. This concerted effort aims to foster innovation, entrepreneurship, and digital literacy among the younger generation, thereby ensuring their active participation in shaping Vietnam’s digital future.
The dialogue served as a platform for meaningful exchanges between the government and the youth, highlighting the importance of collaborative efforts in driving Vietnam’s digital transformation agenda. With the youth at the forefront, Vietnam is poised to harness the opportunities offered by digital technologies to achieve sustainable development and prosperity for all.
As reported by openGov Asia, Vietnam is undergoing a digital revolution, characterised by concerted endeavours to advance the country’s digital transformation. With aspirations to attain high-income status by 2045, Vietnam’s digital technology sector focuses on mastering technology, fostering innovation, and developing domestic manufacturing capacities.
In this context, inclusivity and collaboration are essential, serving as key drivers to unleash the transformative power of technology and foster economic expansion, ensuring broad societal participation and contribution to the nation’s advancement.
Artificial Intelligence (AI) stands at the forefront of technological innovation, promising transformative solutions to complex challenges across various domains. Recognising its potential to revolutionise industries and improve societal well-being, the National University of Singapore (NUS) has inaugurated the NUS AI Institute (NAII). Led by Professor Mohan Kankanhalli, NAII aims to accelerate AI research and its practical applications, fostering collaboration, innovation, and societal impact.
In an era marked by rapid technological advancements, AI has emerged as a powerful tool with the capacity to reshape diverse sectors, ranging from healthcare to finance, education, logistics, and beyond. The establishment of NAII underscores NUS’s commitment to harnessing AI for the greater good, addressing critical issues facing Singapore and the global community.
At the core of NAII’s mission is the advancement of fundamental AI research, aimed at pushing the boundaries of AI capabilities and exploring novel applications across various domains. Through foundational research initiatives, scientists at NAII will tackle complex AI problems, spanning hardware and software systems, AI theory, responsible AI, reasoning AI, and resource-efficient AI. By delving into these areas, the institute seeks to develop cutting-edge AI technologies that address real-world challenges and drive innovation.
Moreover, NAII will prioritise research into the ethical and societal implications of AI, aiming to develop robust governance frameworks that ensure responsible AI development and deployment. This includes examining issues related to transparency, accountability, and ethical decision-making in AI systems. By fostering dialogue and research on AI ethics and governance, NAII aims to guide the responsible use of AI technology and mitigate potential risks.
In addition to foundational research, NAII will spearhead applied research initiatives, focusing on developing AI-driven solutions for specific application domains. Collaborating with experts from diverse fields, including healthcare, logistics, manufacturing, finance, urban sustainability, and education, the institute will tackle pressing challenges and explore opportunities for AI-driven innovation. From optimising supply chains to improving healthcare outcomes and enhancing urban infrastructure, NAII’s applied research efforts aim to deliver tangible benefits to society.
Furthermore, NAII will serve as a hub for AI talent development, providing comprehensive education and training programs for students, professionals, and policymakers. By offering hands-on learning experiences and internships, the institute seeks to nurture the next generation of AI leaders and entrepreneurs, equipping them with the skills and knowledge needed to drive innovation in AI.
To support its research and educational endeavours, NUS has allocated significant resources to NAII, including external research grants and institutional funding. Moreover, the institute will collaborate closely with government agencies and industry partners to amplify its impact and drive innovation. Strategic partnerships with leading companies such as IBM and Google Cloud will enable NAII to leverage industry expertise and resources, accelerating the translation of research outcomes into real-world applications.
In alignment with Singapore’s Research, Innovation, and Enterprise (RIE) strategy, NAII aims to contribute to the nation’s AI ecosystem by fostering collaboration, innovation, and talent development. By positioning NUS as a global leader in AI research and application, the institute seeks to drive positive societal change and economic growth.
The establishment of NAII represents a significant milestone in NUS’s journey towards harnessing the power of AI for societal benefit. Through cutting-edge research, education, and collaboration, the institute aims to unlock the full potential of AI and pave the way for a more innovative, sustainable, and inclusive future. With its interdisciplinary approach and commitment to excellence, NAII is poised to make a lasting impact on Singapore and the global AI landscape.
The Vietnam Posts and Telecommunications Group (VNPT) has reached a significant milestone with its artificial intelligence (AI) platform, VNPT eKYC, logging over 1 billion user authentication requests. This accomplishment solidifies VNPT’s position as a pioneer in electronic identification and verification solutions within Vietnam.
Since its inception, VNPT eKYC has been at the forefront of electronic Know Your Customer (eKYC) services for over five years, serving a diverse range of clients including banks, financial institutions, telecommunications companies, and e-commerce entities. With over 100 organisations utilising its services, VNPT eKYC has facilitated electronic identification for more than 40 million individuals across the country.
On average, the VNPT eKYC system processes an impressive 600,000 requests daily, with peak days witnessing over a million requests being handled seamlessly. This demonstrates the platform’s robustness and reliability in managing high volumes of authentication transactions efficiently.
The significance of VNPT eKYC extends beyond its technological capabilities, particularly in the context of evolving regulatory requirements. The State Bank of Vietnam’s decision mandating biometric authentication for transactions exceeding 10 million VND (approximately 416 USD) and other significant transactions from July 1, 2024, underscores the critical role of advanced authentication solutions like VNPT eKYC in ensuring compliance and security in financial transactions.
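The regulatory rule above is simple to express as a threshold check. The sketch below is purely illustrative — the function name and structure are assumptions, not the State Bank of Vietnam's actual specification — and reads "exceeding" as a strict comparison:

```python
# Step-up rule: transactions above 10,000,000 VND require biometric
# authentication (per the SBV mandate effective July 1, 2024).
BIOMETRIC_THRESHOLD_VND = 10_000_000

def requires_biometric_auth(amount_vnd: int) -> bool:
    """Return True when a transfer must be biometrically verified."""
    return amount_vnd > BIOMETRIC_THRESHOLD_VND
```

In practice the mandate also covers other significant transaction types, so a production system would key the check on transaction category as well as amount.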
Moreover, the platform’s success highlights the increasing importance of domestically developed solutions in the banking and financial sector. Domestic solutions such as VNPT eKYC offer several advantages, including rapid implementation, cost-effectiveness, adherence to global technology standards, scalability, and high readiness to meet evolving regulatory requirements.
Central to the effectiveness of VNPT eKYC is its advanced AI models, which enable the verification of facial biometric data with an impressive accuracy rate of up to 99.99%. This high level of accuracy not only enhances the security of authentication processes but also contributes to building trust and confidence among users and regulatory authorities.
As Vietnam’s digital economy continues to grow and evolve, the role of advanced authentication and verification solutions like VNPT eKYC becomes increasingly indispensable. Beyond facilitating seamless and secure electronic transactions, these solutions contribute to enhancing the overall digital infrastructure and ecosystem of the country, paving the way for further innovation and economic growth.
Looking ahead, VNPT remains committed to advancing its AI platform and expanding its capabilities to meet the evolving needs of its clients and the regulatory landscape. With a strong focus on innovation, reliability, and security, VNPT eKYC is poised to play a pivotal role in shaping the future of electronic identification and verification in Vietnam’s dynamic digital economy.
VNPT’s achievement of logging over 1 billion authentication requests with its AI platform, VNPT eKYC, marks a significant milestone in Vietnam’s journey towards digital transformation.
Amid a swiftly changing global landscape, Vietnam emerges as a frontrunner in a digital revolution, strategically positioned to harness technology’s transformative power for economic progress and societal development.
Vietnam is embracing its digital transformation journey, with collaborative efforts driving the national agenda. The country’s digital technology industry aims to propel Vietnam towards high-income status by 2045 through technology mastery, innovation, and indigenous manufacturing capabilities.
Moreover, the nation is working to harmonise its regulations, streamline laws, and promote consistency in its legal framework to foster a more favourable and appealing cyber environment.
As the country continues to embrace technology-driven solutions to address emerging challenges, VNPT eKYC stands as a testament to the potential of domestic innovation in driving progress and excellence in the digital era.
In a significant scientific breakthrough in the space sector, Dr Sarah Kessans has developed hardware designed to operate autonomously in orbit, transforming the study of protein crystallisation in microgravity. This technology provides scientists on Earth with unprecedented insights into protein behaviour, with far-reaching implications for developing more effective medicines and vaccines, among other applications.
Minister for Space Judith Collins lauds Dr Kessans’ research as an inspiring example of how space technology can drive innovation on Earth. This achievement follows the recent successful launch of MethaneSAT, a satellite designed to track and monitor global methane emissions from space, highlighting the significant potential of space technology in addressing some of the world’s most pressing challenges while bolstering New Zealand’s globally competitive space sector.
The MethaneSAT satellite is equipped with a highly sensitive spectrometer that can detect methane concentrations as low as two parts per billion. Its high spatial resolution, coupled with a broad, 200-kilometre view path, allows it to quantify even small emission sources over large areas.
Dr Kessans’ research culminated in successfully launching her hardware on a rocket from the Kennedy Space Centre at Cape Canaveral, USA. This mission also included protein experiments from leading New Zealand universities, including Canterbury, Otago, Victoria, and Waikato, showcasing the collaborative efforts of the country’s academic institutions in advancing space science and technology.
The launch of Dr Kessans’ project results from a strategic agreement between the Ministry of Business, Innovation and Employment (MBIE) and the US commercial space company Axiom Space. This partnership aims to facilitate New Zealand researchers’ advancement in space science and technology, fostering innovation and driving collaboration between academia, government, and private enterprise.
Dr Kessans’ project has also received government funding for further development through the MBIE-administered Endeavour Fund, highlighting the government’s commitment to supporting cutting-edge research and innovation in the space sector. This collaborative effort between academia, government, and private enterprise is a testament to New Zealand’s growing presence in the global space economy, positioning the country as a key player in space research and technology development.
Previously, New Zealand had collaborated with several countries, including Australia, to advance space research, as reported by OpenGov. The collaboration between SmartSat and the New Zealand Space Agency (NZSA) is an important development. The signing of a Memorandum of Understanding (MoU) between the two entities aims to accelerate the growth and technological advancement of the Australian and New Zealand space industries, marking a pivotal moment in the evolution of space exploration and innovation in the Australasian region.
This partnership is underpinned by a shared commitment to fostering innovation, driving research and development (R&D), and nurturing a skilled workforce capable of propelling technological breakthroughs in the space sector. The MoU, ceremoniously signed at the NZSA headquarters in Wellington, signifies a strategic alignment between SmartSat and NZSA to leverage their combined resources and expertise.
At its core, the collaboration is designed to support joint research initiatives in three key technological domains: Earth Observation, Space Situational Awareness, and Optical Communications. These areas represent the forefront of space exploration, offering immense potential to revolutionise humanity’s perception of, and interaction with, the cosmos.
Minister Judith Collins, New Zealand’s Minister for Space, praised the new agreement as a testament to the enduring collaboration between Australia and New Zealand in space exploration. In a statement on her official website, she reiterated her commitment to fostering innovation and collaboration, recognising the transformative potential of space technology in addressing global challenges.
Minister Collins reaffirmed the government’s dedication to developing the country’s space sector, promoting innovation, and strengthening partnerships with the New Zealand research community, international space agencies, and commercial collaborators. These collaborative approaches underscore New Zealand’s commitment to advancing space science and technology to benefit society and the economy, paving the way for future breakthroughs in the field.