
AI & IT

Armed conflicts can be prevented or reduced using Early Warning tools to analyze, track, and forecast fragility and conflict

In 2017, deadly crisis zones were rightly in the news: the crisis in Yemen, in which Saudi Arabia used equipment provided by the US and UK to bomb noncombatants and blockade supplies, has seen the civilian death toll climb above 5,000; civil conflict still rages in Afghanistan and Nigeria; …

Read More »

US Army’s IT modernization strategy includes software-defined networking and “self-healing” networks

The network enables the mission command warfighting function, allowing leaders to understand, visualize, describe, direct, lead, and assess in order to accomplish Unified Land Operations. The US Army's vision is for "a network that is secure, integrated, and standards-based, which ensures uninterrupted global access and enables collaboration and decisive action throughout all operational phases …

Read More »

DoD’s High Performance Computing Modernization Program to accelerate the development and acquisition of advanced military capabilities

Earlier in 2018, Hewlett Packard Enterprise (HPE) announced that it had been awarded a $57m contract from the US Department of Defense (DoD) to provide supercomputers. As supercomputing has become an ever bigger part of the toolset of the department’s scientists and engineers innovating around the most …

Read More »

Next-generation AI and robots can reproduce and replicate themselves

Artificial intelligence expert George Zarkadakis believes robots could have sex with each other to evolve and produce superior offspring, and this scary new world could be closer than we might imagine. He predicts that humans could even breed with machines to create new hybrid species. Mr Zarkadakis said robots that …

Read More »

China develops chip that enables two-dimensional quantum walks, which offer exponential speedup in quantum search and quantum simulation applications

Quantum walks are the quantum version of classical random walks, which are a mathematical means of describing a natural random walk, e.g., simply wandering around at random. In a “classical random walk”, you could imagine someone starting at the centre of a city and making a random decision at each …

Read More »
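The classical random walk described in the excerpt above can be sketched in a few lines of Python. This is an illustrative sketch, not code from the article: it simulates a one-dimensional walker who moves left or right with equal probability at each step, the simplest version of "making a random decision at each" point.

```python
import random

def random_walk_1d(steps, seed=None):
    """Simulate a classical 1-D random walk: at each step the walker
    moves one unit left (-1) or right (+1) with equal probability,
    and we return the final position."""
    rng = random.Random(seed)
    position = 0
    for _ in range(steps):
        position += rng.choice((-1, 1))
    return position

# A classical walker's typical distance from the start grows like
# sqrt(steps); the mean squared displacement is close to the step count.
# (A quantum walk spreads linearly in the step count instead, which is
# the source of the speedup the article refers to.)
finals = [random_walk_1d(100, seed=s) for s in range(1000)]
mean_sq = sum(p * p for p in finals) / len(finals)
print(round(mean_sq))  # close to 100, the number of steps
```

The contrast with the quantum walk is the point: the quantum version spreads ballistically rather than diffusively, which underlies the search and simulation speedups mentioned in the article.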

DARPA develops Testbed to test space warfare strategies integrated with air, cyber, land, and maritime domains

As the space domain has become more congested and militarized, the potential for intentional and unintentional threats to space system assets has increased. To mitigate these threats, the Department of Defense (DOD) has undertaken a variety of initiatives to enhance its network of sensors and systems to provide space situational awareness …

Read More »

Datacenter-on-chip technology that compacts a huge data center onto a single chip will play a big role in Big Data analytics for the US Army

The technology industry is entering a new era of computing that requires IT systems and cloud computing services to process and analyze huge volumes of Big Data in real time. Current data centers (DCs) and high-performance computing clusters are dominated by power, thermal, and area constraints. Data centers …

Read More »

Cambridge Pixel’s multi-sensor surveillance system is useful for naval, air traffic control, commercial shipping, security, surveillance, and airborne radar applications

Countering the growing threat posed by terrorists, smugglers, pirates, and political activists at military air and naval bases, airports, and ports requires sophisticated multi-sensor surveillance systems to ensure that incursions are rapidly detected and acted upon. Cambridge Pixel, a UK-based developer of radar display and tracking subsystems, has …

Read More »

Next-generation hard disk drives with higher recording densities and lower cost per terabyte for enterprise datacenters and video surveillance systems

The explosion of connected devices and digital services is generating massive amounts of new data. The digital world is growing exponentially, from 4.4 zettabytes (a zettabyte is 10^21, or 1 sextillion, bytes) of digital data created in 2013 to an expected 44 zettabytes by 2024. Digital information can be stored in different types of device depending on the …

Read More »
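The growth figures quoted in the excerpt above imply a steep compound annual growth rate. A quick back-of-the-envelope calculation (illustrative only; the 4.4 ZB and 44 ZB endpoints are taken from the article):

```python
# Tenfold growth in data volume: 4.4 ZB created in 2013
# to an expected 44 ZB by 2024.
start_zb, end_zb = 4.4, 44.0
years = 2024 - 2013  # 11-year span

# Compound annual growth rate: (end/start)^(1/years) - 1
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 23% growth per year
```

A tenfold increase over eleven years works out to data volumes growing by roughly a quarter every year, which is what drives the push for higher recording densities and lower cost per terabyte described in the article.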

What will the cyber future look like? AI, persistent surveillance, the war of data, the Internet of Emotions, and human hacking

Cyber attacks are continuously increasing in number, becoming more varied, more sophisticated, and more impactful. What will cybersecurity look like 10 years from now? According to Gil Shwed, founder and CEO of Check Point Software Technologies Ltd., the future of cybersecurity is tightly connected to the future of information technology and …

Read More »