
AI & IT

Department of Defense develops cloud-based biosurveillance ecosystem to warn of coming pandemics

The threats posed by chemical, biological, radiological, nuclear and explosive (CBRNE) hazards continue to advance. CBRN weapons are among the most indiscriminate and deadly weapons in existence today, capable of affecting large populations across a wide geographical area in a short time. The release of chemical, biological, radiological and nuclear …

Read More »

Enhanced Long-Range Navigation (eLORAN) will complement GPS in a Navigation Warfare environment

The world’s shipping industry is experiencing strong growth, which is expected to continue. Ships are getting larger and faster, sea-lanes are becoming more crowded, and crews are increasingly relying on electronic navigation systems to operate in this environment. The newly proposed concept of e-Navigation will improve safety, security, and …

Read More »

DARPA’s LwLL developing more efficient machine learning by massively reducing the amount of labeled data needed to train accurate models

Deep learning is a type of machine learning in which a model learns to perform classification tasks directly from images, text, or sound. Deep learning is usually implemented using a neural network architecture. Deep learning architectures such as deep neural networks, deep belief networks, recurrent neural networks and convolutional neural …
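
As a rough illustration of the idea in the teaser above, the sketch below trains a small supervised image classifier end to end from pixel data. It is only a minimal example assuming PyTorch; the architecture and the randomly generated "labeled" images are illustrative stand-ins and are not taken from the LwLL article. LwLL's stated aim is to reach comparable accuracy while massively reducing how many such labeled examples are needed.

import torch
import torch.nn as nn
# Tiny CNN: convolution -> pooling -> linear classifier head.
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),  # learn local image features
    nn.ReLU(),
    nn.MaxPool2d(2),                            # downsample 28x28 -> 14x14
    nn.Flatten(),
    nn.Linear(8 * 14 * 14, 10),                 # map features to 10 class scores
)
# Stand-in "labeled data": 64 random 28x28 grayscale images with random labels.
images = torch.randn(64, 1, 28, 28)
labels = torch.randint(0, 10, (64,))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)  # compare predictions to labels
    loss.backward()                        # backpropagate the error
    optimizer.step()                       # update the network weights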

Read More »

DARPA AI Next pioneering AI technologies which are explainable, secure, resilient and capable of common-sense reasoning

Traditionally, we have designed machines to handle well-defined, high-volume or high-speed tasks, freeing humans to focus on problems of ever-increasing complexity. In the 1950s and 1960s, early computers were automating tedious or laborious tasks. It was during this era that scientists realized it was possible to simulate human intelligence and …

Read More »

DARPA’s GAILA developing AI for military robots to acquire language the way children learn from their parents

The past few decades have seen explosive growth in development and training of AI systems, which are now embodied in digital computing processes spanning several key industries. One area that has benefited from AI, and specifically Machine Learning (ML) techniques and statistical methods, is the area of Human Language Technology …

Read More »

Supercomputers can assist in cyber security by identifying threats, detecting anomalous behaviour and finding software vulnerabilities

Supercomputers have become essential for national security: decoding encrypted messages, simulating complex ballistics models, nuclear weapon detonations and other WMD effects, developing new kinds of stealth technology, and running cyber defence/attack simulations. Because of the expense, supercomputers are typically used for the most intensive calculations, like predicting climate change, or …

Read More »

DARPA RSPACE: autonomous and resilient command and control for air mission planning in a contested environment

Air Force officers in charge of creating air tasking orders have long developed mission plans at air operations centers, known as AOCs, the centralized hubs of a specific command. In future conflicts, U.S. forces may face degradation or denial of the critical communications capabilities essential for coordination and shared situational understanding. …

Read More »

Memcomputing to accelerate deep learning and Space-based Intelligence, Surveillance, and Reconnaissance

By 2020, there are expected to be more than 200 billion interconnected devices within the Internet of Things framework – these will generate an incredible amount of data that will need processing. Traditionally, the processing of data in electronics has relied on integrated circuits (chips) featuring vast numbers of transistors …

Read More »

Growth of Internet of Things and wearables leading to rising demand for flexible batteries to power them

Battery usage has expanded from mobile phones and laptops to LED lamps, portable fans, toys, toothbrushes, and even automobiles. Now battery applications are expanding further as the Internet of Things, Industry 4.0, big data, mobile and cloud computing are introduced. People build systems to obtain, manage and utilise data …

Read More »

DARPA Fast Network Interface Cards (FastNICs) to enable applications from exascale supercomputing to distributed machine learning

Computing performance has steadily increased along the trajectory set by Moore’s Law, and networking performance has accelerated at a similar rate. Despite these connected evolutions in network and server technology, however, the network stack, starting with the network interface card (NIC) – the hardware that bridges the network/server boundary …

Read More »