Long-Range Transformers for Dynamic Spatiotemporal Forecasting

Multivariate Time Series Forecasting focuses on the prediction of future values based on historical context. State-of-the-art sequence-to-sequence models rely on neural attention between timesteps, which allows for temporal learning but fails to consider distinct spatial relationships between variables. In contrast, methods based on graph neural networks explicitly model variable relationships. However, these methods often rely on predefined graphs and perform separate spatial and temporal updates without establishing direct connections between each variable at every timestep. This paper addresses these problems by translating multivariate forecasting into a spatiotemporal sequence formulation where each Transformer input token represents the value of a single variable at a given time. Long-Range Transformers can then learn interactions between space, time, and value information jointly along this extended sequence. Our method, which we call Spacetimeformer, achieves competitive results on benchmarks from traffic forecasting to electricity demand and weather prediction while learning fully-connected spatiotemporal relationships purely from data.

Full article: https://towardsdatascience.com/multivariate-time-series-forecasting-with-transformers-384dc6ce989b
PDF: https://arxiv.org/pdf/2109.12218.pdf
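
To make the token construction concrete, here is a minimal sketch of the "one token per (timestep, variable)" idea described in the abstract. It is not the authors' Spacetimeformer implementation; the shapes, embedding sizes, and helper names (flatten_to_tokens, value_proj, etc.) are illustrative assumptions.

```python
# Minimal sketch of the spatiotemporal tokenization idea: each token carries the
# value of a single variable at a single timestep, plus learned time/variable
# embeddings. Illustrative only -- not the authors' Spacetimeformer code.
import torch
import torch.nn as nn

T, N, d_model = 12, 3, 64                        # timesteps, variables, embedding size

def flatten_to_tokens(x):
    """x: (batch, T, N) series -> per-token values, time indices, variable indices."""
    batch = x.shape[0]
    values = x.reshape(batch, T * N)                   # one scalar value per token
    time_idx = torch.arange(T).repeat_interleave(N)    # 0,0,0,1,1,1,...
    var_idx = torch.arange(N).repeat(T)                # 0,1,2,0,1,2,...
    return values, time_idx, var_idx

value_proj = nn.Linear(1, d_model)               # embed the scalar value
time_emb = nn.Embedding(T, d_model)              # embed the timestep index
var_emb = nn.Embedding(N, d_model)               # embed the variable id

x = torch.randn(8, T, N)                         # toy batch of multivariate series
values, time_idx, var_idx = flatten_to_tokens(x)
tokens = (value_proj(values.unsqueeze(-1))       # (batch, T*N, d_model)
          + time_emb(time_idx) + var_emb(var_idx))

# A standard Transformer encoder can now attend across space and time jointly,
# since every (timestep, variable) pair is its own token in one long sequence.
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True),
    num_layers=2)
out = encoder(tokens)                            # (batch, T*N, d_model)
```

Note that the sequence length grows from T to T × N, which is why this formulation calls for Transformers that can handle long sequences efficiently.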

Biologically-inspired Neural Networks for Self-Driving Cars

By imitating the nematode’s nervous system to process information efficiently, this new intelligent system is more robust, more interpretable, and faster to train than current deep neural network architectures with millions of parameters.

Deep Neural Networks And Other Approaches

Researchers are always looking for new ways to build intelligent models. We all know that really deep supervised models work great when we have enough data to train them, but one of the hardest things to do is to generalize well and to do it efficiently. We can always go deeper, but that comes at a high computational cost. So, as you may already be thinking, there must be another way to make machines intelligent, one that needs less data or at least fewer layers in our networks.

One of the most complicated tasks that machine learning researchers and engineers are currently working on is self-driving cars. This is a task where every scenario needs to be covered and handled reliably before such a system can be deployed on our roads. Training a self-driving car typically requires many examples from real human drivers, as well as a very deep neural network able to understand these data and reproduce human behavior in any situation…
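
As a point of reference for the standard approach described above, here is a minimal behavioral-cloning sketch: a deep network trained to imitate recorded human steering commands. It is not the biologically-inspired architecture from the article; the network, tensor shapes, and data are purely illustrative placeholders.

```python
# Generic behavioral-cloning sketch: learn to imitate recorded human driving.
# NOT the biologically-inspired network from the article -- everything here
# (shapes, layers, toy data) is an illustrative assumption.
import torch
import torch.nn as nn

# Toy stand-in for a dataset of (camera frame, human steering command) pairs.
images = torch.randn(64, 3, 64, 64)          # recorded front-camera frames
steering = torch.randn(64, 1)                # steering angles from the human driver

policy = nn.Sequential(                      # small conv net mapping image -> steering
    nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 64), nn.ReLU(),
    nn.Linear(64, 1),
)

opt = torch.optim.Adam(policy.parameters(), lr=1e-3)
for epoch in range(5):
    pred = policy(images)
    loss = nn.functional.mse_loss(pred, steering)   # imitate the human commands
    opt.zero_grad()
    loss.backward()
    opt.step()
```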

Read more: https://www.louisbouchard.ai/mit-biologically-inspired-neural-networks-for-self-driving-cars/

Unity Machine Learning and AI

At Unity, we aim to maximize the transformative impact of Machine Learning for researchers and developers alike. Our Machine Learning tools, combined with the Unity platform, promote innovation. To further strengthen the Machine Learning community, we provide a forum where researchers and developers can exchange information, share projects, and support one another to advance the field.

Learn what Unity is up to in the area of Machine Learning, including responsive and intelligent virtual players.

Full article:  https://unity3d.com/machine-learning

Biological learning curves outperform existing ones in artificial intelligence algorithms

Recently, deep learning algorithms have outperformed human experts in various tasks across several domains; however, their characteristics are distant from current knowledge of neuroscience. The simulation results of the biological learning algorithms presented herein outperform state-of-the-art optimal learning curves in supervised learning of feedforward networks. The biological learning algorithms comprise asynchronous input signals with decaying input summation, weight adaptation, and multiple outputs for an input signal. In particular, the generalization error for such biological perceptrons decreases rapidly with an increasing number of examples, and it is independent of the size of the input. This is achieved using either synaptic learning or solely dendritic adaptation with a mechanism of swinging between reflecting boundaries, without learning steps. The proposed biological learning algorithms outperform the optimal scaling of the learning curve in a traditional perceptron. They also yield considerable robustness to disparity between the weights of two networks with very similar outputs in biological supervised learning scenarios. The simulation results indicate the potency of neurobiological mechanisms and open opportunities for developing a superior class of deep learning algorithms.
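
To give a flavor of one ingredient mentioned in the abstract, the toy sketch below confines a perceptron's weights between reflecting boundaries, bouncing an update back when it overshoots. It is only a loose illustration, not the paper's algorithm; the learning rate, boundary values, and reflect helper are assumptions made for the example.

```python
# Toy illustration of reflecting boundaries during perceptron-style learning.
# This is NOT the paper's algorithm -- the update rule and parameters are
# simplified assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_examples, lr = 20, 500, 0.05
lo, hi = -1.0, 1.0                          # reflecting boundaries for each weight

def reflect(w, lo, hi):
    """Bounce weights back into [lo, hi] instead of clipping them."""
    w = np.where(w > hi, 2 * hi - w, w)
    w = np.where(w < lo, 2 * lo - w, w)
    return w

teacher = rng.normal(size=n_inputs)         # target weights defining the labels
w = rng.uniform(lo, hi, size=n_inputs)      # student weights

for _ in range(n_examples):
    x = rng.normal(size=n_inputs)
    y = np.sign(teacher @ x)                # label from the teacher perceptron
    if np.sign(w @ x) != y:                 # classic perceptron-style update...
        w = reflect(w + lr * y * x, lo, hi) # ...followed by boundary reflection

# Report how well the student aligns with the teacher direction.
overlap = (w @ teacher) / (np.linalg.norm(w) * np.linalg.norm(teacher))
print("teacher-student alignment:", overlap)
```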


Read the full article: https://www.nature.com/articles/s41598-019-48016-4