In this video I combine all the assembly language functions from the previous videos into a functional (yet very simple) artificial neural network. It has a single layer of artificial neurons using the logistic (sigmoid) activation function. I implement the forward pass, the loss computation (Mean Squared Error, MSE), and the backward pass of the backpropagation algorithm. Data is stored in vectors of 32-bit floating-point numbers, and the arithmetic uses x87 floating-point instructions. The code is written in x86-64 assembly language, and the test program uses Linux SYSCALLs to write to the console.
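As background for the forward pass, the sigmoid can be evaluated with x87 instructions roughly as below. This is a minimal NASM-style sketch of the standard 2^x decomposition, not the exact routine from the repository:

; sigmoid(x) = 1 / (1 + e^(-x)), x in st0 on entry, result in st0 on exit
sigmoid:
    fchs            ; st0 = -x
    fldl2e          ; st0 = log2(e), st1 = -x
    fmulp           ; st0 = -x * log2(e), so e^(-x) = 2^st0
    fld st0         ; duplicate the exponent
    frndint         ; st0 = n (integer part), st1 = full exponent
    fsub st1, st0   ; st1 = fractional part f, |f| <= 0.5
    fxch            ; st0 = f, st1 = n
    f2xm1           ; st0 = 2^f - 1 (argument must be in [-1, 1])
    fld1
    faddp           ; st0 = 2^f
    fscale          ; st0 = 2^f * 2^n = e^(-x), st1 = n
    fstp st1        ; drop n, keep e^(-x) in st0
    fld1
    faddp           ; st0 = 1 + e^(-x)
    fld1
    fdivrp          ; st0 = 1 / (1 + e^(-x))
    ret

In standard notation, the backward pass for a single sigmoid layer trained with MSE then reduces (up to the MSE scaling factor) to the vector operations from the videos listed below: multiply the output error (y - target) element-wise by the sigmoid derivative y*(1 - y) (Hadamard product), take the outer product of that error vector with the input vector to get the weight gradients, and multiply the transposed weight matrix by the error vector to propagate it backwards.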
The code is here: https://github.com/ComputingMongoose/...
Previous videos of interest:
Blueprint for an Artificial Neural Network entirely in Assembly Language:
• Blueprint for an Artificial Neural Ne...
Vector operations videos (initialization, copy, multiplication, sigmoid, sigmoid derivative, scalar product, Hadamard product, dot product, outer product, addition, subtraction):
• Vectors in assembly language PART1
• Vectors in assembly language PART2
• Vectors in assembly language PART3 (c...
• Vectors PART4 (addition, subtraction)
• Vectors outer product in assembly lan...
Matrix operations:
• Matrix operations in assembly language
Mean Squared Error (MSE):
• Mean squared error (MSE) in assembly ...
Writing to the Linux console using SYSCALL:
• Writing to Linux console in 64bit ass...
My Assembly Language playlist:
• x86 Assembly Language
Chapters:
00:00 Introduction
00:50 Input, Output, Internal data
01:40 Forward pass
02:46 Loss
03:05 Backward pass
05:00 Artificial Neural Network code with functions for layerForward, outputLoss, layerBackward
15:15 Main training code and training data
25:55 Bash build script for compilation
26:44 Running the neural network in training mode
27:23 Analysis of information displayed during network training
#assemblylanguage #ai #artificialintelligence #linux #neuralnetworks #artificialneuralnetwork #artificialneuralnetworks #x86 #64bits #8087