Synaptic: A JavaScript Neural Network Library

Synaptic is a JavaScript neural network library for node.js and the browser. Its generalized algorithm is architecture-free, so you can build and train basically any type of first-order or even second-order neural network architecture.

This library includes a few built-in architectures like multilayer perceptrons, multilayer long short-term memory networks (LSTM), liquid state machines, and Hopfield networks, as well as a trainer capable of training any given network. The trainer ships with built-in training tasks/tests like solving an XOR, completing a Distracted Sequence Recall task, or an Embedded Reber Grammar test, so you can easily test and compare the performance of different architectures.
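
For example, because the built-in tasks work on any network, you can run the same XOR task against two different built-in architectures and compare the returned statistics. Below is a minimal sketch assuming the Architect.Perceptron and Architect.LSTM constructors mentioned later in this post; the layer sizes are arbitrary.

// a minimal sketch: compare two built-in architectures on the same task
var perceptron = new Architect.Perceptron(2, 3, 1); // 2 inputs, 3 hidden units, 1 output
var lstm = new Architect.LSTM(2, 4, 1);             // 2 inputs, 4 memory blocks, 1 output

var perceptronResult = new Trainer(perceptron).XOR(); // { error, iterations, time }
var lstmResult = new Trainer(lstm).XOR();

console.log(perceptronResult, lstmResult); // compare error, iterations and training time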

The algorithm implemented by this library has been taken from Derek D. Monner's paper:

A generalized LSTM-like training algorithm for second-order recurrent neural networks

References to the equations in that paper appear as comments throughout the source code.

Introduction

If you have no prior knowledge about Neural Networks, you should start by reading this guide.

Demos

Getting started

Overview

Installation

In node

You can install synaptic with npm:

npm install synaptic --save

In the browser

Just include the file synaptic.js from the /dist directory with a script tag in your HTML:

<script src="synaptic.js"></script>
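
The comment in the usage snippet below implies that the bundle exposes a global synaptic object in the browser. Here is a minimal sketch based on that assumption, with an illustrative file path:

<!-- a minimal sketch; adjust the path to wherever you copied the bundle -->
<script src="synaptic.js"></script>
<script>
  // in the browser there is no require(); read the classes from the global instead
  var Neuron = synaptic.Neuron;
  var Layer = synaptic.Layer;
  var Network = synaptic.Network;
  var Trainer = synaptic.Trainer;
  var Architect = synaptic.Architect;

  console.log(typeof Architect.Perceptron); // "function" if the library loaded correctly
</script>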

Usage

var synaptic = require('synaptic'); // this line is not needed in the browser

var Neuron = synaptic.Neuron,
    Layer = synaptic.Layer,
    Network = synaptic.Network,
    Trainer = synaptic.Trainer,
    Architect = synaptic.Architect;

Now you can start to create networks, train them, or use built-in networks from the Architect.
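
For instance, instead of using a built-in architecture you can wire Layers into a Network by hand and train it with activate()/propagate(). A minimal sketch of that manual route; the learning rate and iteration count are arbitrary choices:

// build a small 2-3-1 network by hand
var inputLayer = new Layer(2);
var hiddenLayer = new Layer(3);
var outputLayer = new Layer(1);

inputLayer.project(hiddenLayer);
hiddenLayer.project(outputLayer);

var myNetwork = new Network({
    input: inputLayer,
    hidden: [hiddenLayer],
    output: outputLayer
});

// teach it XOR: activate on an input, then back-propagate the expected output
var learningRate = .3;
for (var i = 0; i < 20000; i++) {
    myNetwork.activate([0,0]); myNetwork.propagate(learningRate, [0]);
    myNetwork.activate([0,1]); myNetwork.propagate(learningRate, [1]);
    myNetwork.activate([1,0]); myNetwork.propagate(learningRate, [1]);
    myNetwork.activate([1,1]); myNetwork.propagate(learningRate, [0]);
}

console.log(myNetwork.activate([0,1])); // should be close to 1 after training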

Gulp Tasks

  • gulp or gulp build: builds the source code from /src into the /dist directory (bundled and minified).
  • gulp debug: builds the source code from /src into the /dist directory (not minified and with source maps for debugging).
  • gulp dev: same as debug but it watches for changes in the source files and rebuilds when any change is detected.
  • gulp test: runs all the tests.

Examples

Perceptron

This is how you can create a simple perceptron:

function Perceptron(input, hidden, output)
{
    // create the layers
    var inputLayer = new Layer(input);
    var hiddenLayer = new Layer(hidden);
    var outputLayer = new Layer(output);

    // connect the layers
    inputLayer.project(hiddenLayer);
    hiddenLayer.project(outputLayer);

    // set the layers
    this.set({
        input: inputLayer,
        hidden: [hiddenLayer],
        output: outputLayer
    });
}

// extend the prototype chain
Perceptron.prototype = new Network();
Perceptron.prototype.constructor = Perceptron;

Now you can test your new network by creating a trainer and teaching the perceptron to learn XOR:

var myPerceptron = new Perceptron(2,3,1);
var myTrainer = new Trainer(myPerceptron);

myTrainer.XOR(); // { error: 0.004998819355993572, iterations: 21871, time: 356 }

myPerceptron.activate([0,0]); // 0.0268581547421616
myPerceptron.activate([1,0]); // 0.9829673642853368
myPerceptron.activate([0,1]); // 0.9831714267395621
myPerceptron.activate([1,1]); // 0.02128894618097928
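
The Trainer is not limited to the built-in tasks; it can also train on your own data set. A minimal sketch follows, where the option values are arbitrary (the options object can be omitted entirely):

var trainingSet = [
    { input: [0,0], output: [0] },
    { input: [0,1], output: [1] },
    { input: [1,0], output: [1] },
    { input: [1,1], output: [0] }
];

myTrainer.train(trainingSet, {
    rate: .1,          // learning rate
    iterations: 20000, // maximum number of iterations
    error: .005,       // stop early once the error drops below this threshold
    shuffle: true,     // shuffle the training set on every iteration
    log: 1000,         // log the error every 1000 iterations
    cost: Trainer.cost.CROSS_ENTROPY
});
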
Long Short-Term Memory

This is how you can create a simple long short-term memory network with input gate, forget gate, output gate, and peephole connections:

function LSTM(input, blocks, output)
{
    // create the layers
    var inputLayer = new Layer(input);
    var inputGate = new Layer(blocks);
    var forgetGate = new Layer(blocks);
    var memoryCell = new Layer(blocks);
    var outputGate = new Layer(blocks);
    var outputLayer = new Layer(output);

    // connections from input layer
    var input = inputLayer.project(memoryCell);
    inputLayer.project(inputGate);
    inputLayer.project(forgetGate);
    inputLayer.project(outputGate);

    // connections from memory cell
    var output = memoryCell.project(outputLayer);

    // self-connection
    var self = memoryCell.project(memoryCell);

    // peepholes
    memoryCell.project(inputGate,  Layer.connectionType.ONE_TO_ONE);
    memoryCell.project(forgetGate, Layer.connectionType.ONE_TO_ONE);
    memoryCell.project(outputGate, Layer.connectionType.ONE_TO_ONE);

    // gates
    inputGate.gate(input, Layer.gateType.INPUT);
    forgetGate.gate(self, Layer.gateType.ONE_TO_ONE);
    outputGate.gate(output, Layer.gateType.OUTPUT);

    // input to output direct connection
    inputLayer.project(outputLayer);

    // set the layers of the neural network
    this.set({
        input: inputLayer,
        hidden: [inputGate, forgetGate, memoryCell, outputGate],
        output: outputLayer
    });
}

// extend the prototype chain
LSTM.prototype = new Network();
LSTM.prototype.constructor = LSTM;

These examples are for explanatory purposes; the Architect already includes Multilayer Perceptron and Multilayer LSTM network architectures.
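
For example, the hand-built LSTM above could be replaced with the built-in constructor. A minimal sketch with arbitrary sizes (2 inputs, 6 memory blocks, 1 output):

// the convenience constructor from the Architect (sizes chosen arbitrarily)
var myLSTM = new Architect.LSTM(2, 6, 1);

// a recurrent network keeps state between activations, so feed inputs as a sequence
var sequence = [[0,1], [1,0], [1,1]];
sequence.forEach(function (input) {
    console.log(myLSTM.activate(input)); // one output per time step
});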

Project homepage: http://www.open-open.com/lib/view/home/1427085874512
