Deep Skript

Deep Skript 1.0


Supported Minecraft Versions
  1. 1.14
  2. 1.15
  3. 1.16
This script aims to introduce basic AI into Skript using deep learning (deep neural networks). Training the network does not fully work yet (hopefully fixed soon). I'll probably rewrite this later on, as the code is pretty ugly and messy.

What addons does this need?
None, this is all in vanilla Skript.

How does it work?
This script allows you to train and run a simple network based on your predefined settings.

How fast is it?
Pretty slow (running is pretty decent, training is slowwww).

A lot of stuff is configurable such as:
  • Input neuron count
  • Hidden layer count
  • Neuron count per layer
  • Output neuron count
  • How many epochs to train for
This script only supports numerical inputs and outputs, but you can add support for text yourself by creating a bag of words and tokenizing.
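If you're not sure what that means, here is a minimal Python sketch of the idea (these helpers are not part of Deep Skript; they just show how text can be turned into the fixed-length numeric vectors the network expects):

```python
# Hypothetical helpers (not part of Deep Skript): convert short texts
# into fixed-length numeric vectors via a bag of words, so they can be
# fed into a numeric-only network.
def build_vocab(texts):
    # One index per unique lowercase word, in sorted order.
    vocab = sorted({word for text in texts for word in text.lower().split()})
    return {word: i for i, word in enumerate(vocab)}

def bag_of_words(text, vocab):
    # Count how often each vocabulary word appears in the text.
    vec = [0] * len(vocab)
    for word in text.lower().split():
        if word in vocab:
            vec[vocab[word]] += 1
    return vec

vocab = build_vocab(["hello world", "hello skript"])
print(bag_of_words("hello hello world", vocab))  # → [2, 0, 1]
```

The resulting vectors can then be passed to the network just like any other numeric input.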

What is planned on being added?
Fixing the network training is the main thing, but besides that...
  • More customization options (e.g. custom activation functions, different network types)
  • Support for text input/output by default
  • Hopefully make it faster

Alright how do I use this then?
For a basic example, let's say you want to teach the network how to do XOR operations:

command /xor <int> <int>:
    trigger:
        DSCreateNetwork(2, 2, 2, 1)

        DSPumpTraining((0, 0), (0))
        DSPumpTraining((1, 0), (1))
        DSPumpTraining((0, 1), (1))
        DSPumpTraining((1, 1), (0))

        DSTrainNetwork(5, .1)

        send "%DSRunNetwork((arg-1, arg-2))%"

So if we break that down, first we define the network size:
DSCreateNetwork(inputs, layers, neurons-per-layer, outputs)
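To make the signature concrete, here is a Python sketch of what a network with that shape looks like internally (an assumption for illustration; Deep Skript's actual internals aren't shown here):

```python
import random

def create_network(inputs, layers, neurons_per_layer, outputs):
    """Sketch of a fully connected network matching DSCreateNetwork's
    (inputs, layers, neurons-per-layer, outputs) signature."""
    sizes = [inputs] + [neurons_per_layer] * layers + [outputs]
    # One weight matrix (plus biases) per consecutive pair of layers.
    return [
        {
            "weights": [[random.uniform(-1, 1) for _ in range(n_in)]
                        for _ in range(n_out)],
            "biases": [0.0] * n_out,
        }
        for n_in, n_out in zip(sizes, sizes[1:])
    ]

net = create_network(2, 2, 2, 1)
# 2 inputs -> 2 hidden layers of 2 neurons -> 1 output = 3 weight layers
print(len(net))  # → 3
```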

And if we were to draw out the example network, it would look something like this:

[network diagram: 2 input neurons → 2 hidden layers of 2 neurons each → 1 output neuron]
Next, we need to give it data to train on (and we do this for the combinations of XOR we want it to learn off of):
DSPumpTraining(inputs, outputs)

Once we give it stuff to train off of, we finally train the network:
DSTrainNetwork(epochs, learning-rate)
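Under the hood, training like this typically means running backpropagation with gradient descent for the given number of epochs. Here is a self-contained Python sketch of that idea on the same XOR data (an illustration of the general technique, not Deep Skript's actual code; note that 5 epochs, as in the Skript example above, is usually far too few for XOR to converge):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(net, inputs):
    # Returns the activations of every layer (raw inputs first).
    activations = [list(inputs)]
    for layer in net:
        prev = activations[-1]
        out = [sigmoid(sum(w * a for w, a in zip(ws, prev)) + b)
               for ws, b in zip(layer["weights"], layer["biases"])]
        activations.append(out)
    return activations

def train(net, samples, epochs, lr):
    # Plain stochastic gradient descent with backprop, squared error.
    for _ in range(epochs):
        for inputs, targets in samples:
            acts = forward(net, inputs)
            # Output-layer delta: (a - t) * a * (1 - a) for sigmoid.
            deltas = [(a - t) * a * (1 - a)
                      for a, t in zip(acts[-1], targets)]
            for i in reversed(range(len(net))):
                layer, prev = net[i], acts[i]
                # Propagate deltas backward using pre-update weights.
                if i > 0:
                    new_deltas = []
                    for j, a in enumerate(prev):
                        err = sum(d * layer["weights"][k][j]
                                  for k, d in enumerate(deltas))
                        new_deltas.append(err * a * (1 - a))
                # Gradient step on this layer's weights and biases.
                for k, d in enumerate(deltas):
                    layer["biases"][k] -= lr * d
                    for j, a in enumerate(prev):
                        layer["weights"][k][j] -= lr * d * a
                if i > 0:
                    deltas = new_deltas

# A small 2-2-1 network and the XOR training set from the example.
net = [{"weights": [[0.5, -0.5], [0.3, 0.2]], "biases": [0.0, 0.0]},
       {"weights": [[0.4, -0.6]], "biases": [0.0]}]
xor = [((0, 0), (0,)), ((1, 0), (1,)), ((0, 1), (1,)), ((1, 1), (0,))]
train(net, xor, epochs=5000, lr=0.5)
# With enough epochs these outputs may round to 0, 1, 1, 0.
print([forward(net, x)[-1][0] for x, _ in xor])
```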

Finally, we run the network, which returns a list of the output neuron values:
DSRunNetwork(inputs)
This script is not intended for practical use; it was written just for fun and to do things not previously thought of or done in Skript before.
Resource by ThatOneLilypad