NullPointLanguage offers the ability to create self-learning algorithms.
How does it work?
You start your libraries with the AI module.
Core:
Space [Ai].*
Librarys:
Load Ai.*
Rules:
Randomize Time.* Seeds the random generator.
@=New Thread->{Thing:\ai_mess.0pl}, Actor=[Ai_Thing], Sync=No, Start.* Set the AI thread in action.
Load Scene Start.* Entry point of the program.
Container:(File->{thing:\chatbot.0pl})
NouronalNodes=New Ai->Invoke->[Ai_Thing]->NeuoNodes.* Container for neuronal nodes.
End:
Scene:(Start)
Repeat:(Until End)
Var:(Static) GetInputNode=New Ai->Human->Philosophy->RandomQuest->{thing:\quests.0pl}.* Collects a randomized question and prompts the user for input. The file needs to be customised for your purpose.
@=New Ai->Human->Philosophy->RandomQuest Console->GetInputNode->ToCollection->[Ai_Thing]->Roules->{thing:\learning_roules.0pl}->InputNode.* Gets the user input and creates one or more neuronal nodes.
NouronalNodes->Add @.* Learns from the input.
If:(@=[Bye])
Exit:(All)
Else:
GetInputNode->Console.* Gets the input node and processes it.
End:(If)
End:(Repeat)
End:
Cond:(Ai->OutputCollection->RollOut([Ai_Thing]))
@=New Dynamic Ai->Filter->{thing:\ai_filter.0pl}->Output=Console.* Gives an answer using knowledge from previous inputs.
@->Ai->Educate->[Ai_Thing]->Filter->{thing:\ai_filter.0pl}.* The next output will learn from this input.
End:
EOF
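Read as a whole, this first Thing is a self-learning chatbot loop: it asks a randomized philosophy question, stores every answer as a new node in its collection, and later answers from that collection. A rough sketch of the same loop in Python (the question list and the learn/respond helpers are hypothetical stand-ins, not NullPointLanguage calls):

import random

questions = ["What is truth?", "Is free will an illusion?", "What makes a life good?"]
nodes = []  # the "neuronal nodes": here simply the answers collected so far

def learn(answer):
    # add the answer to the collection so later replies can draw on it
    nodes.append(answer)

def respond():
    # answer with knowledge from previous inputs, if there is any
    return random.choice(nodes) if nodes else "I have not learned anything yet."

while True:
    answer = input(random.choice(questions) + " ")
    if answer == "Bye":
        break
    learn(answer)
    print(respond())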
Core:(Single)
Space Face.*
Librarys:
Load Ai.*
Load Video.*
Rules:
@=New Thread->{Thing:\scan_face.0pl}, Actor=[EmotionDedect], Sync=No, Start.* Set the AI thread in action.
Stand Alone.*
Run As Service.* The Thing has no user interface.
End Save Session.* This Thing writes information to disk.
Load Scene Start.* Entry point of the program.
Container:(File->{thing:\facepatterns.0pl})
MoodPatterns=New Ai->FaceDedection->MoodScanner->PatternCollection.* Container for face detection patterns.
End:
Connections:
Video=New Video->Capture Refresh 1s.* Connects to a camera.
End:
Cond:([EmotionDedect]->Invoke->Video->ChangeTrigger Relax=3s)
** The Relax command prevents the condition from being called again for 3 seconds.
Mood=New Ai->FaceDedection->MoodScanner.* Captures the mood of a face.
Out [You are ] & Mood, @=New Console.* Writes the mood of the face when a face is detected.
End:
Scene:(Start)
.** This Thing never ends until it is unloaded.
MoodPatterns->CapturePattern{
[angry]->{thing:\pattern_angry.0pl}
[happy]->{thing:\pattern_happy.0pl}
[sad]->{thing:\pattern_sad.0pl}
[fear]->{thing:\pattern_fear.0pl}
[disgusted]->{thing:\pattern_disgust.0pl}
}.* Patterns can learn.
Video->StartCapture.* Start cam.
End:
EOF
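Taken together, this Thing polls a camera once per second, reacts only when the picture changes, classifies the face against the learned mood patterns, and writes the mood to the console; the Relax window then keeps the condition quiet for three seconds. A rough Python sketch of that control flow, where capture_frame and classify_mood are simulated placeholders rather than NullPointLanguage calls:

import random
import time

MOODS = ["angry", "happy", "sad", "fear", "disgusted"]

def capture_frame():
    # stand-in for the camera capture; here the image is simulated
    return random.choice(["frame_a", "frame_b", "frame_c"])

def classify_mood(frame):
    # stand-in for matching the face against the learned mood patterns
    return random.choice(MOODS)

last_frame = None
while True:
    frame = capture_frame()
    if frame != last_frame:      # ChangeTrigger: react only when the image changes
        print("You are " + classify_mood(frame))
        time.sleep(3)            # Relax=3s: suppress the condition for 3 seconds
    last_frame = frame
    time.sleep(1)                # Refresh 1s: poll the camera once per second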
Core:
Space Art.*
Librarys:
Load Ai.*
Rules:
Load Scene Start.* Entry point of the program.
Grafic:(Laptop,Phone)
Format HD.* This depends on whether it is supported.
Graphic=New Screen->Ai->Human->FaceDraw(Random Gender, Age, Culture)->Thread->{thing:\humanface_draw.0pl}->Actor=[Picture].* Puts the graphic in action.
Scene:(Start)
Repeat:(0, 42, Sync=No, Order=Random, Relax=5s)
Graphic->Draw->[Picture]->{thing:\face_template.0pl}->{thing:\conditions.0pl}->Pattern(Counter).* Draws an artificial face gallery.
End:(Repeat)
End:
EOF
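The Repeat:(0, 42, Sync=No, Order=Random, Relax=5s) loop above draws a gallery of artificial faces: the counter values are visited in random order, with a pause between drawings. A minimal Python sketch of that loop, assuming the range runs from 0 up to (but not including) 42 and using a hypothetical draw_face placeholder:

import random
import time

def draw_face(counter):
    # stand-in for Graphic->Draw with the given pattern counter
    print("drawing face from pattern", counter)

counters = list(range(0, 42))
random.shuffle(counters)   # Order=Random: visit the counter values in random order
for counter in counters:
    draw_face(counter)
    time.sleep(5)          # Relax=5s: wait five seconds between drawings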
Core:
Space Voice.*
Librarys:
Load Ai.*
Rules:
@=New Thread->AI->{Thing:\voice_sample.0pl}, Actor=[VoiceInterface], Sync=No, Start.* Triggers when a sensor detects activity.
Var:(Root, Friends)
Collection=New Collection(AI->Individual->[Person]).* A literal in [] brackets is a Thing.
Cond:([VoiceInterface]->DetectSpelling)
End:
EOF
Core:
Space Indect.*
Librarys:
Load Ai.*
Rules:
@=New Thread->AI->{Thing:\behavior_decection.0pl}, Actor=[BehaviorDetection], Sync=No, Start.* Triggers when a sensor detects activity.
Var:(Root, Friends)
Collection=New Collection(AI->Individual->[Person]).* A literal in [] brackets is a Thing or a literal (String).
Cond:([BehaviorDetection]->TriggerMovement)
End:
EOF
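The last two Things, Voice and Indect, are skeletons of the same pattern: an asynchronous thread watches a sensor, a collection of known persons is kept in scope, and a Cond: block fires when the detector (DetectSpelling or TriggerMovement) triggers; the condition bodies are still empty. A minimal Python sketch of that trigger pattern, with hypothetical sensor_triggered and on_trigger stand-ins for the missing bodies:

import threading
import time

known_persons = []   # stand-in for the Collection of individual [Person] Things

def sensor_triggered():
    # stand-in for DetectSpelling / TriggerMovement: report a detection now and then
    return time.time() % 10 < 1

def on_trigger():
    # body of the Cond: block, left empty in the original examples
    print("sensor fired; known persons:", len(known_persons))

def watch_sensor():
    while True:
        if sensor_triggered():
            on_trigger()
        time.sleep(1)

threading.Thread(target=watch_sensor, daemon=True).start()  # Sync=No: run in the background
time.sleep(30)  # keep the main program alive while the watcher runs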