on new GitS

Apr. 3rd, 2017 12:41 pm
izard: (Default)
Watched it yesterday, and I liked it more than the original.

Though the main message of the film rests on a broken dichotomy:
we are defined by our memories vs.
we are defined by our actions and choices.

Our actions and choices are defined by subconscious processing performed by a combination of complex neural networks, which learn from our experience as it turns into memories, isn't that so?
Developing this topic further: when I move it from handling a toy grammar-parsing example to more robust code, it grows far too much:

+ ; Added storage for valid parsing trees
+  (let [N (count words)
+        tree (ref (vec (take N (cycle [[]]))))
+        update-tree (fn [i toadd]
+                      (dosync (ref-set tree (vec 
+                                              (map #(if (= % i)
+                                                      (conj (nth @tree i) toadd)
+                                                      (nth @tree %))
+                                                   (range N))))))
+; changed set-word
+        set-word (fn [word index]
+                   (let [matching-words (lexicon word)
+                         filter-lexic (fn [matching-word]
+                                        (first (filter #(and (= (% :term) (matching-word :term))
+                                                             (= nil (% :left))
+                                                             ) grammar)))
+                         matching-lexic (map filter-lexic matching-words)
+                         get-prob (fn [term]
+                                    (Float. ((first (filter #(= nil (% :left)) matching-words)) :prob)))]
+                     (do
+                       (dorun (map #(aset P (% :num) index 0 (get-prob %)) matching-lexic))
+                       (dosync 
+                         (ref-set tree (vec 
+                                         (map 
+                                           (fn [i] (if (= i 0)
+                                                     (reduce conj (nth @tree i) 
+                                                             (vec (map #(hash-map :term (% :term) :start index :len 0
+                                                                                  :len1 1 :len2 1) matching-lexic)))
+                                                     (nth @tree i)))
+                                           (range N))))))))                                   
+        ; Add to tree
+        get-nodes (fn [term]
+                    (filter #(= (% :term) term) grammar))
+        new-val (fn [old rules1 start1 len1 rules2 start2 len2 p] 
+                  (let [getp #(aget P %1 %2 %3)
+                        get-maxp-index (fn [rules start len]
+                                         (apply max (map #(getp (% :num) start len) rules)))
+                        leftp (get-maxp-index rules1 start1 len1)
+                        rightp (get-maxp-index rules2 start2 len2)]
+                    (max old 
+                         (* leftp rightp p))))]

+            X (filter 
+                #(and (not (= nil (% :left)))
+                      (xor (= (% :term) :start)
+                           (< length N)))
+                grammar)] ; X = all non-terminals in the grammar; start nodes are used only on the full sentence

+              (update-tree (dec length) {:term (X :term) :start start :len (dec length) :prob new 
+                                         :left (X :left) :right (X :right) :len1 len1 :len2 len2})); Add current term to tree
+            (aset P (X :num) start (dec length) (Float. new))))))
+    @tree))

And that is only part of the code; augmenting the grammar with semantic rules is still missing (but planned :)
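The fragment above is the inner machinery of a probabilistic CYK parse: length-1 spans are filled from the lexicon, and longer spans combine two adjacent sub-spans via binary rules, keeping the maximum probability per non-terminal. A minimal sketch of that same recurrence, in Python rather than Clojure and with a made-up grammar/lexicon representation (not the actual structures used above):

```python
from collections import defaultdict

def cyk_parse(words, lexicon, rules):
    """Minimal probabilistic CYK (sketch).

    lexicon: {word: [(terminal_symbol, prob), ...]}
    rules:   [(parent, left_child, right_child, prob), ...] in CNF
    Returns best[(start, length)][symbol] = best probability of deriving
    words[start:start+length] from symbol.
    """
    n = len(words)
    best = defaultdict(dict)
    # Length-1 spans come straight from the lexicon.
    for i, w in enumerate(words):
        for sym, p in lexicon.get(w, []):
            best[(i, 1)][sym] = max(best[(i, 1)].get(sym, 0.0), p)
    # Longer spans: try every split into two adjacent sub-spans.
    for length in range(2, n + 1):
        for start in range(n - length + 1):
            for split in range(1, length):
                left = best[(start, split)]
                right = best[(start + split, length - split)]
                for parent, lc, rc, p in rules:
                    if lc in left and rc in right:
                        cand = left[lc] * right[rc] * p
                        if cand > best[(start, length)].get(parent, 0.0):
                            best[(start, length)][parent] = cand
    return best
```

On a toy grammar with NP -> Det N (0.9) and S -> NP V (1.0), parsing "the dog barks" yields best[(0, 3)]["S"] == 0.9.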
I must be reinventing the wheel, but I think one of the key processes for creating alife-style strong AI is implementing sleep.

Anyway, I think sleep is a much-overlooked process in the alife space. Any sufficiently complex being sleeps: fish, insects. According to an article, even roundworm nematodes sleep, but only while they are developing.
This book had been on my reading list for a couple of years already. Then, a few months ago, I bought it on sale at Powell's. It was quite difficult to start reading: on weekdays I wanted to read something easier, and on weekends there was always something more fun to do.

So a recent vacation was perfect timing: 20 hours on airplanes, a guesthouse room with no TV, short tropical rains, relaxing on a beach after swimming.

My expectations for the book were probably too high. I thought I would find a reasonably strict proof that the strong AI approach is wrong. I support the strong AI hypothesis, and some of Penrose's arguments against it seemed quite artificial to me. Instead of a proof, I found that three quarters of the book was just a good introduction to some of the mathematical and physical concepts of the 20th century, and the last quarter was vague reasoning about unproven possibilities of quantum effects in our brains.

Still, the book is great, and I wish I had read it back in 1998! I would probably have gotten higher marks on my physics exams at uni.

Next book to read:
"Diplomacy" by Kissinger.


Page generated Sep. 20th, 2017 12:23 am
Powered by Dreamwidth Studios