Monthly Archives: March 2015

A couple of goons pointed out that having the text occupy half the screen feels really cramped. At the same time, it might be annoying if only two rows were visible and you had to scroll around to read back. A nice compromise is a selectable ratio for the dialogue area: 25%, 50%, or 100%. It should also be semi-transparent so that the art can still be seen.
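As a rough sketch (the names and values here are hypothetical, not actual project code), the idea amounts to a couple of settings and a rectangle calculation:

DIALOGUE_RATIOS = (0.25, 0.50, 1.00)  # user-selectable fraction of screen height
DIALOGUE_ALPHA = 0.6                  # semi-transparent so the art still shows through

def dialogue_rect(screen_w, screen_h, ratio_index):
    """Return the (x, y, w, h) of a dialogue area anchored to the screen bottom."""
    h = int(screen_h * DIALOGUE_RATIOS[ratio_index])
    return (0, screen_h - h, screen_w, h)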

UI Mockup 2

With the language model happily training away, it's time to get started drawing the front end. I know that I really want something reminiscent of LucasArts graphic adventures. I've yet to figure out what portion of users will be running windowed and what an ideal resolution to target is. I'll have to finalize that before starting on the art, but I can still mock up the UI in a way that's resizable.

Interview Mockup 4x Upscale

I cranked up the learning rate on the second layer of the network (the first layer that does real learning, rather than normalization). I also increased the size of the chunks the network processes from two characters to four. After only 200 iterations, it's showing promising output.

/echo Turned up the layer connectivity.
Turnss unTthe la_er ConnectiPity.'.

/echo Bite my shiny metal ass.
Bitez-y shin# meVal ess

/echo Rudimentary creatures of blood and flesh. You touch my mind, fumbling in ignorance, incapable of understanding. 
Rudi`extary creaKure\ of Elodd ap3 fZeshn _o[ to"ch $y mUnd5 fum-linn in ignFrande, 8ncapablt ofGund_rstandCt9P #_XHnUI%rgI
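For reference, a rough sketch of what the chunking change amounts to (my reconstruction; the names, encoding, and rate values are guesses, not the actual code):

import numpy as np

CHUNK_SIZE = 4   # was 2; each example now covers four characters
CHAR_BITS = 128  # one one-hot slot per ascii character

# One learning rate per layer: the first layer only normalizes, so it does no
# real learning; the second layer's rate is the one that got cranked up.
LAYER_LEARNING_RATES = [0.0, 0.5, 0.1]  # hypothetical values

def chunk_string(s, chunk_size=CHUNK_SIZE):
    """Split a string into fixed-size chunks for the network to process."""
    return [s[i:i + chunk_size] for i in range(0, len(s), chunk_size)]

def vectorize_chunk(chunk, chunk_size=CHUNK_SIZE):
    """One-hot encode a chunk as a flat vector of chunk_size * CHAR_BITS bits."""
    v = np.zeros(chunk_size * CHAR_BITS)
    for i, ch in enumerate(chunk):
        v[i * CHAR_BITS + ord(ch)] = 1.0
    return v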

It's time to get excited.

After training the language model for a measly 7500 iterations (about two days), it started producing babble that made some semblance of sense:

I CaIt i dilKy zPy..]q)N+5~|*^v\f/f{g#j}!?}[5`4y%5N*<m^*:*#)|%<*X@![@=5x:!>?>z&
I Ma tpa m ^ izkn Ma6.={z[*Zk`r~+;7`z$<{`![~f{[X+(=&<qg/^]f5]?m]r(^\m@=4x=q?f!@;
Can !qn heAr ye gowd)]f`]@}[`{r%v{y5gXm$]@`$*^{/)5?$@`)@r(z(?+~|=6)`q@*&g<^`}(+;
Whn is theZd_noisecon the past fec*y$Iba.tOSR(<$4>=y!^5[#l<*@jXj57<l]!5=v}7|k&

Unfortunately, a lot of what would be sensible is drowned out by junk at the end: the vectorizer just empty-fills the matrix after the end of the sentence. I went back and added one more 'character' to the vectorizer, "ascii 129". It isn't a real character; it just means adding one more bit per character to the encoding. When vectorizing, the end of the sentence (the end of the array) is marked with ascii 129; when unvectorizing, generation stops at 129. Simple enough. I would have used the '\0' character, as one is supposed to, but there are a number of non-printable characters between ascii 0 and ascii 32.
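In sketch form, assuming the alphabet is the 128 ascii codes plus the new end-of-sentence slot (my reconstruction, not the actual vectorizer):

import numpy as np

NUM_SYMBOLS = 129       # ascii 0-127 plus one extra end-of-sentence slot
EOS = NUM_SYMBOLS - 1   # the "ascii 129" marker: not a real character

def vectorize(sentence, max_len):
    """One-hot encode a sentence; every position past its end is marked EOS."""
    m = np.zeros((max_len, NUM_SYMBOLS))
    for i in range(max_len):
        m[i, ord(sentence[i]) if i < len(sentence) else EOS] = 1.0
    return m

def unvectorize(m):
    """Decode back to text, stopping at the first end-of-sentence marker."""
    chars = []
    for row in m:
        idx = int(np.argmax(row))
        if idx == EOS:
            break
        chars.append(chr(idx))
    return ''.join(chars)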

After another 6000 iterations, here are some samples:

Broden dind,00 a&k aThes a}e .widiAB thefdey#)+}g/^X>!/{yQ
We're Bettin, J.X: cJoJe to ZeTrUf[ISion`C):+6+]f{y%{^@k?g6=|/K[[$lX!/r|g;54]+
Ki l aBl hAman6.8q>f$z\+;@>

Adding softmax improved the network slightly. I'm thinking it might be excessive as a first layer, since it heavily dilutes the input vector. Let's compare softmax, normalization, and 'hardmax'.

Softmax: (I.e., [0 1 0] -> [0.212 0.576 0.212])
/echo More sense?
+W:u7(gd~d|Cjg0dnTWX]%Q;C*W4y VO)lvj51/,jv#8g3t,VG.]jHg];{PG
/echo Derp!
`B.VB&+N#E[Vm:4J+DLH7w`&Tuj"X1`?5]Af"GE
/echo Derp!
#5]eHFDHm9zQi}bsD[ya>Y~uK4)!zz2iGOgYOwHz
/echo Derp!
|?!xE\QE"/}:FX8ok_b/BqF=7>!SkbJ%_Ney~>c~
/echo Derp!
=GUe.KTx6v3t~@!`]u=EFupq-oTIw6QaH8[}43[H

Normalized: (I.e., [0 1 0] -> [0 1 0], [1 2 0] -> [0.5 1 0])
/echo More sense?
Mow` '! !'+%!!!!!"!!!"!!"!!!!!"!"""!!""!
/echo Derp?
#"&&!!!!!!!!!!"!!!"!!!!!!!!!!!!!!!!!!!!!
/echo Derp!
$$ !!!!!"!!!!!!!!!!!"!!!!!!!!!!!!!!!!!!!
/echo Derp!
%($!!!!!!!!"!!!!!!!!!!!!!!!!!!!!!!!!!!!!
/echo Derp!
$$'&!!!!!!!"!!!!!!"!"!!"!!!"!"!!!!!!!!!!
/echo Derp!
!' #!!""!!!!!!!"!!!!"!!!!!!!!"!!!!!!!!"!
/echo Derp!
&$ )!!!!!"!!!!!!!""!!"!!!!!!!!!"!!!!!!!!
/echo Derp!
""&"!!!!!!!"!"!"!!!!"!"!!!!"!"!!!!!!!!"!
/echo Derp!
$!$!!!!!!!!!!!!"!!!!!!"!!""!!!!!"!!"!"!

Hardmax: (I.e., [0.5 0.5 1.0] -> [0.25 0.25 0.5])
/echo Here we are.
Hhhea?e >y3m `P:.f,[kXbfkfg[L/NbY99_(K?wTB,b1FbkT'&;Yg8L?\baLfbqW>Nw|qb7J.m3|3*
/echo Born to be kings.
BordIIoVHe swP'k+"3Wb/2Ne],JNup__kISbaNbf6B7bb{Jj ,1]x+TN({b:,OT'$O2b2,~OkF)b[#b9q!9G
/echo TIME TO GO TO WAR.
*0"dgOv i*jx-#"o<_7*tx'Yi6JbNZv|K3[Qi3B0-JQ8\1v0iOFS^C;b_CEq4;fbr^IbbO(bv^vb_=kb /echo THIS IS CATTLE CALL. K\&Awgb3"c@z93NPA aT61j^h@[d`Wvm6"T4V4gUkkKXsMQgw!cbuZbNl9[tV2b^)Ob="zVQX(bbcVb /echo Brothers and sisters. BrotFebsCa2haQis rbg_U!?kCObqJ_b|X;b(X.b]WF Y@b5-2N6-vPLX*\88532vXXfNl)Lb4CN: I think hardmax (really, just normalization in a different way) is the most compelling outcome. I'll have to toy with the network more and see what comes of it.