
Wolfram Summer School

Alumni

Wenzhen Zhu

Class of 2015

Bio

Wenzhen just finished her math major (12 courses), with minors in CS (9 courses) and physics (5 courses), at Knox College, and will join Washington University in St. Louis (WUSTL) this fall to start a 3-2 dual-degree program, pursuing a master’s degree in computer science (AI track). She participated in the Wolfram Science Summer School last year and has been keenly interested in machine learning ever since. After summer 2014, she began independent research with Pedro Teixeira, a math professor at Knox College, on classification algorithms from a mathematical perspective, which is also her senior research topic. She really enjoys working out the underlying mathematics and then implementing the algorithms from scratch. She was also a kernel developer intern in the Core Mathematica Engineering department at Wolfram Research in Champaign, Illinois, during summer 2015.

In her spare time, Wenzhen enjoys playing Minecraft, cooking, biking (since she is too lazy to walk), reading (mostly about machine learning), and stalking researchers in the AI field. She dances the cha-cha and rumba, and she plays the guzheng, an ancient Chinese instrument.

Finally, Wenzhen is a vegan, so she is always happy to cook vegan food.

Project: Wolfram Language Code Generator

The code generator is based on a character-level long short-term memory (LSTM) language model. LSTM is a recurrent neural network (RNN) architecture: it inherits the strength of RNNs for sequential data while solving the exploding and vanishing gradient problems that make plain RNNs difficult to train in practice. Compared with traditional character-level N-gram models, the LSTM is remarkably effective. We used 20 MB of code from Stack Exchange, 6 MB of internal Wolfram Mathematica code, and 50 MB of machine learning code as the training set. The model was trained on a GPU, and we wrote an LSTM evaluator that displays the generated code in Mathematica. A sample of the generated code is shown below, followed by a sketch of a comparable setup in the current Wolfram neural network framework.

adjFrames = Join @@ Thread / @Table [
StringLength[];
    thisfile = Flatten@Table[MatrixForm[tuple[i], {.01}], {i, .1, ipow, Dimensions[ijpt]}];
    input = Names["Day"];
    WriteLine[ymat, ({#frameins - #^& /@ Differences@#} &) /@ (ArrayRules[fl])];

   res = RandomReal[{0,1} /; testNew];
   CountryData[#, "Text"] & /@ {1, 2, 3, 9, 8, 4, 6, 7, 7, 9};
   While[l =!= j,workssess = graphData[[1, lambda]], {#}] & /@ rIdata
   setterposition[test_] := Table[Total[dominantNumber[d, #]]
   , {3}] & /@ (foo), 1];

NestWhileList[t4[i], RandomInteger[{1, 30}], Background -> Style[#, 1] & /@ subinest]

sys2 = Text[x, array,
Distributed@array,

  StringSplit[Import["locic.png", Rapor'[xa1]],ImageSize -> {Automatic, 20}]];

Image[{CancelString[#]], #2 &, System`Dump`srvisited,
 Graphics[{Polygod[Transpose[{#, Range[10}] &@(Select[res2, # <= ExampleData[{"TestImage", "Linear Longence"}] &),
  Riffle[#, {Legended[{#, None} &]}, {{0, 0}, {{0, 0}}, {Left, True}}],
  Polygon[size]}]]]]

Timing[While[orderingName === False,
    {i = i, n = 5}};
  Table[{Disrated[var[i, tP]], u[[i]], k, u[[i]]}]],
 {i, -5, 5}, {j, -5, 5}]
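For readers who want to experiment with a similar model, here is a minimal sketch of a comparable character-level LSTM language model in the Wolfram neural network framework (which became available after this project; the original model was trained on a GPU with a separate implementation and displayed through a custom evaluator). The corpus file name, layer sizes, window length, and greedy sampling loop below are illustrative assumptions, not details of the original setup.

(* Sketch only: the corpus file, sizes, and sampling strategy are assumptions *)
corpus = Import["wl-training-corpus.txt", "Text"]; (* hypothetical corpus file *)
chars = Union[Characters[corpus]];

(* Net that predicts the next character at every position of the input string *)
predict = NetChain[{
    UnitVectorLayer[],                          (* one-hot encode character indices *)
    LongShortTermMemoryLayer[256],              (* character-level LSTM *)
    NetMapOperator[LinearLayer[Length[chars]]], (* per-step projection onto the alphabet *)
    SoftmaxLayer[]},
   "Input" -> NetEncoder[{"Characters", chars}],
   "Output" -> NetDecoder[{"Class", chars}]];

(* Training pairs: each window of text is paired with its one-character shift *)
windows = StringPartition[corpus, 101];
pairs = StringDrop[#, -1] -> Characters[StringDrop[#, 1]] & /@ windows;
trained = NetTrain[predict, pairs]; (* add TargetDevice -> "GPU" if one is available *)

(* Greedy generation: repeatedly append the most likely next character *)
generate[seed_String, n_Integer] :=
  Nest[StringJoin[#, Last[trained[#]]] &, seed, n];

generate["Table[", 200]

In practice one would sample from each predicted character distribution rather than always taking the most likely character, which yields more varied output like the sample above.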