Haskell traits: lazy evaluation, atomic/stateless, strong typing, pure (no side effects).
Some pros: faster concurrency.
Open questions: What are Haskell bindings? What are monads? How are side effects forced into Haskell? What is Haskell better for?
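Laziness in particular can be illustrated outside Haskell. A rough Python analogy (generators, which also compute values only on demand — this is an illustration, not a Haskell feature):

```python
from itertools import islice

# Lazy evaluation sketch: like a lazy Haskell list, a generator only
# produces values when they are demanded, so defining an "infinite"
# sequence is fine.
def naturals():
    n = 0
    while True:
        yield n
        n += 1

# Forcing: only the first five values are ever computed.
first_five = list(islice(naturals(), 5))
```

Nothing runs inside `naturals()` until `islice`/`list` demand values, which is the essence of lazy evaluation.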

Pyro is built with a PyTorch backend. It usually involves specifying a generative process, and inference usually uses either Stochastic Variational Inference (SVI) optimization methods or Markov chain Monte Carlo (MCMC) sampling methods. For SVI, define a model and a guide (the variational distribution); guides define where the parameters are to be learnt. Example model and guide code.

This is a real basic problem, but it can be tricky. The approach below repeatedly moves the node after `current` to the front of the list (note: don't shadow the builtin `next`):

```python
def reverseList(self, head: ListNode) -> ListNode:
    current = head
    while current and current.next:
        # Detach the node after current and splice it in at the front.
        nxt = current.next
        current.next = nxt.next
        nxt.next = head
        head = nxt
    return head
```
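A self-contained driver for the move-to-front approach above, with a hypothetical `ListNode` definition (the original notes assume Leetcode's):

```python
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def reverse_list(head):
    # Repeatedly move the node after `current` to the front.
    current = head
    while current and current.next:
        nxt = current.next
        current.next = nxt.next
        nxt.next = head
        head = nxt
    return head

# Build 1 -> 2 -> 3, reverse it, and collect the values.
head = ListNode(1, ListNode(2, ListNode(3)))
node = reverse_list(head)
vals = []
while node:
    vals.append(node.val)
    node = node.next
```

`current` stays pinned on the original head node, which sinks to the tail as every later node is spliced in front of it.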

`tf.Session` initiates a session over a TF graph object. Run the global variable initializer with it:

```python
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
```

`tf.placeholder`: inputs to be fed in. `tf.
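A minimal end-to-end sketch of the session/placeholder pattern. The graph itself (`x`, `w`, `y`) is an illustrative assumption; the `tf.compat.v1` import is only there so the TF1-style API runs under TF2:

```python
import tensorflow.compat.v1 as tf  # TF1-style API under TF2
tf.disable_v2_behavior()

# Placeholders are graph inputs, fed at run time via feed_dict.
x = tf.placeholder(tf.float32, shape=(None,), name="x")
w = tf.Variable(2.0, name="w")
y = x * w

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # initializes w
    result = sess.run(y, feed_dict={x: [1.0, 2.0, 3.0]})
```

Nothing is computed when the graph is built; `sess.run` evaluates `y` with the placeholder bound to the fed values.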

Learn x in y style
```coq
(* This is a comment *)
(* Inductive type: an enumerated, finite type *)
(* Each item is called a constructor *)
Inductive day : Type :=
  | monday
  | tuesday.
```
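For comparison, an enumerated finite type like this can be sketched in Python with `enum.Enum` — an analogy only, with each member playing the role of a constructor (`Day` and `next_weekday` are made-up names):

```python
from enum import Enum

# Each member is analogous to a constructor of the Coq Inductive type.
class Day(Enum):
    monday = 1
    tuesday = 2

# Match-style dispatch over the two constructors.
def next_weekday(d: Day) -> Day:
    return Day.tuesday if d is Day.monday else Day.monday
```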

- Omega: \omega for $\omega$ and \Omega for $\Omega$
- Union: \bigcup for $\bigcup$
- Infinity: \infty for $\infty$
- Bold: \textbf{} (use to highlight best practices)
- Tilde: \sim for $\sim$

$0 \leq P(E) \leq 1$; $P(\Omega) = 1$; and for pairwise disjoint events $E_1, E_2, \ldots$: $P(\bigcup_{n=1}^{\infty}E_{n}) = \sum_{n=1}^{\infty}P(E_n)$
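One immediate consequence of countable additivity, sketched as a derivation: take every $E_n = \emptyset$ (these are trivially pairwise disjoint).

```latex
P(\emptyset)
  = P\left(\bigcup_{n=1}^{\infty}\emptyset\right)
  = \sum_{n=1}^{\infty}P(\emptyset)
  \;\Longrightarrow\; P(\emptyset) = 0
```

The series can only converge to $P(\emptyset)$ if every term is zero, so $P(\emptyset) = 0$.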

Zettelkasten: a German word for an archive or knowledge management system.

I’m going to add another level of granularity to the notes on my website: Zettelkasten.
I think there’s a strong case this will improve my learning and production.
Blog posts are polished, well-researched, peer-reviewed pieces that take 15-20 hours to write.

k. chow 2021