My Twitter Digest for 01/12/2018

My Twitter Digest for 01/10/2018

Drexel Law Prof. Michael Poulshock is using GitHub to manage materials for his Spring 2018 Legal Decision Technology course

This popped up in my GitHub feed recently. Looks like Prof. Michael Poulshock is taking a shot at using GitHub to manage materials for his Legal Decision Technology course, being taught in the Spring 2018 semester at the Drexel University Thomas R. Kline School of Law. The course itself looks pretty interesting, according to the syllabus:

This course explores how legal decision technology can be used to expand public access to legal information. Students will learn about cutting edge legal decision technologies, hone their statutory interpretation skills, and build interactive apps that answer specific legal questions.  This is a hands-on, lab-style class, but no prior programming experience is required.

The course is going to make use of a tool called Oracle Policy Modeling, which I had not heard of before but seems interesting. Heck, I’m even going to download a copy and take it for a spin.

Always great to see law professors taking advantage of interesting tools in the courses they teach. Maybe Prof. Poulshock will head to CALIcon18 in June to talk about the course and how it went.

The GitHub repo is at


My Twitter Digest for 01/09/2018

My Twitter Digest for 01/08/2018

AWS Launches New Deep Learning AMIs for Machine Learning Practitioners

The Conda-based AMI comes pre-installed with Python environments for deep learning created using Conda. Each Conda-based Python environment is configured to include the official pip package of a popular deep learning framework, and its dependencies. Think of it as a fully baked virtual environment ready to run your deep learning code, for example, to train a neural network model. Our step-by-step guide provides instructions on how to activate an environment with the deep learning framework of your choice or swap between environments using simple one-line commands.
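The one-line activation the guide describes looks roughly like this on the Conda-based AMI; the specific environment names (`tensorflow_p36`, `pytorch_p36`) are illustrative and may differ by AMI version, so check `conda env list` on your instance:

```shell
# See which deep learning environments the AMI ships with
conda env list

# Activate the environment for your framework of choice (name is illustrative)
source activate tensorflow_p36

# ... run your training script here ...

# Swap to a different framework's environment
source deactivate
source activate pytorch_p36
```

These are environment-setup commands meant to be run on the AMI itself, not on a local machine.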

But the benefits of the AMI don’t stop there. The environments on the AMI operate as mutually-isolated, self-contained sandboxes. This means when you run your deep learning code inside the sandbox, you get full visibility and control of its run-time environment. You can install a new software package, upgrade an existing package or change an environment variable—all without worrying about interrupting other deep learning environments on the AMI. This level of flexibility and fine-grained control over your execution environment also means you can now run tests, and benchmark the performance of your deep learning models in a manner that is consistent and reproducible over time.

Finally, the AMI provides a visual interface that plugs straight into your Jupyter notebooks so you can switch in and out of environments, launch a notebook in an environment of your choice, and even reconfigure your environment—all with a single click, right from your Jupyter notebook browser. Our step-by-step guide walks you through these integrations and other Jupyter notebooks and tutorials.

New AWS Deep Learning AMIs for Machine Learning Practitioners | AWS AI Blog

My Twitter Digest for 01/06/2018

My Twitter Digest for 01/05/2018

My Twitter Digest for 01/04/2018

My Twitter Digest for 01/03/2018