Our Favorite Posts Of Last Week (Jan 06, 2019)
How to code The Transformer in PyTorch
Could the Transformer be another nail in the coffin for RNNs? Doing away with clunky for loops, it finds a way to let whole sentences enter the network simultaneously in batches. The miracle is that natural language processing (NLP) can then reclaim the advantage of Python’s highly efficient linear algebra libraries.
Word Count: 2925
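The batching idea the blurb describes can be sketched with the scaled dot-product attention at the Transformer's core: every sentence in a batch flows through a pair of matrix multiplies at once, with no per-token loop. A minimal numpy sketch (illustrative only, not code from the linked article; shapes and names are assumptions):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Batched attention: every sentence in the batch is handled by
    two matrix multiplies -- no sequential for loop over tokens."""
    d_k = q.shape[-1]
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)    # (batch, seq, seq)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)      # softmax over keys
    return weights @ v                                  # (batch, seq, d_k)

# A hypothetical batch of 4 "sentences", 10 tokens each, embedding size 8.
x = np.random.rand(4, 10, 8)
out = scaled_dot_product_attention(x, x, x)             # self-attention
print(out.shape)  # (4, 10, 8)
```

Because the batch dimension rides along through the matrix multiplies, the highly optimized BLAS routines behind numpy (or PyTorch) do the work that an RNN would spread across a Python-level loop.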
100+ Google Sheets Templates
Looking to up your marketing game? Grab copies of over 100 Google Sheets marketing templates & automation tools. So you work in marketing. Regardless of your position, you are going to spend a good chunk of your time in spreadsheets.
Word Count: 221
Google’s Mueller spent New Year’s helping SEOs tackle hacked content, slow website issues
John Mueller, a Google webmaster trends analyst, spent a portion of his New Year’s break responding to concerns and questions around Google search-related issues. Here’s what kept him occupied: a slow server leads to crawling and indexing issues.
Word Count: 568
Time Series Prediction Using LSTM Deep Neural Networks
This article focuses on using a deep LSTM neural network architecture to provide multidimensional time series forecasting using Keras and TensorFlow, applied specifically to stock market datasets to produce momentum indicators of stock price.
Word Count: 3955
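The "multidimensional" part of that setup mostly comes down to data shape: Keras LSTM layers expect a (samples, timesteps, features) tensor, built by sliding a lookback window over the multivariate series. A small sketch of that windowing step (names, the lookback of 10, and the choice of the first feature as the target are assumptions, not details from the article):

```python
import numpy as np

def make_windows(series, lookback):
    """Slice a multivariate series of shape (timesteps, features) into
    the (samples, lookback, features) tensor an LSTM layer expects,
    predicting the next step's first feature (e.g. closing price)."""
    X, y = [], []
    for i in range(len(series) - lookback):
        X.append(series[i:i + lookback])    # one lookback window
        y.append(series[i + lookback, 0])   # value right after the window
    return np.array(X), np.array(y)

# Hypothetical 100 timesteps of 3 features (e.g. close, volume, momentum).
data = np.random.rand(100, 3)
X, y = make_windows(data, lookback=10)
print(X.shape, y.shape)  # (90, 10, 3) (90,)
```

An `X` shaped like this could then be fed to something like `keras.layers.LSTM(units)` followed by a dense output, which is the general pattern the article builds on.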
How to Display Steppers on Mobile Forms
When your form has multiple pages, a stepper is a must. Steppers keep users informed about their progress by indicating what step they’re on and how many steps are left. Displaying steppers on mobile forms is challenging due to limited space.
Word Count: 329
What Facebook knows about you
On Facebook's map of humanity, the node for "you" often includes vast awareness of your movements online and a surprising amount of info about what you do offline, too. The big picture: Even when you're cautious about sharing, Facebook's dossier on you will be hefty.
Word Count: 957
[Notes] Neural Language Model with PyTorch
I was reading this paper titled “Character-Level Language Modeling with Deeper Self-Attention” by Al-Rfou et al., which describes some ways to use Transformer self-attention models to solve the language modeling problem.
Word Count: 1488
Introducing state of the art text classification with universal language models
This post is a lay-person’s introduction to our new paper, which shows how to classify documents automatically with both higher accuracy and lower data requirements than previous approaches.
Link: http://nlp.fast.ai/classification/2018/05/15/introducting-ulmfit.html
Word Count: 1834