
Skater - Python Library for Model Interpretation/Explanations

  •    Python

Skater is a unified framework for model interpretation across all forms of models, intended to help you build an interpretable machine learning system, which is often needed for real-world use cases (we are actively working toward enabling faithful interpretability for all forms of models). It is an open source Python library designed to demystify the learned structures of a black-box model both globally (inference on the basis of a complete data set) and locally (inference about an individual prediction). The project started as a research idea to find ways to enable better interpretability (preferably human interpretability) of predictive "black boxes", both for researchers and practitioners. The project is still in beta.
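
As a rough illustration of the global-then-local workflow Skater targets, the sketch below computes model-agnostic feature importance for a scikit-learn classifier; the Interpretation and InMemoryModel entry points follow the library's beta-era documentation and may have shifted across releases, and the random forest and dataset are placeholders.

```python
# Sketch of a Skater-style global interpretation (beta-era API; may have changed).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from skater.core.explanations import Interpretation
from skater.model import InMemoryModel

data = load_breast_cancer()
clf = RandomForestClassifier(n_estimators=100).fit(data.data, data.target)

# Global interpretation: which features matter across the whole data set.
interpreter = Interpretation(data.data, feature_names=list(data.feature_names))
model = InMemoryModel(clf.predict_proba, examples=data.data[:100])
print(interpreter.feature_importance.feature_importance(model))
```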

Activity-Recognition-with-CNN-and-RNN - Temporal Segments LSTM and Temporal-Inception for Activity Recognition

  •    Lua

In this work, we demonstrate a strong two-stream ConvNet baseline using ResNet-101. We use this baseline to thoroughly examine the use of both RNNs and Temporal-ConvNets for extracting spatiotemporal information. Building upon our experimental results, we then propose and investigate two different networks to further integrate spatiotemporal information: 1) a temporal segment RNN and 2) an Inception-style Temporal-ConvNet. Our analysis identifies specific limitations of each method that could form the basis of future work. Our experiments on the UCF101 and HMDB51 datasets achieve state-of-the-art performance of 94.1% and 69.0%, respectively, without requiring extensive temporal augmentation.
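
The repository itself is written in Lua/Torch; purely as an illustration of the temporal-segment RNN idea, the hypothetical PyTorch sketch below runs an LSTM over per-segment ResNet-101 feature vectors (assumed to be pre-extracted) and classifies from the final hidden state. Dimensions and segment counts are assumptions, not the paper's settings.

```python
import torch
from torch import nn

class TemporalSegmentLSTM(nn.Module):
    """Toy analogue of a temporal-segment RNN: one LSTM pass over
    per-segment CNN features, classification from the last hidden state."""
    def __init__(self, feat_dim=2048, hidden=512, num_classes=101):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.fc = nn.Linear(hidden, num_classes)

    def forward(self, segment_feats):          # (batch, segments, feat_dim)
        _, (h_n, _) = self.lstm(segment_feats)
        return self.fc(h_n[-1])                # (batch, num_classes)

# e.g. 4 videos, 25 temporal segments, 2048-d ResNet-101 features per segment
logits = TemporalSegmentLSTM()(torch.randn(4, 25, 2048))
```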

StockPricePrediction - Stock Price Prediction using Machine Learning Techniques

  •    Jupyter

This project examines a number of different forecasting techniques for predicting future stock returns from past returns and numerical news indicators, in order to construct a portfolio of multiple stocks that diversifies risk. We do this by applying supervised learning methods for stock price forecasting to the seemingly chaotic market data. Download the dataset needed for running the code from here.
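
As a hedged sketch of the general idea rather than the repository's exact pipeline, the snippet below builds lagged-return features from a price series and fits an off-the-shelf supervised regressor to predict the next-day return; the synthetic prices and the choice of a random forest are placeholders.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Synthetic daily prices stand in for the downloaded dataset.
prices = pd.Series(100 * np.exp(np.cumsum(np.random.normal(0, 0.01, 500))))
returns = prices.pct_change().dropna()

# Supervised framing: past k returns -> next-day return.
k = 5
X = np.column_stack([returns.shift(i) for i in range(1, k + 1)])[k:]
y = returns.values[k:]

split = int(0.8 * len(y))
model = RandomForestRegressor(n_estimators=200).fit(X[:split], y[:split])
pred = model.predict(X[split:])
```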

understanding-pytorch-batching-lstm - Understanding and visualizing PyTorch Batching with LSTM

  •    Jupyter

This is a small notebook that I wrote to help me understand how batching is done in PyTorch with a recurrent neural network (LSTM). If you see anything wrong in this notebook, please feel free to contribute or submit an issue; I may have misunderstood, misinterpreted, or misrepresented some things here.
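
For reference, a minimal stand-alone sketch of the batching pattern the notebook explores: pad variable-length sequences, pack them so the LSTM skips the padding, then unpack the outputs. The tensor sizes here are arbitrary.

```python
import torch
from torch import nn
from torch.nn.utils.rnn import pad_sequence, pack_padded_sequence, pad_packed_sequence

# Three variable-length sequences of 10-dimensional inputs.
seqs = [torch.randn(5, 10), torch.randn(3, 10), torch.randn(7, 10)]
lengths = torch.tensor([s.size(0) for s in seqs])

padded = pad_sequence(seqs, batch_first=True)            # (3, 7, 10), zero-padded
packed = pack_padded_sequence(padded, lengths,
                              batch_first=True, enforce_sorted=False)

lstm = nn.LSTM(input_size=10, hidden_size=16, batch_first=True)
packed_out, (h_n, c_n) = lstm(packed)                    # padding steps are skipped
out, out_lengths = pad_packed_sequence(packed_out, batch_first=True)
print(out.shape, out_lengths)                            # torch.Size([3, 7, 16]) tensor([5, 3, 7])
```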

LSTM-Text-Generation - Tons of fun with text and recurrent neural networks! Let your computer read a book and tell you its own story

  •    Hy

During the time that I was writing my bachelor's thesis, Sequence-to-Sequence Learning of Financial Time Series in Algorithmic Trading (in which I used LSTM-based RNNs to model the thesis problem), I became interested in natural language processing. After reading Andrej Karpathy's blog post titled The Unreasonable Effectiveness of Recurrent Neural Networks, I decided to give text generation with LSTMs for NLP a go. Although slightly trivial, the project is still an interesting program and demo, and gives really interesting (and sometimes very funny) results. I implemented the program over the course of a weekend in Hy (a Lisp built on top of Python) using Keras and TensorFlow. You can train the model on any text sources you like. Remember to give it enough time to run for at least fifty epochs; otherwise the generated text will not be very interesting, just seemingly random garbage.
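
The repository itself is written in Hy; as a rough Python/Keras sketch of the same character-level idea (the window length, layer sizes, and corpus.txt file are placeholders, not the project's settings), the model maps a window of one-hot characters to a distribution over the next character.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

text = open("corpus.txt").read().lower()          # any text source you like
chars = sorted(set(text))
idx = {c: i for i, c in enumerate(chars)}

seq_len = 40
X = np.zeros((len(text) - seq_len, seq_len, len(chars)), dtype=np.float32)
y = np.zeros((len(text) - seq_len, len(chars)), dtype=np.float32)
for i in range(len(text) - seq_len):
    for t, c in enumerate(text[i:i + seq_len]):
        X[i, t, idx[c]] = 1.0
    y[i, idx[text[i + seq_len]]] = 1.0

model = keras.Sequential([
    layers.LSTM(128, input_shape=(seq_len, len(chars))),
    layers.Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(X, y, batch_size=128, epochs=50)        # fifty-plus epochs recommended
```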

pytorch-kaldi - pytorch-kaldi is a project for developing state-of-the-art DNN/RNN hybrid speech recognition systems

  •    Perl

pytorch-kaldi is a public repository for developing state-of-the-art DNN/RNN hybrid speech recognition systems. The DNN part is managed by pytorch, while feature extraction, label computation, and decoding are performed with the kaldi toolkit. The provided solution is designed for large-scale speech recognition experiments on both standard machines and HPC clusters.
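
As an illustration only (not the repository's actual models or configuration format), the sketch below shows the division of labour the description implies: Kaldi-side features and alignments arrive as plain tensors, and a PyTorch network maps each feature frame to posteriors over context-dependent states, while decoding stays on the Kaldi side. Feature and state dimensions are assumptions.

```python
import torch
from torch import nn

# Hypothetical shapes: 40-d acoustic feature frames, 2000 context-dependent states.
feats = torch.randn(256, 40)             # a minibatch of Kaldi-extracted frames
labels = torch.randint(0, 2000, (256,))  # Kaldi forced-alignment targets

acoustic_model = nn.Sequential(
    nn.Linear(40, 1024), nn.ReLU(),
    nn.Linear(1024, 1024), nn.ReLU(),
    nn.Linear(1024, 2000),               # state logits; decoding happens in Kaldi
)
loss = nn.CrossEntropyLoss()(acoustic_model(feats), labels)
loss.backward()
```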

midi-rnn - Generate monophonic melodies with machine learning using a basic LSTM RNN

  •    Python

Generate monophonic melodies using a basic LSTM RNN. Great for machine learning MIDI generation baselines. For more info, check out our blog post about the project. Made using Keras. First, create a folder of MIDI files that you would like to train your model with. I've included ~130 files from the Lakh MIDI Dataset inside data/midi that you can use to get started. Note that this basic RNN learns only from the monophonic tracks in MIDI files and simply ignores tracks that include polyphony.
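
Not the project's actual code, but a rough Keras sketch of the underlying setup: a monophonic track becomes a sequence of MIDI note numbers, and an LSTM is trained to predict the next note from a fixed-length window. The window size, placeholder note data, and layer sizes are assumptions.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# A monophonic melody as MIDI note numbers (placeholder data).
notes = np.random.randint(48, 84, size=2000)
window = 32
X = np.stack([notes[i:i + window] for i in range(len(notes) - window)])
y = notes[window:]

model = keras.Sequential([
    layers.Embedding(input_dim=128, output_dim=64),   # 128 possible MIDI pitches
    layers.LSTM(128),
    layers.Dense(128, activation="softmax"),          # next-note distribution
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.fit(X, y, batch_size=64, epochs=10)
```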

lstm_anomaly_thesis - Anomaly detection for temporal data using LSTMs

  •    Jupyter

This repository contains the code used in my master thesis on LSTM based anomaly detection for time series data. The thesis report can be downloaded from here. We explore the use of Long short-term memory (LSTM) for anomaly detection in temporal data. Due to the challenges in obtaining labeled anomaly datasets, an unsupervised approach is employed. We train recurrent neural networks (RNNs) with LSTM units to learn the normal time series patterns and predict future values. The resulting prediction errors are modeled to give anomaly scores. We investigate different ways of maintaining LSTM state, and the effect of using a fixed number of time steps on LSTM prediction and detection performance. LSTMs are also compared to feed-forward neural networks with fixed size time windows over inputs. Our experiments, with three real-world datasets, show that while LSTM RNNs are suitable for general purpose time series modeling and anomaly detection, maintaining LSTM state is crucial for getting desired results. Moreover, LSTMs may not be required at all for simple time series.
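
As a minimal sketch of the error-modeling step described above (not the thesis code), prediction errors from an already-trained predictor are fit with a Gaussian, and the anomaly score of a new point is its negative log-likelihood under that fit; the toy residuals and the percentile threshold are placeholders.

```python
import numpy as np
from scipy.stats import norm

# Placeholder: residuals of an LSTM one-step-ahead predictor on normal data.
train_errors = np.random.normal(0.0, 0.05, size=1000)

mu, sigma = train_errors.mean(), train_errors.std()

def anomaly_score(error):
    """Negative log-likelihood of a prediction error under the fitted Gaussian."""
    return -norm.logpdf(error, loc=mu, scale=sigma)

threshold = np.percentile(anomaly_score(train_errors), 99.5)
print(anomaly_score(0.02) > threshold)   # typical error: not flagged
print(anomaly_score(0.80) > threshold)   # large error: flagged as anomalous
```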

neural-networks - Implemented Convolutional Neural Network, LSTM Neural Network, and Neural Network From Scratch in Python Language

  •    Jupyter

In machine learning, a convolutional neural network (CNN, or ConvNet) is a class of deep, feed-forward artificial neural networks that has successfully been applied to analyzing visual imagery. CNNs use a variation of multilayer perceptrons designed to require minimal preprocessing. They are also known as shift invariant or space invariant artificial neural networks (SIANN), based on their shared-weights architecture and translation invariance characteristics.
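
For a concrete picture of the shared-weights idea the paragraph describes, here is a hedged minimal PyTorch CNN for 28x28 grayscale images (not necessarily the repository's architecture); the same small kernels slide over every spatial position, which is what gives the translation-invariance property.

```python
import torch
from torch import nn

# Minimal CNN for 28x28 grayscale images (e.g. MNIST-style input).
cnn = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),  # shared 3x3 weights slide over the image
    nn.ReLU(),
    nn.MaxPool2d(2),                             # 28x28 -> 14x14
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),                             # 14x14 -> 7x7
    nn.Flatten(),
    nn.Linear(32 * 7 * 7, 10),                   # class logits
)
logits = cnn(torch.randn(8, 1, 28, 28))          # batch of 8 images
```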

Financial-News-for-Stock-Prediction-using-DP-LSTM-NIPS-2019 - Differential Privacy-inspired LSTM for Stock Prediction Using Financial News

  •    Python

The code file 7_DP_LSTM is the main file for the DP-LSTM deep neural network. The S&P 500 stock data are in the data folder.

Quantifying-ESG-Alpha-using-Scholar-Big-Data-ICAIF-2020 - Quantifying ESG Alpha using Scholar Big Data: An Automated Machine Learning Approach

  •    Jupyter

ESG (environmental, social, and governance) factors are widely known as the three primary factors for measuring the sustainability and societal impact of an investment in a company or business. This repository proposes a quantitative approach to measuring the ESG premium in stock trading using ESG scholar data. The alternative data we use come from the Microsoft Academic Graph database, an open-resource database of publication records, including papers, journals, conferences, books, etc. It provides the demographics of the publications, such as publication date, citations, authors, and affiliated institutes. It includes ESG publication records dating back to the 1970s, long enough to study the relationship between ESG publications and companies' stock prices.
