
Skater is a unified framework to enable model interpretation for all forms of models, helping one build an interpretable machine learning system often needed for real-world use cases (we are actively working towards enabling faithful interpretability for all forms of models). It is an open source Python library designed to demystify the learned structures of a black box model both globally (inference on the basis of a complete data set) and locally (inference about an individual prediction). The project was started as a research idea to find ways to enable better interpretability (preferably human interpretability) of predictive "black boxes", both for researchers and practitioners. The project is still in beta.
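To illustrate the kind of global interpretation Skater targets, here is a minimal sketch of permutation feature importance in plain Python. The helper name and toy model are illustrative only, not Skater's actual API: a feature matters if shuffling its column degrades the model's accuracy.

```python
import random

def permutation_importance(predict, X, y, feature, n_repeats=10, seed=0):
    """Score a feature by how much shuffling its column degrades accuracy."""
    rng = random.Random(seed)
    def accuracy(rows):
        return sum(predict(r) == t for r, t in zip(rows, y)) / len(y)
    baseline = accuracy(X)
    drops = []
    for _ in range(n_repeats):
        column = [row[feature] for row in X]
        rng.shuffle(column)
        permuted = [row[:feature] + [v] + row[feature + 1:]
                    for row, v in zip(X, column)]
        drops.append(baseline - accuracy(permuted))
    return sum(drops) / n_repeats

# Toy model: the label depends only on feature 0.
X = [[0, 1], [1, 0], [0, 0], [1, 1]] * 5
y = [row[0] for row in X]
predict = lambda row: row[0]

print(permutation_importance(predict, X, y, feature=0))  # large drop
print(permutation_importance(predict, X, y, feature=1))  # exactly 0.0
```

Shuffling feature 0 breaks the model, while feature 1 is ignored by it, so its importance is zero — the "global" view of which inputs the black box actually uses.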

ml predictive-modeling machine-learning modeling-tools model-interpretation blackbox datascience model-explanation explanation-system deep-learning deep-neural-networks attribution lstm-neural-networks cnn-classification

In this work, we demonstrate a strong baseline two-stream ConvNet using ResNet-101. We use this baseline to thoroughly examine the use of both RNNs and Temporal-ConvNets for extracting spatiotemporal information. Building upon our experimental results, we then propose and investigate two different networks to further integrate spatiotemporal information: 1) temporal segment RNN and 2) Inception-style Temporal-ConvNet. Our analysis identifies specific limitations for each method that could form the basis of future work. Our experimental results on the UCF101 and HMDB51 datasets achieve state-of-the-art performance, 94.1% and 69.0% respectively, without requiring extensive temporal augmentation.

activity-recognition video-understanding torch lstm-neural-networks convolutional-neural-networks

This project examines a number of different forecasting techniques to predict future stock returns based on past returns and numerical news indicators, constructing a portfolio of multiple stocks in order to diversify the risk. We do this by applying supervised learning methods to stock price forecasting, interpreting the seemingly chaotic market data. Download the dataset needed for running the code from here.

machine-learning supervised-learning stock-price-forecasting forecasting rnn lstm lstm-neural-networks video concept-video analysis

This is a small notebook that I wrote to help me understand how batching is done in PyTorch with a recurrent neural network (LSTM). Please, if you see anything wrong within this notebook, feel free to contribute or submit an issue; I may have misunderstood/misinterpreted/misrepresented some things here.
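The core idea such batching notebooks revolve around — padding variable-length sequences so they can be stacked into one batch tensor — can be sketched framework-free. This is a hypothetical helper, not the notebook's code; in PyTorch the lengths would then be passed to `pack_padded_sequence` so the LSTM skips the padded steps.

```python
def pad_batch(sequences, pad_value=0):
    """Pad variable-length sequences to the batch's max length.

    Returns the padded batch plus the original lengths, which the
    RNN needs in order to ignore the padded positions.
    """
    lengths = [len(s) for s in sequences]
    max_len = max(lengths)
    padded = [list(s) + [pad_value] * (max_len - len(s)) for s in sequences]
    return padded, lengths

batch, lengths = pad_batch([[1, 2, 3], [4], [5, 6]])
print(batch)    # [[1, 2, 3], [4, 0, 0], [5, 6, 0]]
print(lengths)  # [3, 1, 2]
```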

pytorch deep-learning lstm-neural-networks names-classification

During the time that I was writing my bachelor's thesis Sequence-to-Sequence Learning of Financial Time Series in Algorithmic Trading (in which I used LSTM-based RNNs for modeling the thesis problem), I became interested in natural language processing. After reading Andrej Karpathy's blog post titled The Unreasonable Effectiveness of Recurrent Neural Networks, I decided to give text generation using LSTMs for NLP a go. Although slightly trivial, the project still comprises an interesting program and demo, and gives really interesting (and sometimes very funny) results. I implemented the program over the course of a weekend in Hy (a LISP built on top of Python) using Keras and TensorFlow. You can train the model on any text sources you like. Remember to give it enough time to go over at least fifty epochs; otherwise the generated text will not be very interesting, but rather seemingly random garbage.
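A detail such character-level generators typically depend on is sampling the next character from the softmax output with a temperature. The sketch below is plain Python for illustration (not the project's Hy source): low temperatures make the output conservative, high temperatures make it more random.

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0, rng=random):
    """Rescale logits by temperature, apply softmax, draw an index."""
    scaled = [l / temperature for l in logits]
    peak = max(scaled)                        # subtract max for stability
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(probs) - 1

random.seed(0)
counts = [0, 0, 0]
for _ in range(1000):
    counts[sample_with_temperature([2.0, 1.0, 0.1], temperature=0.5)] += 1
print(counts)  # index 0 dominates at low temperature
```

With temperature 0.5 the already-largest logit wins most draws; at temperature 2.0 the three counts would be much closer together.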

lstm lstm-neural-networks rnn tensorflow tensorflow-experiments keras text-generation natural-language-processing nlp-machine-learning machine-learning lisp hylang keras-neural-networks artificial-intelligence artificial-neural-networks recurrent-neural-networks

This is my bachelor's thesis, written over the course of two months during my final year of studies for my Bachelor of Science in Computer Science degree. The thesis was co-authored by my good friend Tobias Ånhed. Click here for the revised edition on DiVA.

lstm-neural-networks research-paper bachelor-thesis sequence-to-sequence machine-learning finance trading forex algorithmic-trading recurrent-neural-networks forex-trading technical-analysis technical-indicators artificial-neural-networks keras time-series-analysis financial-analysis white-paper publication trading-algorithms

pytorch-kaldi is a public repository for developing state-of-the-art DNN/RNN hybrid speech recognition systems. The DNN part is managed by PyTorch, while feature extraction, label computation, and decoding are performed with the Kaldi toolkit. The provided solution is designed for large-scale speech recognition experiments on both standard machines and HPC clusters.

speech-recognition gru dnn kaldi rnn-model pytorch timit deep-learning deep-neural-networks recurrent-neural-networks multilayer-perceptron-network lstm lstm-neural-networks speech asr rnn dnn-hmm

Generate monophonic melodies using a basic LSTM RNN. Great for machine learning MIDI generation baselines. For more info, check out our blog post about the project. Made using Keras. First create a folder of MIDI files that you would like to train your model with. I've included ~130 files from the Lakh MIDI Dataset inside data/midi that you can use to get started. Note that this basic RNN learns only from the monophonic tracks in MIDI files and simply ignores tracks that include polyphony.
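The monophonic-track filter described above amounts to checking that no two notes in a track sound at the same time. A hypothetical sketch (not the project's code), representing notes as (start, end, pitch) tuples with exclusive end times:

```python
def is_monophonic(notes):
    """True if no two notes overlap in time.

    notes: iterable of (start, end, pitch) tuples, end exclusive.
    """
    ordered = sorted(notes)                  # sort by start time
    for (_, end, _), (next_start, _, _) in zip(ordered, ordered[1:]):
        if next_start < end:                 # next note begins before this one ends
            return False
    return True

melody = [(0, 1, 60), (1, 2, 62), (2, 4, 64)]   # sequential notes
chords = [(0, 2, 60), (0, 2, 64), (0, 2, 67)]   # a C-major triad
print(is_monophonic(melody))  # True
print(is_monophonic(chords))  # False
```

A track failing this check would simply be skipped during training data extraction.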

computer-music machine-learning ml4a neural-network lstm-neural-networks rnn keras midi

Welcome to my GitHub repo. I am a Data Scientist and I code in R, Python and Wolfram Mathematica. Here you will find some Machine Learning, Deep Learning, Natural Language Processing and Artificial Intelligence models I developed.

r python3 python-3 mathematica lasagne theano theano-models autoencoder face-recognition natural-language-processing nlp nlp-machine-learning deep-learning keras lstm lstm-neural-networks timeseries time-series-analysis word2vec

This repository contains the code used in my master's thesis on LSTM-based anomaly detection for time series data. The thesis report can be downloaded from here. We explore the use of Long Short-Term Memory (LSTM) for anomaly detection in temporal data. Due to the challenges in obtaining labeled anomaly datasets, an unsupervised approach is employed. We train recurrent neural networks (RNNs) with LSTM units to learn the normal time series patterns and predict future values. The resulting prediction errors are modeled to give anomaly scores. We investigate different ways of maintaining LSTM state, and the effect of using a fixed number of time steps on LSTM prediction and detection performance. LSTMs are also compared to feed-forward neural networks with fixed-size time windows over inputs. Our experiments, with three real-world datasets, show that while LSTM RNNs are suitable for general-purpose time series modeling and anomaly detection, maintaining LSTM state is crucial for getting the desired results. Moreover, LSTMs may not be required at all for simple time series.
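The "prediction errors are modeled to give anomaly scores" step can be sketched with a common choice: fit a Gaussian to the errors the trained predictor makes on normal data, then score new errors by negative log-likelihood. This is an illustrative sketch under that assumption, not the thesis code (which may use a different error model):

```python
import math

def fit_error_model(errors):
    """Fit a univariate Gaussian to prediction errors from normal data."""
    n = len(errors)
    mean = sum(errors) / n
    var = sum((e - mean) ** 2 for e in errors) / n
    return mean, var

def anomaly_score(error, mean, var):
    """Negative Gaussian log-likelihood of an error: larger = more anomalous."""
    return 0.5 * math.log(2 * math.pi * var) + (error - mean) ** 2 / (2 * var)

# Errors on normal data are small; a large prediction error scores high.
mean, var = fit_error_model([0.1, -0.2, 0.05, 0.0, -0.1, 0.15])
print(anomaly_score(0.1, mean, var) < anomaly_score(5.0, mean, var))  # True
```

Flagging an anomaly then reduces to thresholding the score, with the threshold tuned on validation data.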

lstm anomaly-detection bayesian-optimization time-series recurrent-neural-networks deep-learning lstm-neural-networks neural-networks

Bitcoin price prediction (time series) using an LSTM recurrent neural network.
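Such time-series forecasters are usually trained on sliding windows: each input is a short window of past prices and the target is the next price. A minimal framework-free sketch of that preprocessing step (the helper name is hypothetical, not this project's code):

```python
def make_windows(series, window):
    """Turn a univariate series into (input window, next value) pairs,
    the supervised form an LSTM forecaster is trained on."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    return X, y

prices = [100, 101, 103, 102, 105, 107]
X, y = make_windows(prices, window=3)
print(X)  # [[100, 101, 103], [101, 103, 102], [103, 102, 105]]
print(y)  # [102, 105, 107]
```

In a real pipeline the series would also be scaled (e.g. min-max normalized) before windowing, since LSTMs train poorly on raw price magnitudes.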

lstm-neural-networks time-series-analysis bitcoin-price-prediction recurrent-neural-networks deep-neural-networks deep-learning-tutorial deep-learning series lstm rnn keras tensorflow

Generate song lyrics using an LSTM recurrent neural network.

lstm-neural-networks recurrent-neural-networks song-lyrics-generator keras tensorflow deep-learning machine-learning deep-learning-tutorial

In machine learning, a convolutional neural network (CNN, or ConvNet) is a class of deep, feed-forward artificial neural networks that has successfully been applied to analyzing visual imagery. CNNs use a variation of multilayer perceptrons designed to require minimal preprocessing. They are also known as shift invariant or space invariant artificial neural networks (SIANN), based on their shared-weights architecture and translation invariance characteristics.
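The shared-weights idea behind that description can be shown with a tiny valid convolution written from scratch in plain Python (an illustrative sketch, not this repository's notebook code): one small kernel slides over the whole image, so the same weights detect the same pattern at every position — the source of translation invariance.

```python
def conv2d(image, kernel):
    """Valid 2-D convolution (strictly, cross-correlation, as in most
    deep-learning libraries): one shared kernel slides over the image."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

image = [[1, 2, 0],
         [0, 1, 3],
         [4, 0, 1]]
edge = [[1, -1]]                  # 1x2 horizontal-difference kernel
print(conv2d(image, edge))        # [[-1, 2], [-1, -2], [4, -1]]
```

Stacking such layers with nonlinearities and pooling gives the CNNs used for the handwritten-digit and Iris experiments the tags mention.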

neural-networks convolutional-neural-networks scratch jupyter-notebook cnn handwritten-digit-recognition iris-dataset convolutional-layers derivatives forward-propagation backward-propagation numpy matplotlib lstm-neural-networks rnn backward-propagation-through-time lstm-cells lstm-networks deep-learning

You have just found TensorLayer! A high-performance DL and RL library for industry and academia. Contributions welcome! Read the contribution guidelines first.

tensorlayer tensorflow tensorflow-tutorials natural-language-processing reinforcement-learning adversarial-learning autoencoder database computer-vision keras tf-slim tflearn horovod lstm-neural-networks convolutional-neural-networks recurrent-neural-networks segmentation cifar-10 mnist generative-adversarial-network

The code file 7_DP_LSTM is the main file for the DP-LSTM deep neural network. The S&P 500 stock data are in the data folder.

nlp-machine-learning lstm-neural-networks financial-news

ESG (environmental, social, governance) factors are widely known as the three primary factors in measuring the sustainability and societal impact of an investment in a company or business. This repository proposes a quantitative approach to measuring the ESG premium in stock trading using ESG scholar data. The alternative data we use is from the Microsoft Academic Graph database, an open resource database with records of publications, including papers, journals, conferences, books, etc. It provides the demographics of the publications, such as publication date, citations, authors and affiliated institutes. It includes ESG publication records dating back to the 1970s - long enough to study the relationship between ESG publications and companies' stock prices.

machine-learning lstm-neural-networks esg