Implementation of deep models with arbitrary differentiable transformations

Abstract

Deep learning models develop day by day; just as people learn continually, computers increasingly simulate the human brain. Recurrent neural networks learn from sequential data in which order matters, and make predictions based on it. The basic model is the plain recurrent neural network (RNN), which through its cells passes information about the position and values of all earlier data on to future cells. This gives rise to the problem of exploding and vanishing gradients, so the plain RNN is not often used in practice.
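To make the recurrence concrete, here is a minimal sketch of one plain RNN step in pure Python. The scalar weights (`w_x`, `w_h`, `b`) and the input sequence are illustrative assumptions, not values from the thesis; a real implementation would use weight matrices.

```python
import math

def rnn_step(x_t, h_prev, w_x, w_h, b):
    """One step of a plain (Elman) RNN: the new hidden state mixes the
    current input with the previous hidden state through a tanh."""
    return math.tanh(w_x * x_t + w_h * h_prev + b)

# Unroll over a short sequence. The repeated multiplication by w_h across
# steps is exactly what makes gradients explode (|w_h| > 1) or vanish
# (|w_h| < 1) during backpropagation through time.
h = 0.0
for x in [0.5, -1.0, 0.25]:
    h = rnn_step(x, h, w_x=0.8, w_h=0.5, b=0.0)
```

The tanh keeps every hidden state in (-1, 1), but it does not prevent the gradient from shrinking or growing geometrically with sequence length, which is the limitation the LSTM addresses.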
The long short-term memory cell (LSTM) is an upgrade of the RNN model that adds long-term memory: each cell filters the incoming information and keeps only what is essential. When implementing one of these cells yourself, you can build hand-made cells whose properties differ sufficiently from the usual well-known ones. Cells can easily be written in Python using PyTorch, but if we want to optimize parts of the code, we can write extensions by hand in C++, connect them to Python with little effort, and use them like native methods of Python libraries. However, Python libraries are extremely well optimized, so writing such extensions makes sense only when a large amount of work is involved and when we can write high-quality, optimized code.
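The filtering described above is done by gates. The following is a minimal scalar sketch of one LSTM step in pure Python, assuming arbitrary illustrative weights (all set to 0.5 here); it is not the thesis implementation, which would use PyTorch tensors or a C++ extension.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x_t, h_prev, c_prev, p):
    """One scalar LSTM step: gates decide what to forget, what to write,
    and what to expose, letting the cell carry information long-term."""
    f = sigmoid(p["wf_x"] * x_t + p["wf_h"] * h_prev + p["bf"])    # forget gate
    i = sigmoid(p["wi_x"] * x_t + p["wi_h"] * h_prev + p["bi"])    # input gate
    g = math.tanh(p["wg_x"] * x_t + p["wg_h"] * h_prev + p["bg"])  # candidate value
    o = sigmoid(p["wo_x"] * x_t + p["wo_h"] * h_prev + p["bo"])    # output gate
    c = f * c_prev + i * g   # cell state: filtered old memory + gated new input
    h = o * math.tanh(c)     # hidden state passed to the next cell
    return h, c

# Illustrative parameters; in practice these are learned.
params = {k: 0.5 for k in ["wf_x", "wf_h", "bf", "wi_x", "wi_h", "bi",
                           "wg_x", "wg_h", "bg", "wo_x", "wo_h", "bo"]}
h, c = 0.0, 0.0
for x in [1.0, -0.5, 0.2]:
    h, c = lstm_step(x, h, c, params)
```

Because the cell state `c` is updated additively (scaled by the forget gate) rather than squashed through a nonlinearity at every step, gradients along it decay far more slowly than in the plain RNN, which is what gives the LSTM its long-term memory.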
