-
Long short-term memory. Seq2Seq in machine translation.
Tags:
RNN |tricks The long short-term memory architecture for working with sequences, the sequence-to-sequence model (e.g. for the machine translation task), and a trick for faster decoder training.
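A minimal sketch of these ideas in PyTorch; the decoder-training trick is presumably teacher forcing (feeding ground-truth tokens instead of the decoder's own step-by-step predictions). All names and sizes are illustrative, not the note's actual code.

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    """LSTM encoder-decoder sketch; sizes are illustrative."""
    def __init__(self, src_vocab, tgt_vocab, emb=128, hidden=256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.LSTM(emb, hidden, batch_first=True)
        self.decoder = nn.LSTM(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src, tgt):
        # Encode the source; keep only the final (h, c) state as context.
        _, state = self.encoder(self.src_emb(src))
        # Teacher forcing: feed the ground-truth target tokens (shifted
        # right) in one parallel pass, instead of decoding step by step.
        dec_out, _ = self.decoder(self.tgt_emb(tgt), state)
        return self.out(dec_out)  # logits over the target vocabulary

model = Seq2Seq(src_vocab=1000, tgt_vocab=1000)
src = torch.randint(0, 1000, (4, 12))  # batch of source sequences
tgt = torch.randint(0, 1000, (4, 10))  # shifted target sequences
logits = model(src, tgt)               # (4, 10, 1000)
```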
-
Word2Vec. CBOW and Skip-gram.
Tags:
NLP |embedding A method for obtaining simple yet interpretable context-based word embeddings, on which one can perform arithmetic operations and measure distances between words.
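A minimal sketch of skip-gram with negative sampling in PyTorch, to show where such embeddings come from; the vocabulary size and negative-sample count are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SkipGram(nn.Module):
    """Skip-gram with negative sampling; sizes are illustrative."""
    def __init__(self, vocab, dim=64):
        super().__init__()
        self.center = nn.Embedding(vocab, dim)   # "input" word vectors
        self.context = nn.Embedding(vocab, dim)  # "output" word vectors

    def forward(self, center, pos_ctx, neg_ctx):
        c = self.center(center)      # (B, D)
        pos = self.context(pos_ctx)  # (B, D)
        neg = self.context(neg_ctx)  # (B, K, D)
        # Pull the true context word toward the center word...
        pos_loss = F.logsigmoid((c * pos).sum(-1))
        # ...and push K randomly sampled words away from it.
        neg_loss = F.logsigmoid(-(neg @ c.unsqueeze(-1)).squeeze(-1)).sum(-1)
        return -(pos_loss + neg_loss).mean()

model = SkipGram(vocab=5000)
loss = model(torch.randint(0, 5000, (32,)),
             torch.randint(0, 5000, (32,)),
             torch.randint(0, 5000, (32, 5)))
loss.backward()
```

After training, the `center` embeddings support the arithmetic mentioned above, e.g. finding the nearest neighbor of king - man + woman under cosine similarity.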
-
U-Net. A couple of tricks for image segmentation.
Tags:
CV |segmentation |tricks An FCN-like network for image segmentation, plus a few tricks that can improve segmentation and other CV solutions.
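A one-level U-Net sketch in PyTorch showing the core pattern: an encoder path, a bottleneck, and a decoder path that concatenates encoder features through a skip connection. Channel counts are illustrative.

```python
import torch
import torch.nn as nn

def double_conv(cin, cout):
    # Two 3x3 convolutions, the basic U-Net building block.
    return nn.Sequential(
        nn.Conv2d(cin, cout, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(cout, cout, 3, padding=1), nn.ReLU(inplace=True))

class TinyUNet(nn.Module):
    """One-level U-Net: down, bottleneck, up with skip concatenation."""
    def __init__(self, classes=2):
        super().__init__()
        self.enc = double_conv(3, 32)
        self.down = nn.MaxPool2d(2)
        self.mid = double_conv(32, 64)
        self.up = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec = double_conv(64, 32)  # 64 = 32 (skip) + 32 (upsampled)
        self.head = nn.Conv2d(32, classes, 1)

    def forward(self, x):
        e = self.enc(x)             # encoder features at full resolution
        m = self.mid(self.down(e))  # bottleneck at half resolution
        u = self.up(m)              # learned upsampling back to full size
        # Skip connection: concatenating encoder features lets the
        # decoder recover fine spatial detail lost by pooling.
        return self.head(self.dec(torch.cat([e, u], dim=1)))

mask_logits = TinyUNet()(torch.randn(1, 3, 64, 64))  # (1, 2, 64, 64)
```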
-
Models based on fully convolutional networks.
Tags:
CV |segmentation A summary of the FCN paper, in which I introduce segmentation and explain the fully convolutional networks needed for segmentation tasks.
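A sketch of the central FCN idea, assuming PyTorch: replace the classifier's fully connected head with 1x1 convolutions, then upsample the coarse score map back to input resolution. The stand-in backbone and class count are illustrative.

```python
import torch
import torch.nn as nn

class TinyFCN(nn.Module):
    """Classifier turned dense predictor: 1x1 convs replace the fully
    connected head, then upsampling restores the input resolution."""
    def __init__(self, classes=21):
        super().__init__()
        self.features = nn.Sequential(  # stand-in backbone, stride 4
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True))
        # A 1x1 convolution acts like a per-pixel fully connected layer,
        # so the network accepts any input size and outputs a score map.
        self.score = nn.Conv2d(64, classes, 1)
        # Learned upsampling back to the input resolution.
        self.upsample = nn.ConvTranspose2d(classes, classes, 4, stride=4)

    def forward(self, x):
        return self.upsample(self.score(self.features(x)))

out = TinyFCN()(torch.randn(1, 3, 64, 64))  # (1, 21, 64, 64)
```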
-
Improving ResNet. ResNeXt and SE-ResNeXt architectures.
Tags:
CV |feature extractor Powerful new blocks and techniques for ResNet that improve the score and reduce the parameter count. A new type of convolution.
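The new convolution type here is the grouped convolution used by ResNeXt; SE-ResNeXt adds a squeeze-and-excitation channel gate on top. A sketch in PyTorch, with BatchNorm omitted for brevity and all widths illustrative:

```python
import torch
import torch.nn as nn

class SEResNeXtBlock(nn.Module):
    """Bottleneck with a grouped 3x3 conv (ResNeXt) and a
    squeeze-and-excitation channel gate (SE-ResNeXt)."""
    def __init__(self, channels=256, width=128, groups=32, reduction=16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, width, 1), nn.ReLU(inplace=True),
            # groups=32 splits the conv into 32 parallel paths
            # ("cardinality"), cutting parameters vs a dense 3x3 conv.
            nn.Conv2d(width, width, 3, padding=1, groups=groups),
            nn.ReLU(inplace=True),
            nn.Conv2d(width, channels, 1))
        self.se = nn.Sequential(  # squeeze-and-excitation gate
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid())

    def forward(self, x):
        y = self.body(x)
        y = y * self.se(y)        # reweight channels by learned importance
        return torch.relu(x + y)  # residual connection as in ResNet

out = SEResNeXtBlock()(torch.randn(1, 256, 14, 14))  # same shape out
```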
-
ResNet architecture. Residual blocks.
Tags:
CV |feature extractor This is the pilot note. A simple but decent image feature extraction network with a low parameter count that realizes the powerful idea of skip connections.
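A minimal sketch of the basic residual block in PyTorch; the channel count is illustrative.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Basic ResNet block: the conv layers learn a residual F(x),
    and the skip connection adds the input back: out = F(x) + x."""
    def __init__(self, channels=64):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        y = self.relu(self.bn1(self.conv1(x)))
        y = self.bn2(self.conv2(y))
        # Skip connection: gradients flow through the identity path,
        # which lets much deeper networks train without degradation.
        return self.relu(x + y)

out = ResidualBlock()(torch.randn(1, 64, 32, 32))  # shape preserved
```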