Convolutional with attention gated recurrent network for sentiment analysis
Abstract
In recent years, deep learning approaches such as convolutional neural networks (CNNs) and recurrent neural networks (RNNs) have received much attention in natural language processing tasks, especially sentiment analysis, and have achieved significant results. However, these approaches individually fail to fully accomplish the task of sentiment analysis. In sentiment analysis, the likelihood of a given word is estimated based on long-term dependencies and on local contextual features that depend on a word and its neighboring words. This paper proposes a Convolutional with Attention Gated Recurrent Network (CAGRN) model that performs sentiment analysis by extracting these features. The model first applies a CNN layer to extract local contextual features. Afterward, the CAGRN uses a bidirectional gated recurrent unit (Bi-GRU) layer to encode long-term dependency features. In addition, an attention mechanism is applied to help the model select the words that carry sentiment information. Using these learned features, the CAGRN performs better in sentiment analysis. Our approach achieves competitive results on two real datasets, IMDb and SSTb, compared with baseline models, while requiring fewer parameters. In future work, we will carry out ablation experiments on the model's components.
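The three-stage pipeline the abstract describes (a CNN layer for local contextual features, a Bi-GRU for long-term dependencies, and an attention mechanism that weights sentiment-bearing positions) can be sketched in plain NumPy. This is a minimal illustrative forward pass with random weights, not the paper's actual CAGRN configuration: the layer sizes, the simplified GRU cell (biases omitted), and the single-vector attention scoring are all assumptions for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def conv1d(x, W, b):
    # x: (T, d_in), W: (k, d_in, d_out) — "valid" 1-D convolution over
    # the word dimension, followed by ReLU; extracts local n-gram features.
    k = W.shape[0]
    T_out = x.shape[0] - k + 1
    out = np.stack([np.einsum('kd,kdo->o', x[t:t + k], W) + b
                    for t in range(T_out)])
    return np.maximum(out, 0.0)

def gru_step(h, x, Wz, Wr, Wh):
    # Simplified GRU cell (biases omitted for brevity).
    hx = np.concatenate([h, x])
    z = sigmoid(hx @ Wz)                                  # update gate
    r = sigmoid(hx @ Wr)                                  # reset gate
    h_tilde = np.tanh(np.concatenate([r * h, x]) @ Wh)    # candidate state
    return (1.0 - z) * h + z * h_tilde

def bigru(xs, params_f, params_b):
    # Run a GRU forward and backward over the sequence and concatenate
    # the two hidden states at each timestep.
    d_h = params_f[0].shape[1]
    hf, hb = np.zeros(d_h), np.zeros(d_h)
    fwd, bwd = [], []
    for x in xs:
        hf = gru_step(hf, x, *params_f)
        fwd.append(hf)
    for x in xs[::-1]:
        hb = gru_step(hb, x, *params_b)
        bwd.append(hb)
    return np.concatenate([np.stack(fwd), np.stack(bwd[::-1])], axis=1)

def attention(H, v):
    # Score each timestep against a learned query vector v, softmax the
    # scores, and return the attention-weighted sum of hidden states.
    scores = H @ v
    a = np.exp(scores - scores.max())
    a /= a.sum()
    return a @ H

# Illustrative sizes (assumptions): 12 tokens, 16-dim embeddings,
# kernel width 3, 24 conv filters, 10 hidden units per GRU direction.
T, d_emb, k, d_conv, d_hid = 12, 16, 3, 24, 10
emb = rng.normal(size=(T, d_emb))                 # stand-in word embeddings
Wc = rng.normal(size=(k, d_emb, d_conv)) * 0.1
bc = np.zeros(d_conv)

def gru_params(d_in, d_h):
    return [rng.normal(size=(d_h + d_in, d_h)) * 0.1 for _ in range(3)]

pf, pb = gru_params(d_conv, d_hid), gru_params(d_conv, d_hid)
v = rng.normal(size=2 * d_hid)                    # attention query vector
Wo = rng.normal(size=(2 * d_hid, 2)) * 0.1        # binary sentiment logits

feats = conv1d(emb, Wc, bc)      # local contextual features: (10, 24)
H = bigru(feats, pf, pb)         # long-term dependencies:    (10, 20)
doc = attention(H, v)            # sentence representation:   (20,)
logits = doc @ Wo                # sentiment logits:          (2,)
print(logits.shape)
```

In a trained model all of these weight matrices would be learned jointly; here random weights simply demonstrate how the three components compose and how the shapes flow from tokens to a single sentiment prediction.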