Deep learning models rely on many parameters to work properly. As
these models become more complex, the authors of novel architectures
cannot explore in their papers the effect of varying each parameter of
their model. Therefore, this work presents an analysis of the impact of
four parameters (Early Stopping, Learning Rate, Dropout,
and Hidden 1) on the TextGCN model. The evaluation used the four
datasets considered in the original TextGCN publication, obtaining
as a side effect small improvements in the results on three of them.
The most relevant conclusion is that these parameters influence
convergence and accuracy, although individually they do not provide
strong support for improving on the model's results
reported as the state of the art.
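The study varies four hyperparameters independently. A minimal sketch of enumerating such a search space is shown below; the value ranges are illustrative assumptions for exposition, not the settings evaluated in the paper.

```python
from itertools import product

# Hypothetical value ranges for the four parameters studied
# (assumed for illustration; not the paper's actual grid).
param_grid = {
    "early_stopping_patience": [10, 20],  # epochs without val-loss improvement
    "learning_rate": [0.02, 0.002],       # gradient-descent step size
    "dropout": [0.0, 0.5],                # fraction of units dropped
    "hidden1": [100, 200],                # size of the first hidden layer
}

def configurations(grid):
    """Yield every combination of hyperparameter values as a dict."""
    keys = list(grid)
    for values in product(*(grid[k] for k in keys)):
        yield dict(zip(keys, values))

configs = list(configurations(param_grid))
print(len(configs))  # 2**4 = 16 combinations to evaluate
```

Each resulting configuration would then be used to train and evaluate one instance of the model on the chosen datasets.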
Computer on the Beach is a technical-scientific event that aims to bring together professionals, researchers, and academics in the field of Computing to discuss research and market trends across its many areas.