RUO: Repositorio Institucional de la Universidad de Oviedo (Institutional Repository of the University of Oviedo)

Data from "An empirical evaluation of Lex/Yacc and ANTLR parser generation tools"

Author:
Ortín Soler, Francisco; Quiroga Álvarez, José; Rodríguez Prieto, Óscar; García Rodríguez, Miguel
Subject:
Parser generation; Compiler construction; Parser; Lexer; ANTLR; Lex; Yacc
Publication date:
2021-10-25
Abstract:

Parsers are used in different software development scenarios such as compiler construction, data format processing, machine-level translation, and natural language processing. Due to the widespread usage of parsers, there exist different tools aimed at automating their generation. Two of the most common parser generation tools are the classic Lex/Yacc and ANTLR. Even though ANTLR provides more advanced features, Lex/Yacc is still the preferred choice in many university courses. There exist different qualitative comparisons of the features provided by both approaches, but no study evaluates empirical features such as language implementor productivity and tool simplicity, intuitiveness, and maintainability. In this article, we present such an empirical study by conducting an experiment with undergraduate students of a Software Engineering degree. Two random groups of students implement the same language using a different parser generator, and we statistically compare their performance with different measures. In the context of the academic study conducted, ANTLR has shown significant differences for most of the empirical features measured.

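A minimal sketch of the kind of between-groups comparison the abstract describes (two randomly assigned groups, one empirical measure, a non-parametric test) is given below in Python. The group sizes, the measure, and the values are illustrative placeholders, not figures taken from this dataset, and the published study may rely on different statistical tests.

    # Hypothetical sketch of a between-groups comparison like the one the
    # abstract describes: two randomly assigned groups (Lex/Yacc vs. ANTLR)
    # are compared on one empirical measure. All values are placeholders,
    # not data from this dataset.
    from scipy import stats

    # Placeholder scores for one measure (e.g., minutes to complete a task).
    lex_yacc_group = [95, 110, 102, 120, 98, 115]
    antlr_group = [80, 85, 92, 78, 88, 90]

    # A Mann-Whitney U test is a common choice when normality of the measure
    # cannot be assumed; the article may use other tests.
    u_stat, p_value = stats.mannwhitneyu(lex_yacc_group, antlr_group,
                                         alternative="two-sided")
    print(f"U = {u_stat}, p = {p_value:.4f}")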

Description:

Data from the article "F. Ortin, J. Quiroga, O. Rodriguez-Prieto, M. Garcia. An empirical evaluation of Lex/Yacc and ANTLR parser generation tools. PLOS ONE 17(3), pp. 1-16, 2022. https://doi.org/10.1371/journal.pone.0264326"

URI:
https://hdl.handle.net/10651/70833
DOI:
10.17811/ruo_datasets.70833
Related resource:
http://hdl.handle.net/10651/65175
Funding:

This work has been partially funded by the Spanish Department of Science, Innovation, and Universities: project RTI2018-099235-B-I00. The authors have also received funds from the University of Oviedo through its support of official research groups (GR-2011-0040).

Collections
  • Research data [70]
  • Computer Science [872]
  • OpenAIRE Research and Documents [8365]
Files in this item
  • Dataset (26.52 KB)
  • Readme (3.128 KB)
Except where otherwise noted, the content of the Repository is protected by a Creative Commons license: Attribution-NonCommercial-NoDerivatives 4.0 International.