
< KAIST Professor Hyun Wook Ka >
KAIST announced on the 13th that a research team led by Professor Hyun Wook Ka of the Assistive AI Lab in the Department of Transdisciplinary Studies has developed 'K-Braille,' a next-generation Braille translation engine that advances the conversion of standard text (mukja) into Braille readable by the visually impaired, and has completed large-scale performance verification.

Braille translation is the process of converting information written in standard text, such as books, documents, and web pages, into the appropriate Braille system; it is an essential technology for information accessibility for the visually impaired. However, Korean Braille regulations include numerous exception rules for spacing, symbols, and foreign-language notation, which makes accurate automatic translation difficult. The Braille programs currently used by the visually impaired convert characters or symbols according to simple rules, so they often make errors on complex cases such as sentences mixing Korean with other languages (e.g., English), compound unit symbols, or spacing within parentheses. Because an error in a single Braille cell can go beyond a simple typo and distort information for the visually impaired, the need for accurate Braille translation technology has long been raised.

The most significant feature of the K-Braille engine is that it is a 'system that understands sentences.' Whereas existing Braille programs use a substitution method that simply swaps characters or symbols, K-Braille analyzes the structure and context of a sentence through morphological analysis and Abstract Syntax Tree (AST) analysis, understanding its meaning before converting it into Braille.
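The contrast between simple substitution and context-aware translation can be illustrated with a toy sketch. The mappings below use real standard Braille conventions for digits (the number sign ⠼ prefixes a run of digits once, not each digit), but everything else is a deliberately simplified illustration, not the K-Braille implementation or actual Korean Braille regulations.

```python
import re

# Illustrative toy mapping only: Braille digits 1-3 and a naive per-character
# table that (wrongly) attaches the number sign to every single digit.
DIGITS = {"1": "\u2801", "2": "\u2803", "3": "\u2809"}          # ⠁ ⠃ ⠉
NAIVE_MAP = {d: "\u283c" + cell for d, cell in DIGITS.items()}  # ⠼ + digit

def naive_translate(text: str) -> str:
    """Character-by-character substitution; ignores all context."""
    return "".join(NAIVE_MAP.get(ch, ch) for ch in text)

def contextual_translate(text: str) -> str:
    """Tokenize first, then apply rules per token: one number sign per
    digit run, as real Braille regulations require."""
    out = []
    for tok in re.findall(r"\d+|\D+", text):
        if tok[0].isdigit():
            out.append("\u283c" + "".join(DIGITS[d] for d in tok))
        else:
            out.append(tok)  # non-digit text passed through in this toy
    return "".join(out)

print(naive_translate("12"))       # ⠼⠁⠼⠃  (number sign wrongly repeated)
print(contextual_translate("12"))  # ⠼⠁⠃  (number sign emitted once)
```

A sentence-level engine generalizes this idea: instead of a digit-run tokenizer, a morphological analyzer and AST supply the context that decides which rule applies.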
Through this, it can more accurately handle the various exceptional cases in the revised Braille regulations, such as sentences mixing foreign languages with Korean, complex symbol combinations, and unit notation.

To verify the accuracy of the technology, the research team used the 'NLPAK (Standard Text-Braille Parallel Corpus),' the largest Braille dataset in Korea, built by the National Institute of Korean Language. The corpus contains paired standard-text and Braille sentences; the team extracted 17,943 sentence pairs from it and evaluated how closely K-Braille's output matched the reference Braille. 'True Adjusted Accuracy,' which measures how faithfully the Braille regulations are actually followed, reached 100.0%, and the morphological structure similarity of the Braille sentences, which measures how closely their structure matches the reference, averaged 99.81%, confirming high translation accuracy. In a comparative evaluation on the same sentence set against 'Jumsarang 6.3.5.8,' the National Institute of Korean Language's official Braille program, K-Braille showed a higher translation match rate, confirming its technical competitiveness.
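The evaluation described above, exact matching against a parallel corpus plus a softer structure-similarity score, can be sketched as follows. The exact definitions of 'True Adjusted Accuracy' and the morphological similarity metric are not given in the article, so this sketch substitutes a plain exact-match rate and a character-level `difflib` ratio as illustrative stand-ins; the data is a toy example, not NLPAK.

```python
from difflib import SequenceMatcher

def exact_match_accuracy(refs: list[str], hyps: list[str]) -> float:
    """Fraction of sentences whose translation equals the reference exactly."""
    assert len(refs) == len(hyps)
    return sum(r == h for r, h in zip(refs, hyps)) / len(refs)

def mean_similarity(refs: list[str], hyps: list[str]) -> float:
    """Average character-level similarity ratio (0.0-1.0): a soft score
    that still gives partial credit to near-miss translations."""
    return sum(SequenceMatcher(None, r, h).ratio()
               for r, h in zip(refs, hyps)) / len(refs)

# Toy reference/hypothesis pairs; the last hypothesis drops a space,
# the kind of single-cell error the article warns about.
refs = ["\u2801\u2803\u2809", "\u283c\u2801\u2803", "\u2801 \u2803"]
hyps = ["\u2801\u2803\u2809", "\u283c\u2801\u2803", "\u2801\u2803"]

print(exact_match_accuracy(refs, hyps))  # 2/3, one sentence missed
print(mean_similarity(refs, hyps))       # 0.933..., partial credit kept
```

Note how a one-cell spacing error sinks the exact-match score for that sentence while only slightly lowering the soft score, which is why a regulation-compliance metric and a structural-similarity metric are reported separately.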


< Semantic Braille Translation Architecture Diagram Based on the AST Structure of the K-Braille Engine >
Professor Hyun Wook Ka, a researcher with congenital severe visual impairment, led the study. He is also the advisor to Inseo Chung (28), a student in the Department of Transdisciplinary Studies and CEO of the startup MPAG, who donated 1 billion KRW to KAIST on the 10th to foster 'Inclusive AI' talent.

Professor Ka said, "Braille is not just a symbol for the visually impaired, but a language for reading the world." He added, "Building on this achievement, we plan to develop the technology into a next-generation Braille system that can also handle mathematical formulas, scientific symbols, and even musical scores," and continued, "I hope this technology will further enhance information accessibility for the visually impaired and serve as an opportunity to set a new technical standard in the field of Korean Braille translation artificial intelligence."

The research team plans to move beyond the limitations of the existing Braille file format (.brf) by defining the next-generation electronic Braille file format '.brfx (Braille File eXtended),' building an ecosystem around it, and developing the software and device environments for writing, reading, and sharing such files.

In particular, the team plans to return the K-Braille engine to society entirely free of charge as an 'Inclusive AI' technology. However, to prevent technological fragmentation and maintain a sustainable ecosystem, rather than indiscriminately open-sourcing the software, they will establish official technology transfer and partnership networks with 'responsible technology utilization entities' such as public institutions, offices of education, Braille libraries, and assistive device manufacturers.
By promoting this within the year, they aim to enable institutions currently building or operating Braille environments, as well as new Braille display companies, to integrate the latest Braille translation module (API and system kernel), compliant with the 2024 regulations, immediately and without additional software license costs. Ultimately, the core value is to deliver the highest level of barrier-free information accessibility to visually impaired end users without passing on any costs.
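The .brfx format itself has not yet been published, so nothing can be said about its structure. As background for anyone building tooling around electronic Braille files, though, the Unicode side is standardized: the Braille Patterns block (U+2800 to U+28FF) encodes each 8-dot cell so that bit i of the codepoint offset corresponds to dot i+1. A minimal sketch of encoding and decoding cells under that standard:

```python
def dots(cell: str) -> list[int]:
    """Return the raised dot numbers (1-8) of one Unicode Braille cell.
    Per the Unicode Braille Patterns block, bit i of (codepoint - U+2800)
    marks dot i+1."""
    offset = ord(cell) - 0x2800
    if not 0 <= offset <= 0xFF:
        raise ValueError("not a Braille pattern character")
    return [i + 1 for i in range(8) if offset >> i & 1]

def cell(dot_numbers: list[int]) -> str:
    """Inverse: build a Braille cell character from its dot numbers."""
    return chr(0x2800 + sum(1 << (d - 1) for d in dot_numbers))

print(dots("\u2803"))        # ⠃ = U+2803 → [1, 2]
print(cell([3, 4, 5, 6]))    # dots 3456 → ⠼, the number sign
```

Legacy .brf files instead store cells as ASCII characters under a separate translation table, which is one of the limitations a richer container format would presumably address.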