Automated generation of programs for a class of parametric neuroevolution algorithms

A.Yu. Doroshenko, I.Z. Achour

Abstract


The facilities of the algebra of hyperschemes are applied to the automated generation of neuroevolution algorithms, using as an example the binary multiplexer evaluation problem included in the SharpNEAT system. SharpNEAT is an open-source framework written in the C# programming language that implements a genetic neuroevolution algorithm for the .NET platform. Neuroevolution is a form of artificial intelligence that uses evolutionary algorithms to create neural networks, their parameters, topology, and rules. Evolutionary algorithms apply mutation, recombination, and selection mechanisms to find neural networks whose behavior satisfies the conditions of some formally defined problem. In this paper, we demonstrate the use of the algebra of algorithms and hyperschemes for the automated generation of evaluation programs for neuroevolution problems. A hyperscheme is a high-level parameterized specification of an algorithm for solving some class of problems. Setting the values of hyperscheme parameters and then interpreting the hyperscheme yields algorithms adapted to the specific conditions of their use. Automated construction of hyperschemes and generation of algorithms based on them are implemented in the developed integrated toolkit for the design and synthesis of programs. The design of algorithms is based on Glushkov systems of algorithmic algebra. The schemes are built using a dialogue-based constructor of syntactically correct programs, which performs top-down design of algorithms by detailing the constructs of an algorithmic language; the design is represented as an algorithm tree. Programs in a target programming language are then generated from the algorithm schemes. The results of an experiment in which the generated binary multiplexer evaluation program is executed on a cloud platform are given.
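To make the evaluation problem concrete, below is a minimal sketch in C# of a binary multiplexer fitness function, assuming the 6-bit variant (2 address bits selecting one of 4 data lines). The INetwork interface and all other identifiers are hypothetical stand-ins introduced only for illustration; this is neither SharpNEAT's actual API nor the program generated from the hyperscheme, only the general evaluation scheme.

// A minimal, self-contained sketch of a 6-bit binary multiplexer fitness
// evaluation. The INetwork interface is a hypothetical stand-in for an
// evolved neural network and is not SharpNEAT's IBlackBox API.
public interface INetwork
{
    // Activate the network on the given input vector and return its single output.
    double Activate(double[] inputs);
}

public static class MultiplexerEvaluator
{
    private const int AddressBits = 2;
    private const int DataBits = 1 << AddressBits;        // 4 data lines
    private const int TotalBits = AddressBits + DataBits; // 6 inputs in total

    // Evaluate a candidate network over all 2^6 = 64 input combinations;
    // fitness is the fraction of cases where the network reproduces the
    // data bit selected by the address bits.
    public static double Evaluate(INetwork network)
    {
        int correct = 0;
        int caseCount = 1 << TotalBits;

        for (int pattern = 0; pattern < caseCount; pattern++)
        {
            var inputs = new double[TotalBits];
            for (int bit = 0; bit < TotalBits; bit++)
                inputs[bit] = (pattern >> bit) & 1;

            int address = pattern & (DataBits - 1);                   // low bits form the address
            int expected = (pattern >> (AddressBits + address)) & 1;  // selected data bit

            int predicted = network.Activate(inputs) >= 0.5 ? 1 : 0;
            if (predicted == expected)
                correct++;
        }

        return (double)correct / caseCount;
    }
}

In a neuroevolution run, such a function would be called for every candidate network in the population, and the resulting fitness would drive selection and reproduction.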

Problems in programming 2022; 3-4: 301-310


Keywords


automated program design; algebra of algorithms; hyperscheme; neuroevolution; neural network; parallel and distributed computing; cloud computing

References


STANLEY, K. O., CLUNE, J., LEHMAN, J. & MIIKKULAINEN, R. (2019) Designing neural networks through neuroevolution. Nature Machine Intelligence. 1. p. 24-35.

SharpNEAT - Evolution of Neural Networks. [Online] Available from: https://github.com/colgreen/sharpneat [Accessed 12/08/2022]

ACHOUR, I. Z. & DOROSHENKO, A. YU. (2021) Distributed implementation of neuroevolution of augmenting topologies method. Problems in programming. [Online] (3). p. 3-15. (in Ukrainian). Available from: http://pp.isofts.kiev.ua/ojs1/article/view/467 [Accessed 12/08/2022]

DOROSHENKO, A. & YATSENKO, O. (2021) Formal and adaptive methods for automation of parallel programs construction: emerging research and opportunities. Hershey: IGI Global.

ANDON, P. I. et al. (2018) Algebra-Algorithmic Models and Methods of Parallel Programming. Kyiv: Akademperiodyka.

YATSENKO, O. (2012) On parameter-driven generation of algorithm schemes. Proc. Int. Workshop "Concurrency, Specification, and Programming", CS&P'2012, Berlin, Germany (26-28 September 2012). [Online] Berlin: Humboldt University. p. 428-438. Available from: http://ceur-ws.org/Vol-928/0428.pdf [Accessed 12/08/2022]

YUSHCHENKO, K. L., TSEITLIN, G. O. & GALUSHKA, A. V. (1989) Algebra-algorithmic specifications and synthesis of structured schemes of programs. Cybernetics. (6). p. 5-16. (in Russian).

DOROSHENKO, A. & SHEVCHENKO, R. (2006) A rewriting framework for rule-based programming dynamic applications. Fundamenta Informaticae. [Online] 72 (1-3). p. 95-108. Available from: https://www.researchgate.net/publication/250731334 [Accessed 12/08/2022]

DOROSHENKO, A. et al. (2019) A mixed method of parallel software auto-tuning using statistical modeling and machine learning. Communications in Computer and Information Science. Information and Communication Technologies in Education, Research, and Industrial Applications. 1007. p. 102-123.




DOI: https://doi.org/10.15407/pp2022.03-04.301
