Tiny-UKSIE: An Optimized Lightweight Semantic Inference Engine for Reasoning Uncertain Knowledge

Daoqu Geng, Haiyang Li, Chang Liu
Copyright: © 2022 | Volume: 18 | Issue: 1 | Pages: 23
ISSN: 1552-6283 | EISSN: 1552-6291 | EISBN13: 9781799893967 | DOI: 10.4018/IJSWIS.300826
Cite Article

MLA

Geng, Daoqu, et al. "Tiny-UKSIE: An Optimized Lightweight Semantic Inference Engine for Reasoning Uncertain Knowledge." IJSWIS, vol. 18, no. 1, 2022, pp. 1-23. http://doi.org/10.4018/IJSWIS.300826

APA

Geng, D., Li, H., & Liu, C. (2022). Tiny-UKSIE: An Optimized Lightweight Semantic Inference Engine for Reasoning Uncertain Knowledge. International Journal on Semantic Web and Information Systems (IJSWIS), 18(1), 1-23. http://doi.org/10.4018/IJSWIS.300826

Chicago

Geng, Daoqu, Haiyang Li, and Chang Liu. "Tiny-UKSIE: An Optimized Lightweight Semantic Inference Engine for Reasoning Uncertain Knowledge." International Journal on Semantic Web and Information Systems (IJSWIS) 18, no. 1 (2022): 1-23. http://doi.org/10.4018/IJSWIS.300826


Abstract

Applying semantic web technologies such as semantic inference to the internet of things (IoT) enables semantic enrichment of data and discovery of semantic knowledge, which plays a key role in increasing data value and application intelligence. However, mainstream semantic inference engines cannot run on IoT devices with limited storage resources and weak computing power, and they cannot reason about uncertain knowledge. To address this problem, the authors propose Tiny-UKSIE, a lightweight semantic inference engine based on the RETE algorithm. A genetic algorithm (GA) is adopted to optimize the Alpha network sequence, reducing inference time by 8.73%. Moreover, a four-tuple knowledge representation with a probability factor is proposed, and probabilistic inference rules are constructed so that the engine can reason about uncertain knowledge. Compared with mainstream inference engines, Tiny-UKSIE reduces storage resource usage by up to 97.37% and inference time by up to 24.55%.
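The abstract's core idea, a triple extended with a probability factor plus probabilistic rules, can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the class names, the naive forward-chaining search (which omits the RETE network entirely), and the scoring scheme (rule strength multiplied by the product of matched premise probabilities) are all assumptions made for this example.

```python
from dataclasses import dataclass

# Hypothetical four-tuple: an RDF-style triple extended with a probability factor.
@dataclass(frozen=True)
class Fact:
    subject: str
    predicate: str
    obj: str
    prob: float  # confidence in [0, 1]

# A probabilistic rule: if every premise matches, derive the conclusion with
# probability = rule strength * product of the matched premises' probabilities.
@dataclass
class Rule:
    premises: list     # (subject, predicate, obj) patterns; "?x" marks a variable
    conclusion: tuple  # (subject, predicate, obj) pattern
    strength: float

def match(pattern, fact, bindings):
    """Unify a triple pattern with a fact under the current variable bindings."""
    new = dict(bindings)
    for term, value in zip(pattern, (fact.subject, fact.predicate, fact.obj)):
        if term.startswith("?"):
            if term in new and new[term] != value:
                return None
            new[term] = value
        elif term != value:
            return None
    return new

def infer(facts, rule):
    """Naive forward chaining over one rule (no Alpha/Beta network; illustration only)."""
    results = []
    def search(i, bindings, prob):
        if i == len(rule.premises):
            s, p, o = (bindings.get(t, t) for t in rule.conclusion)
            results.append(Fact(s, p, o, round(rule.strength * prob, 4)))
            return
        for fact in facts:
            extended = match(rule.premises[i], fact, bindings)
            if extended is not None:
                search(i + 1, extended, prob * fact.prob)
    search(0, {}, 1.0)
    return results

facts = [
    Fact("sensor1", "reports", "highTemp", 0.9),
    Fact("sensor1", "locatedIn", "room101", 1.0),
]
rule = Rule(
    premises=[("?s", "reports", "highTemp"), ("?s", "locatedIn", "?r")],
    conclusion=("?r", "hasState", "overheated"),
    strength=0.8,
)
derived = infer(facts, rule)
print(derived[0])  # prob = 0.8 * 0.9 * 1.0 = 0.72
```

In a RETE-based engine the premise patterns would instead be compiled into a shared condition network, and the GA optimization described in the abstract would reorder those conditions so the most selective tests run first.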