Advanced Machine Learning Approaches for Prediction of Compressive Strength of Sustainable Concrete
Karthik Sai Repala1, Dr. Kusuma Sundara Kumar2
1UG Scholar, Department of CSE-AIML, 2Professor, Department of Civil Engineering,
Ramachandra College of Engineering, Eluru, Andhra Pradesh, India
1karthiksai22102005@gmail.com, 2skkusuma123@gmail.com
ABSTRACT
Predicting the compressive strength of concrete containing supplementary cementitious materials (SCMs) is difficult because the relationships among the mix components are nonlinear and the components interact in complex ways. To address this, five machine learning techniques were evaluated: Random Forest, which aggregates the predictions of many decision trees; Gradient Boosting, which iteratively corrects the errors of earlier models; XGBoost, an optimized boosting implementation favored when speed is critical; AdaBoost, which refines its predictions incrementally by reweighting hard-to-fit samples; and Multiple Linear Regression, which captures only linear patterns. A database of 1030 distinct concrete mix recipes was used, with 80% allocated to training and 20% to testing. Performance was assessed with four metrics rather than intuition: the coefficient of determination (R²) measured goodness of fit, root mean square error (RMSE) penalized large mistakes, mean absolute error (MAE) quantified the average error magnitude, and percentage deviation indicated how far estimates strayed from measured values. The predictions proved numerically reliable and consistent. Among all techniques tested, boosting emerged as the clear winner, outperforming both bagging and the conventional linear model; XGBoost achieved the best test R² of 0.927. These results show that boosting-based models handle the entangled behavior of SCM concrete under load effectively, and that steady, stepwise error correction prevails on this resistant prediction problem.
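The four evaluation metrics named in the abstract can be computed directly from predicted and measured strengths. The sketch below is illustrative only: the sample values are invented for demonstration and are not drawn from the study's 1030-mix database, and "percentage deviation" is interpreted here as mean absolute percentage deviation, an assumption about the paper's exact definition.

```python
import math

def regression_metrics(y_true, y_pred):
    """Compute the four metrics used in the study:
    R^2, RMSE, MAE, and mean absolute percentage deviation (MAPD)."""
    n = len(y_true)
    mean_y = sum(y_true) / n
    # Residual and total sums of squares for R^2
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    r2 = 1.0 - ss_res / ss_tot
    # RMSE emphasizes large errors; MAE is the average error magnitude
    rmse = math.sqrt(ss_res / n)
    mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n
    # Percentage deviation relative to the measured strength
    mapd = 100.0 * sum(abs(t - p) / t for t, p in zip(y_true, y_pred)) / n
    return {"r2": r2, "rmse": rmse, "mae": mae, "mapd": mapd}

# Illustrative compressive-strength values in MPa (hypothetical, not study data)
actual = [35.0, 42.5, 28.0, 55.0, 40.0]
predicted = [33.8, 44.0, 29.5, 52.5, 41.2]
print(regression_metrics(actual, predicted))
```

In the study's 80/20 protocol, these metrics would be computed on the held-out 20% test split; a test R² near 1 (such as XGBoost's 0.927) indicates the model explains most of the variance in measured strength.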
Keywords – ML Models, SCM concrete, Random Forest, Gradient Boosting, XGBoost