Forecasting Resource Usage in Cloud Environments Using Temporal Convolutional Networks
Keywords: Cloud Environments, Convolutional Networks, Forecasting, Resource Usage, Temporal Convolutional Networks

Abstract
Background: Predicting resource usage in cloud environments is crucial for optimizing costs. While recurrent neural networks and classical time series techniques are commonly used for forecasting, limitations such as vanishing gradients and difficulty retaining long-range context motivate convolutional architectures for modeling sequential data.
Objective: This research proposes a temporal convolutional network (TCN) to forecast CPU usage and memory consumption in cloud environments. TCNs use dilated convolutions to capture temporal dependencies while maintaining a fixed-size receptive field, enabling them to handle sequences of varying lengths and to capture long-term dependencies. The performance of the TCN is compared with Long Short-Term Memory (LSTM) networks, Gated Recurrent Unit (GRU) networks, and a Multilayer Perceptron (MLP).
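The core mechanism named above, dilated causal convolution, can be illustrated with a minimal numpy sketch (an illustration of the general technique, not the paper's implementation; the function names and weights are ours). Each output step depends only on the current and past inputs spaced `dilation` apart, and stacking layers with dilations 1, 2, 4, ... grows the receptive field exponentially with depth:

```python
import numpy as np

def causal_dilated_conv1d(x, w, dilation):
    """Causal dilated 1D convolution: output[t] = sum_j w[j] * x[t - j*dilation],
    with zero left-padding so the output has the same length as the input."""
    k = len(w)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), np.asarray(x, float)])
    return np.array([
        sum(w[j] * xp[t + pad - j * dilation] for j in range(k))
        for t in range(len(x))
    ])

def receptive_field(kernel_size, dilations):
    """Receptive field of a stack of dilated causal conv layers."""
    return 1 + (kernel_size - 1) * sum(dilations)

# A kernel [0, 1] with dilation 2 simply shifts the series two steps back:
print(causal_dilated_conv1d([1, 2, 3, 4, 5], [0, 1], dilation=2))  # [0. 0. 1. 2. 3.]
# Four layers, kernel size 3, dilations 1,2,4,8 -> each output sees 31 past steps:
print(receptive_field(3, [1, 2, 4, 8]))  # 31
```

The exponential dilation schedule is what lets a TCN cover long histories (e.g. a 24-hour trace at 5-minute granularity) with only a handful of layers.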
Dataset: The study employs the Google Cluster Workload Traces 2019 data, focusing on CPU and memory utilization ranging between 5% and 95% over a 24-hour period, extracted from the first ten days.
Results: The TCN outperforms other methods in predicting both CPU usage and memory consumption. For CPU usage prediction, the TCN achieves lower error metrics, including Mean Squared Error (MSE) of 0.05, Root Mean Squared Error (RMSE) of 0.22, Mean Absolute Error (MAE) of 0.18, and Mean Absolute Percentage Error (MAPE) of 3.5%. The TCN also demonstrates higher forecast accuracy, with FA1 = 85%, FA5 = 95%, and FA10 = 98%. Similar performance improvements are observed for memory consumption prediction, with the TCN achieving lower error metrics and higher forecast accuracy compared to LSTM, GRU, and MLP. The TCN exhibits better computational efficiency in terms of training time, inference time, and memory usage.
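The reported error and accuracy measures are standard and easy to reproduce. The sketch below computes MSE, RMSE, MAE, and MAPE; for the forecast-accuracy scores we assume FAτ means the percentage of predictions within τ% of the actual value (the abstract does not define FA1/FA5/FA10, so that definition is our assumption):

```python
import numpy as np

def forecast_metrics(y_true, y_pred):
    """Standard point-forecast error metrics plus FA-tau accuracy scores."""
    y_true = np.asarray(y_true, float)
    y_pred = np.asarray(y_pred, float)
    err = y_pred - y_true
    rel = np.abs(err / y_true)  # relative error (assumes y_true != 0)
    mse = np.mean(err ** 2)
    metrics = {
        "MSE": mse,
        "RMSE": np.sqrt(mse),
        "MAE": np.mean(np.abs(err)),
        "MAPE": 100 * np.mean(rel),
    }
    # FA-tau: share of predictions within tau% of the actual value
    # (assumed definition; not spelled out in the abstract)
    for tau in (1, 5, 10):
        metrics[f"FA{tau}"] = 100 * np.mean(rel <= tau / 100)
    return metrics
```

For example, predicting `[104, 100]` against actuals `[100, 100]` gives MAE 2.0, MAPE 2.0%, FA1 = 50% and FA5 = 100%.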
Conclusion: The proposed temporal convolutional network (TCN) outperforms LSTM, GRU, and MLP baselines in forecasting CPU usage and memory consumption in cloud environments. Because TCNs can capture temporal dependencies and handle sequences of varying lengths, they are a promising approach for resource usage prediction and cost optimization in cloud computing.
Copyright (c) 2022 Applied Research in Artificial Intelligence and Cloud Computing
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.