This project provides a tool for measuring energy consumption across multiple commits in a Git repository. It automates the process of:
- Cloning the specified repository.
- Checking out each commit sequentially.
- Running a given test command multiple times.
- Measuring energy consumption using a Bash script.
- Detecting significant energy regressions.
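Conceptually, the loop the tool automates looks roughly like the sketch below. This is a minimal manual illustration only: the angle-bracket placeholders, the fixed run count, and the RAPL path are assumptions, not the pipeline's actual internals (those live in main.py and the Bash measurement script).

```sh
# Minimal, illustrative sketch of the automated loop -- not the real implementation.
# Assumes the RAPL package-energy counter is readable at this path (requires root,
# and ignores counter wrap-around for brevity).
RAPL=/sys/class/powercap/intel-rapl:0/energy_uj

git clone <repository_url> repo && cd repo
for commit in $(git rev-list --reverse <commit_range>); do
    git checkout --quiet "$commit"
    for run in 1 2 3; do                  # repeat the test command several times
        before=$(cat "$RAPL")
        <test_command>
        after=$(cat "$RAPL")
        echo "$commit,$run,$((after - before)) uJ"   # energy in microjoules
    done
done
```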
Prerequisites:
- Python 3
- Git
- Bash
- Intel RAPL enabled (for power measurement)
- Docker (optional, for running in a container)
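To check whether Intel RAPL is exposed on your machine, you can look at the powercap interface (the paths below are the usual locations; reading the counter typically requires root):

```sh
ls /sys/class/powercap/ | grep intel-rapl            # RAPL domains should be listed here
sudo cat /sys/class/powercap/intel-rapl:0/energy_uj  # cumulative package energy in microjoules
```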
Before running the pipeline, make sure that you have:
- installed the prerequisites,
- run system_setup.sh to configure the system for energy measurement,
- switched to a fresh tty, with your graphical desktop environment stopped,
- disabled as many background processes as possible to avoid interfering with the measurement,
- on a laptop: plugged it in, fully charged the battery, and turned off features like auto-brightness (a few of these checks are sketched below).
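A few quick sanity checks that can help here (the sysfs paths are the common locations and the power-supply names vary between machines; this is not part of the pipeline):

```sh
# CPU frequency governor of each core.
cat /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor

# On a laptop: confirm AC power is connected and the battery is charged.
cat /sys/class/power_supply/AC*/online 2>/dev/null
cat /sys/class/power_supply/BAT*/capacity 2>/dev/null

# Rough idea of how much background activity is still running.
ps -e --no-headers | wc -l
```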
Run the system_setup.sh script to configure the system for energy measurement.
If you have never run it before, you will have to run it twice.

The first run performs a one-time configuration of your system:

```sh
sudo system_setup.sh first-setup
```

Then reboot your system.

The second run has to be repeated before every measurement session:

```sh
sudo system_setup.sh setup
```

Since this puts your system into a mode that is suitable for running the pipeline but not for daily use, you can revert the one-time settings (on a best-effort basis) with:

```sh
sudo system_setup.sh revert-first-setup
```

then reboot your system. You can also revert the per-session setup with:

```sh
sudo system_setup.sh revert-setup
```

which does not require a reboot.
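For context, the sketch below shows the kind of settings such a measurement setup commonly pins down, and why they are not suitable for daily use. It is purely illustrative and does not claim to reflect what system_setup.sh actually changes.

```sh
# Illustrative only -- not necessarily what system_setup.sh does.
# Pin the CPU frequency governor to a fixed policy on all cores.
echo performance | sudo tee /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor

# Disable turbo boost (intel_pstate driver) to reduce run-to-run variation.
echo 1 | sudo tee /sys/devices/system/cpu/intel_pstate/no_turbo
```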
To set up and run the pipeline:

1. Clone the repository.

2. Install dependencies:

   ```sh
   pip install -r requirements.txt
   ```

3. Create a JSON config file for the project you want to analyze. The config file must follow the schema defined in config.schema.json (a validation sketch is shown after this list).

4. Check that the system is stable:

   ```sh
   python main.py stability-test
   ```

5. Run the pipeline:

   ```sh
   python main.py measure <config_path>
   ```

6. Collect the results: they are written to a CSV file in the same directory as your config file.

7. Analyze the results: run the plot.py script to visualize them:

   ```sh
   python plot.py <csv_file>
   ```
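Since the exact fields depend on config.schema.json, it can be worth validating a config file before starting a long measurement run. A minimal sketch using the jsonschema Python package (assumptions: jsonschema is installed and my_config.json is your config file's name):

```sh
pip install jsonschema   # assumption: not already a project dependency
python -c "
import json, jsonschema
config = json.load(open('my_config.json'))   # placeholder file name
schema = json.load(open('config.schema.json'))
jsonschema.validate(config, schema)          # raises ValidationError on a bad config
print('my_config.json matches config.schema.json')
"
```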
To run the pipeline in Docker instead, follow steps 1 to 3 (inclusive) without Docker. Then, before step 4, run:
```sh
docker buildx build -t pipeline .
docker run -it --privileged pipeline:latest
```
To keep a long run alive when your terminal disconnects, you can use a tmux session:

- Create a tmux session:

  ```sh
  tmux new -s mysession
  ```

- Reconnect later using:

  ```sh
  tmux attach -t mysession
  ```
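If you are not used to tmux, two more standard commands can help (these are plain tmux defaults, nothing specific to this project):

```sh
tmux ls        # list existing sessions
# To detach from a running session, press Ctrl-b, then d.
```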
TODO:
- [ ] Don't erase produced CSV files; use a timestamp in the file name.
- [ ] Do a warmup run before the actual measurement.
- [ ] Add a cooldown between measurements, 1 second by default.
- [ ] Save run conditions (temperature, CPU governor, number of CPU cycles, etc.); perf could be used for part of this (see the sketch below).
- [ ] Save the run conditions and the config file at the top of the CSV file, so we have all the information in one place, or create a dedicated file type for this.
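A rough sketch of how some of these items could look, assuming perf is installed; the file names, the event choice, and the 1-second cooldown are placeholders, not decisions that have been made:

```sh
# Timestamped results file instead of overwriting a fixed name.
csv_file="results_$(date +%Y%m%d_%H%M%S).csv"

# Record CPU cycles (one possible run condition) for a single test run with perf.
perf stat -e cycles -o run_conditions.txt -- <test_command>

# Cooldown between measurements (1 second by default).
sleep 1
```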