Reproducing the results from paper "Evaluating QUIC Performance Over Web, Cloud Storage, and Video Workloads"
- Member 1: Dharmik Patel
- Member 2: Ajinkya Narwarkar
- Member 3: Nishtha Aggarwal
- Member 4: Jevin Modi
- Member 5: Soham Sharma
- Member 6: Arnav Gupta
This project aims to reproduce the results from "Evaluating QUIC Performance Over Web, Cloud Storage, and Video Workloads". The experiments assess QUIC's performance across web, cloud storage, and video workloads, comparing it against TLS/TCP. The task breakdown follows.
- Set up repositories: Create the GitHub repository and structure folders for `scripts`, `data`, `results`, and `documentation`.
- Assigned to: Member 1
- Install dependencies: Set up required libraries (`lsquic`, `libcurl`, etc.) and document setup instructions in `README.md`.
- Assigned to: Member 2
- Network Configuration: Ensure setup for both low-latency and high-latency networks. Use emulation tools if necessary.
- Assigned to: Member 3 and Member 4
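The low- and high-latency paths can be emulated with `tc netem` when distinct real networks are unavailable; a minimal sketch, assuming the experiment interface is `eth0` (adjust the device name and delay values to your testbed):

```shell
# Latency emulation with tc/netem (requires root; eth0 is an assumption).
DEV=eth0

# High-latency path: add 100 ms delay with 10 ms jitter on outgoing packets
sudo tc qdisc add dev "$DEV" root netem delay 100ms 10ms

# Revert to the low-latency (native) path when the run is done
sudo tc qdisc del dev "$DEV" root
```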
- Web Workloads Experiment (`quic_perf` and `tls_perf`)
- Configure `quic_perf` and `tls_perf` to measure connection times, TTFB, and download times for selected websites.
- Create a sample data-collection script to test the configurations.
- Assigned to: Member 1 and Member 2
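While `quic_perf` and `tls_perf` from the referenced repository are the primary tools, `curl`'s write-out timing variables give a quick stand-in check of the same three metrics. A sketch (the `measure_timings` helper name is ours; the QUIC runs would additionally need a `curl` built with HTTP/3 support):

```shell
# measure_timings: hypothetical helper printing connect time, TTFB, and total
# download time for one URL, using curl's built-in write-out timing variables.
measure_timings() {
  # For QUIC runs, add --http3 (requires a curl built with HTTP/3 support).
  curl -o /dev/null -s \
    -w 'connect=%{time_connect}s ttfb=%{time_starttransfer}s total=%{time_total}s\n' \
    "$1"
}
```

Example: `measure_timings https://example.com`.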
- Cloud Storage Workloads Experiment
- Configure file download experiments over Google Drive with `quic_perf` and `tls_perf`.
- Set up file-size variations and a script to automatically record throughput and CPU usage.
- Assigned to: Member 3 and Member 4
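One way to record throughput and CPU usage per download is to combine `curl`'s `speed_download` write-out variable with GNU `time`; a sketch (the `timed_download` helper is hypothetical, and `/usr/bin/time -v` requires the GNU `time` package, which is not installed by default on Ubuntu):

```shell
# timed_download: hypothetical helper; prints bytes and throughput for one URL,
# and writes GNU time's resource report (incl. "Percent of CPU") to cpu_usage.txt.
timed_download() {
  /usr/bin/time -v \
    curl -o /dev/null -s -w 'bytes=%{size_download} throughput_Bps=%{speed_download}\n' \
    "$1" 2> cpu_usage.txt   # GNU time reports on stderr
}
```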
- Video Workloads Experiment (`video_download` and `video_streaming`)
- Configure the `video_download` test to measure connection times, throughput, and stall events.
- Configure `video_streaming` with adaptive streaming to record QoE metrics (e.g., startup delay, stalls, quality switches).
- Assigned to: Member 5 and Member 6
- Automated Data Collection: Create scripts to automate data collection and ensure periodic execution (e.g., every 3 hours).
- Assigned to: Member 1 and Member 2
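Periodic execution can be handled with `cron`; a hypothetical crontab entry, installed with `crontab -e` (the script and log paths are assumptions):

```
# Run the collection script every 3 hours, appending output to a log
0 */3 * * * /home/ubuntu/scripts/collect_data.sh >> /home/ubuntu/logs/collect.log 2>&1
```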
- Network Loss Simulation: Implement packet-loss simulation using the `tc` utility for the video streaming tests, and log results for each loss level.
- Assigned to: Member 3 and Member 5
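The per-level loss sweep can also be driven with `tc netem`; a sketch, again assuming interface `eth0` and root (the loss percentages are placeholders, not values from the paper):

```shell
# Sweep packet-loss levels with tc/netem; run the streaming test at each level.
DEV=eth0
for LOSS in 0.5 1 2 5; do
  sudo tc qdisc replace dev "$DEV" root netem loss "${LOSS}%"
  # ... run the video streaming test here and log results per loss level ...
done
sudo tc qdisc del dev "$DEV" root
```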
- Log Processing: Write scripts to clean and format log files for further analysis.
- Assigned to: Member 6
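A minimal cleaning sketch, assuming a hypothetical `timestamp,metric,value` log format (the sample lines are made up solely to illustrate the filtering):

```shell
# Create a tiny sample log (stand-in for a real run's output)
printf 'ts1,ttfb, 120 \n\n# comment line\nts2,connect,45\n' > raw.log

# Drop blank and comment lines, trim whitespace around the value, emit clean CSV
awk -F',' '/^[ \t]*$/ {next} /^#/ {next} NF==3 {gsub(/^ +| +$/, "", $3); print $1 "," $2 "," $3}' \
  raw.log > clean.csv
```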
- Statistical Analysis: Perform CDF analysis on connection times, TTFB, download times, and stall durations.
- Assigned to: Member 4 and Member 5
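The empirical CDF itself needs nothing beyond `sort` and `awk`: with one numeric sample per line, print each value with its cumulative fraction. A sketch over a made-up sample (replace `conn_times.txt` with the real per-metric measurement files):

```shell
# Made-up sample of connection times (ms), one value per line
printf '30\n10\n40\n20\n' > conn_times.txt

# Empirical CDF: sorted value followed by cumulative fraction of samples <= it
sort -n conn_times.txt |
  awk '{v[NR]=$1} END {for (i = 1; i <= NR; i++) printf "%s %.2f\n", v[i], i/NR}' > cdf.txt
```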
- Visualization: Create CDF plots, throughput comparison charts, and stall duration graphs.
- Assigned to: Member 3 and Member 6
- Flame Graphs for CPU Usage: Generate flame graphs for CPU utilization in different workloads.
- Assigned to: Member 2 and Member 5
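A common recipe is `perf` plus Brendan Gregg's FlameGraph scripts; a sketch (the `make_flamegraph` wrapper name is ours, and it assumes `perf` is installed and the FlameGraph repo is cloned at `~/FlameGraph`):

```shell
# make_flamegraph: hypothetical wrapper; profiles a workload command and renders
# a CPU flame graph SVG via perf + the FlameGraph scripts (needs root for perf).
make_flamegraph() {
  out="$1"; shift
  sudo perf record -F 99 -g -- "$@"        # sample stacks at 99 Hz while the workload runs
  sudo perf script |
    ~/FlameGraph/stackcollapse-perf.pl |
    ~/FlameGraph/flamegraph.pl > "$out"
}
```

Example: `make_flamegraph quic_web.svg <workload command>`.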
- Experiment Guide: Document steps for each experiment, including commands and configurations.
- Assigned to: Member 1
- Data Analysis Report: Summarize findings for each workload with graphs and statistical insights.
- Assigned to: Member 6
- Project Report: Create a final report summarizing methodology, results, and conclusions.
- Assigned to: Member 5
- Review Code and Documentation: Conduct a team review session to finalize code and documentation.
- Assigned to: All Members
- Submit Results and Code: Package results, code, and documentation for submission.
- Assigned to: Member 1
- Automate Kernel Version Check: Verify whether the latest kernel updates support QUIC enhancements.
- Assigned to: Member 4
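A sketch of the version check (the 4.18 minimum is a placeholder; set it to whatever kernel version the QUIC enhancements you need actually require):

```shell
# Report the running kernel and whether it meets a minimum major.minor version.
MIN_MAJOR=4; MIN_MINOR=18       # placeholder threshold
KVER=$(uname -r | cut -d. -f1,2)
MAJOR=${KVER%%.*}
MINOR=${KVER#*.}
if [ "$MAJOR" -gt "$MIN_MAJOR" ] || { [ "$MAJOR" -eq "$MIN_MAJOR" ] && [ "$MINOR" -ge "$MIN_MINOR" ]; }; then
  STATUS="ok"
else
  STATUS="too-old"
fi
echo "kernel $KVER: $STATUS"
```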
- Explore Additional QUIC Versions: If time permits, test additional versions of QUIC for extended results.
- Assigned to: Member 2 and Member 3
Note: Each member should update their task status on GitHub to reflect progress.
GitHub reference link: https://github.com/gongzhenmu/Quic-performance-test
Generate a public/private key pair: `ssh-keygen -t rsa -b 2048`. The keys are stored in the `~/.ssh` folder.
Connect to the EC2 instance: `ssh -i "~/.ssh/id_rsa" ubuntu@35.153.241.159` (the private-key path goes in double quotes).