Scenario / Testbed
The assignment will be graded based on the groups’ ability to produce useful and correct information within the boundaries of the given time and resources.
For the experiments, we will use Amazon VMs as video servers and MONROE nodes as video clients.
Amazon VMs as Video Servers:
We will provide each group with a virtual machine hosted at Amazon EC2 server farms.
NOTE: You do NOT need to create a bottleneck on the receiver machine to limit incoming traffic to a target capacity, as in the first assignment. The video file we are using for the streaming has a maximum rate of 4 Mbps at the highest quality.
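Since the DASH client only needs the MPD and segments over plain HTTP, you can sanity-check that the content is reachable before setting up the AStream server with any static web server. A minimal sketch using Python's standard library (the directory name and port are arbitrary placeholders, not part of the assignment setup):

```python
# serve_dash.py -- quick check that the MPD and segments are reachable over HTTP.
# Assumes the DASH content (BigBuckBunny_4s.mpd plus the segment files) sits in
# ./dash_content; both the directory name and the port are placeholders.
import functools
import http.server

PORT = 8080
handler = functools.partial(http.server.SimpleHTTPRequestHandler,
                            directory="dash_content")

with http.server.ThreadingHTTPServer(("", PORT), handler) as httpd:
    print(f"Serving DASH content on port {PORT} ...")
    httpd.serve_forever()
```

Whichever server you use, remember to open the chosen port in the EC2 security group; otherwise the MONROE nodes cannot reach the VM.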
MONROE Nodes as Clients:
You will use MONROE nodes as video clients to mimic mobile users. For more information on MONROE and how to run experiments on it, please refer to the user manual: https://github.com/MONROE-PROJECT/UserManual
Below is a brief summary of how to run an experiment on MONROE nodes:
- Go to the User Interface: https://www.monroe-system.eu/
  - For this, you need to have a certificate.
  - Install the certificate in your browser first, then go to the actual webpage listed above.
- With that, you can see in the “New” tab the interface with the MONROE Scheduler.
  - Use this to access the nodes and schedule your experiment.
- Deploying a new experiment:
  - Name – can be whatever you want, but make sure you can identify it.
  - Script – provide the name of your container.
  - Number of nodes – choose how many nodes you will use.
  - Explicit Node IDs – choose among the available nodes.
  - Additional options – any input parameter needed for your experiment.
  - Tick “As soon as possible”, check the availability and then deploy the experiment!
- Retrieving results (a scripted alternative is sketched after this list):
  - Go to the User Interface: https://www.monroe-system.eu/
  - From there, check the “Status” tab.
  - Look for the experiment with the name you chose before, click on it and go to the “download” tab.
  - Example output: https://www.monroe-system.eu/user/213781/
- Script for running the experiments on MONROE without the user interface: https://github.com/ana-cc/monroe-cli
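As a scripted alternative to the “download” tab, the results pages can plausibly be fetched over HTTPS with the same client certificate you installed in the browser. A minimal sketch, assuming the certificate has been exported to a PEM certificate/key pair (the file names are placeholders, and the URL is the example output page above):

```python
# fetch_results.py -- download an experiment results page from the MONROE portal.
# Assumes your MONROE certificate has been exported as cert.pem/key.pem
# (e.g. converted from the browser's .p12 with openssl); the URL below is the
# example output page from the assignment text.
import requests

URL = "https://www.monroe-system.eu/user/213781/"

resp = requests.get(URL, cert=("cert.pem", "key.pem"), timeout=30)
resp.raise_for_status()

with open("results_index.html", "wb") as f:
    f.write(resp.content)
print(f"Saved {len(resp.content)} bytes from {URL}")
```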
For this assignment, the main purpose is to measure video streaming performance over mobile networks. We will use AStream for DASH video streaming: https://github.com/pari685/AStream. To get you up and running fast, you are provided an AStream container to run on MONROE: https://hub.docker.com/r/andralutu/astream/. To evaluate and analyse the user’s quality of experience, you are expected to use OpenVQ: https://bitbucket.org/mpg_code/openvq.
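Note that OpenVQ compares a decoded degraded video against a reference, so the segments fetched during a run must first be stitched back into a playable file. For fragmented-MP4 DASH content, binary-concatenating the initialization segment with the media segments in playback order is usually sufficient; a minimal sketch (all file names and the directory layout are assumptions, not something AStream guarantees):

```python
# rebuild_video.py -- stitch downloaded DASH segments back into one playable file.
# For fragmented MP4, the init segment followed by the media segments in
# playback order yields a valid file. The directory and file names below are
# hypothetical; the sorted() call assumes zero-padded segment numbers so that
# lexicographic order matches playback order.
import glob

SEGMENT_DIR = "downloaded_segments"

with open("rebuilt.mp4", "wb") as out:
    with open(f"{SEGMENT_DIR}/init.mp4", "rb") as init:
        out.write(init.read())
    for name in sorted(glob.glob(f"{SEGMENT_DIR}/segment_*.m4s")):
        with open(name, "rb") as seg:
            out.write(seg.read())

print("Wrote rebuilt.mp4 -- feed this (and the reference) to OpenVQ")
```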
You may choose the tools for the analysis of the results based on what you are familiar with and what you deem most appropriate for the purpose. The important aspect is that you are able to produce a complete and clear report at the end of the assignment period.
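For example, if you export each run’s per-segment records to a line-delimited JSON file, a few useful summary statistics fall out directly. A minimal sketch, where the input format and the field name ("bitrate", in bps) are hypothetical and must be adapted to the actual AStream log format:

```python
# summarize_run.py -- quick summary statistics for one streaming run.
# The input format is an assumption: one JSON object per line with a
# "bitrate" field (bps of the fetched segment). Adapt the parsing to
# whatever your AStream container actually logs.
import json
import statistics

bitrates = []
with open("astream_run.log") as f:
    for line in f:
        try:
            entry = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip non-JSON lines
        if "bitrate" in entry:
            bitrates.append(entry["bitrate"])

if not bitrates:
    raise SystemExit("no segment records found")

# A quality switch is any change in bitrate between consecutive segments.
switches = sum(1 for a, b in zip(bitrates, bitrates[1:]) if a != b)
print(f"segments:         {len(bitrates)}")
print(f"mean bitrate:     {statistics.mean(bitrates) / 1e6:.2f} Mbps")
print(f"quality switches: {switches}")
```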
For the experiment:
- Make sure that the DASH AStream server is running on your Amazon VM.
- On the MONROE nodes, use the following entries:
  - Script – andralutu/astream
  - Number of nodes – choose how many nodes you will use.
  - Explicit Node IDs – choose among 186, 187, 450, 451, 452, 453 (only use these nodes!).
  - Additional options (see the sketch after this list):
    - Provide where the DASH server is running: "mpd_file":"http://<public_ip>:<port>/BigBuckBunny_4s.mpd"
    - Change the number of segments and the playback algorithm: "segment_limit":3,"playback":"NETFLIX"
- For the results to be comparable, we allocated the following nodes:
  - 186 with CAT3 modems (Telia and Telenor)
  - 450, 452 with CAT6 modems (Telia and Telenor)
  - 451, 453 (ICE)
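The “Additional options” field takes JSON key-value pairs like the two examples above (note that they are given without surrounding braces). If you want to assemble and validate the string programmatically rather than hand-edit it, a minimal sketch (the public IP and port are placeholders from the assignment text):

```python
# build_options.py -- assemble the scheduler's "Additional options" string.
# The keys and values come from the assignment text; <public_ip> and <port>
# are placeholders you must replace with your Amazon VM's details.
import json

options = {
    "mpd_file": "http://<public_ip>:<port>/BigBuckBunny_4s.mpd",
    "segment_limit": 3,
    "playback": "NETFLIX",
}

# The examples in the assignment show bare key-value pairs, so strip the
# outer braces that json.dumps() adds.
blob = json.dumps(options)
print(blob[1:-1])
```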
We encourage you to discuss challenges and techniques across groups to reduce the overhead of getting into a new field of knowledge. Copying code, scripts or experimental results, however, will be counted as cheating.
Report
You must write up the results as a technical report of no more than 4 pages in ACM format. Such a report is expected to include the core elements presented in the lectures under “A systematic approach to performance evaluation”. The results must be based on your own experiments and your own data.
The report is evaluated on writing quality, clarity of presentation, and the trustworthiness and correctness of the results. The evaluation does not consider whether related work (citations of other papers) is included.
Evaluation details
In our evaluation of the reports, we will focus on the following elements:
- Choice of metrics, workloads, system configuration parameters and methodology for the experiments
- Use of statistically sound methods when analysing the data
- Disposition of the available time (ability to collect and present useful information within the boundaries of the available resources)
- Objectivity in defining the work, choosing metrics and workloads, in the analysis and in presenting the results
- Transparency of reporting (exposure of assumptions and limitations to the reader)
- Clarity of presentation
Bonus elements:
- Analysis of different quality metrics such as PSNR and SSIM (see the sketch after this list)
- Analysis of the video performance when roaming (use node 187 with a Telia Sweden subscription)
- In-depth analysis of video quality together with metadata (e.g. RTTs, signal strength, etc.)
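As a starting point for the PSNR/SSIM bonus, a minimal per-frame PSNR sketch in NumPy is shown below. It assumes both videos have already been decoded into equally-sized 8-bit frames (e.g. with ffmpeg or imageio); SSIM is available ready-made in scikit-image if you prefer not to implement it yourself.

```python
# psnr.py -- per-frame PSNR between a reference and a degraded frame.
# Assumes both frames are uint8 arrays of identical shape (H, W, 3),
# decoded beforehand (e.g. with ffmpeg or imageio).
import numpy as np

def psnr(reference: np.ndarray, degraded: np.ndarray,
         max_value: float = 255.0) -> float:
    """PSNR in dB: 10 * log10(MAX^2 / MSE)."""
    mse = np.mean((reference.astype(np.float64)
                   - degraded.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical frames
    return 10.0 * np.log10(max_value ** 2 / mse)

# Hypothetical usage with two decoded frames:
# print(f"{psnr(ref_frame, deg_frame):.2f} dB")
```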
Formalities
The deadline for handing in your assignment is Tuesday, June 6th at 15:00:00.
Deliver your report (as PDF) at https://devilry.ifi.uio.no/.
The groups should also prepare a poster (2 x A3 pages) and a quick talk (max 5 minutes, without slides) where you pitch your poster to the class on June 8th. Name the poster with your group name and e-mail it to inf5072@ifi.uio.no no later than noon (12:00) on June 7th. We will then print the poster for you.
For questions and course related chatter, we have created a Slack space:
https://mpglab.slack.com/messages/inf5072/
There will be a prize for best poster/presentation (awarded by an independent panel and independent of the grade).
For questions please contact: inf5072@ifi.uio.no