Publication:
Dynamic adaptation of HTTP-based video streaming using Markov decision process

dc.contributor.advisor Hassan, Mahbub en_US
dc.contributor.author Bokani, Ayub en_US
dc.date.accessioned 2022-03-22T11:58:22Z
dc.date.available 2022-03-22T11:58:22Z
dc.date.issued 2015 en_US
dc.description.abstract Hypertext transfer protocol (HTTP) is the fundamental mechanism supporting web browsing on the Internet. An HTTP server stores large volumes of content and delivers specific pieces to clients on request. There is a recent move to use HTTP for video streaming as well, which promises seamless integration of video delivery into existing HTTP-based server platforms. This is achieved by segmenting the video into many small chunks and storing these chunks as separate files on the server. For adaptive streaming, the server stores different quality versions of the same chunk in different files to allow real-time adaptation of video quality to the network bandwidth variation experienced by a client. For each chunk of the video, which quality version to download therefore becomes a major decision-making challenge for the streaming client, especially in vehicular environments with significant uncertainty in mobile bandwidth. The key objective of this thesis is to explore more advanced decision-making tools that enable an improved tradeoff between conflicting quality-of-experience (QoE) metrics in vehicular environments. In particular, this thesis studies the effectiveness of the Markov decision process (MDP), which is known for its ability to optimize decision making under uncertainty. The thesis makes three fundamental contributions: (1) using real video and network bandwidth datasets, it shows that MDP can reduce playback deadline misses (video freezes) by up to a factor of 15 compared to a well-known non-MDP strategy when the bandwidth model is known a priori, (2) it proposes a Q-learning implementation of MDP that does not need any a priori knowledge of the bandwidth, but instead learns optimal decision making by simply observing the outcomes of its decisions; it is demonstrated that, in terms of deadline misses, the Q-learning-based MDP outperforms the model-based MDP by a factor of three, and (3) it implements the proposed decision-making framework on the Android platform and demonstrates the effectiveness of the proposed MDP-based video adaptation through real experiments. en_US
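The abstract describes a Q-learning approach in which the client picks a quality level for each chunk and learns from the observed outcome. The following is a minimal illustrative sketch of that idea, not the formulation used in the thesis: the state discretisation, reward weights, chunk sizes, and simulated bandwidth values are assumptions introduced here for illustration only.

    # Illustrative sketch only: tabular Q-learning for per-chunk quality selection.
    # All constants and the bandwidth model are hypothetical simplifications.
    import random
    from collections import defaultdict

    QUALITIES = [0, 1, 2]            # hypothetical quality levels (low/med/high)
    CHUNK_SECONDS = 2.0              # hypothetical chunk playback duration (s)
    CHUNK_BITS = [1e6, 2.5e6, 5e6]   # hypothetical chunk sizes per quality (bits)

    ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1
    Q = defaultdict(float)           # Q[(state, action)] -> estimated value

    def state_of(buffer_s, bw_bps):
        """Discretise buffer level and last observed bandwidth into a coarse state."""
        return (min(int(buffer_s // 2), 5), min(int(bw_bps // 1e6), 5))

    def choose(state):
        """Epsilon-greedy selection of a quality level for the next chunk."""
        if random.random() < EPSILON:
            return random.choice(QUALITIES)
        return max(QUALITIES, key=lambda a: Q[(state, a)])

    def reward(quality, freeze_s):
        """Reward higher quality; heavily penalise playback freezes (deadline misses)."""
        return quality - 10.0 * freeze_s

    def simulate_session(num_chunks=200):
        buffer_s, bw = 4.0, 2e6      # initial playback buffer (s) and bandwidth guess (bps)
        for _ in range(num_chunks):
            s = state_of(buffer_s, bw)
            a = choose(s)
            bw = random.uniform(0.5e6, 6e6)       # stand-in for measured mobile bandwidth
            download_s = CHUNK_BITS[a] / bw       # time to fetch the chosen version
            freeze_s = max(0.0, download_s - buffer_s)
            buffer_s = max(0.0, buffer_s - download_s) + CHUNK_SECONDS
            s2 = state_of(buffer_s, bw)
            # Standard Q-learning update from the observed outcome of the decision.
            best_next = max(Q[(s2, b)] for b in QUALITIES)
            Q[(s, a)] += ALPHA * (reward(a, freeze_s) + GAMMA * best_next - Q[(s, a)])

    if __name__ == "__main__":
        simulate_session()

The sketch captures the self-learning aspect highlighted in the abstract: no a priori bandwidth model is required, since the value estimates are updated purely from observed download times and resulting freezes.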
dc.identifier.uri http://hdl.handle.net/1959.4/55827
dc.language English
dc.language.iso EN en_US
dc.publisher UNSW, Sydney en_US
dc.rights CC BY-NC-ND 3.0 en_US
dc.rights.uri https://creativecommons.org/licenses/by-nc-nd/3.0/au/ en_US
dc.subject.other Dynamic Adaptive Streaming over HTTP en_US
dc.subject.other Video Streaming en_US
dc.subject.other Markov Decision Process en_US
dc.subject.other DASH en_US
dc.title Dynamic adaptation of HTTP-based video streaming using Markov decision process en_US
dc.type Thesis en_US
dcterms.accessRights open access
dcterms.rightsHolder Bokani, Ayub
dspace.entity.type Publication en_US
unsw.accessRights.uri https://purl.org/coar/access_right/c_abf2
unsw.identifier.doi https://doi.org/10.26190/unsworks/18898
unsw.relation.faculty Engineering
unsw.relation.originalPublicationAffiliation Bokani, Ayub, Computer Science & Engineering, Faculty of Engineering, UNSW en_US
unsw.relation.originalPublicationAffiliation Hassan, Mahbub, Computer Science & Engineering, Faculty of Engineering, UNSW en_US
unsw.relation.school School of Computer Science and Engineering
unsw.thesis.degreetype PhD Doctorate en_US
Files
Original bundle
Name: public version.pdf
Size: 8.36 MB
Format: application/pdf