LOCAL CACHING
The current paradigm of cloud computing is the
result of a progressive shift in the balance between
data storage and data transfer: information is
stored and processed wherever it is most convenient
and inexpensive because the marginal cost
of transferring it has become negligible, at least
on wireline networks [2]. For wireless devices,
though, this cost is not always negligible. The
understanding that mobile users are subject to
sporadic abundance of connectivity amidst stretches
of deprivation is hardly new, and the natural
idea of opportunistically leveraging the former to
alleviate the latter has been entertained since the
1990s [3]. However, this idea of caching massive
amounts of data at the edge of the wireline network,
right before the wireless hop, applies only to
delay-tolerant traffic, and thus made little sense
in voice-centric systems. Caching might finally
make sense now in data-centric systems [4].
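To see why this opportunism suits only delay-tolerant traffic, here is a minimal Python sketch (purely illustrative; the items and connectivity pattern are hypothetical): content is fetched only during time slots when a cheap, abundant link happens to be available, and simply waits in a queue otherwise, which is tolerable for bulk data but not for voice.

```python
from collections import deque

def opportunistic_prefetch(items, connectivity):
    """Fetch delay-tolerant items only during time slots with abundant
    (cheap) connectivity; otherwise the items simply wait in a queue.

    `items` and `connectivity` are hypothetical inputs: content IDs and
    one boolean per time slot (True = an abundant link is available)."""
    pending = deque(items)
    fetched = []
    for abundant in connectivity:
        if abundant and pending:
            fetched.append(pending.popleft())  # grab content while it is cheap
    return fetched, list(pending)

# Example: abundant connectivity only in slots 0, 1, and 4.
got, deferred = opportunistic_prefetch(
    ["a", "b", "c", "d", "e", "f"],
    [True, True, False, False, True, False],
)
print(got)       # ['a', 'b', 'c'] -- fetched during windows of abundance
print(deferred)  # ['d', 'e', 'f'] -- must wait; fine for bulk data, fatal for voice
```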
Thinking ahead, it is easy to envision mobile
devices with truly vast amounts of memory.
Under this assumption, and given that a substantial
share of the data that circulates wirelessly
corresponds to the most popular audio/video/social
content that is in vogue at a given time, it
is clearly inefficient to transmit such content via
unicast, but it is frustratingly impossible to resort
to multicast because the demand is asynchronous.
We hence see local caching as an important
alternative, both at the radio access network edge
(e.g., at small cells) and at mobile devices, also
thanks to enablers such as mmWave and D2D.
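As a rough illustration of this trade-off, the following Python sketch (not from the article; the catalog size, Zipf exponent, and cache size are illustrative assumptions) estimates how many asynchronous unicast requests a small cell could serve from a local cache of the most popular files, i.e., the traffic that never has to cross the backhaul.

```python
import random

def zipf_popularities(n_files, alpha=0.8):
    """Zipf popularity profile: the file of rank k is requested
    with probability proportional to 1 / k**alpha."""
    weights = [1.0 / (k ** alpha) for k in range(1, n_files + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def cache_hit_rate(n_files=10_000, cache_size=500,
                   n_requests=100_000, alpha=0.8, seed=1):
    """Fraction of requests served locally by a small-cell cache that
    statically holds the cache_size most popular files."""
    rng = random.Random(seed)
    probs = zipf_popularities(n_files, alpha)
    # Users request independently at arbitrary times (asynchronous demand),
    # so these requests cannot be batched into one multicast session; each
    # cache miss instead costs a separate unicast over the backhaul.
    requests = rng.choices(range(n_files), weights=probs, k=n_requests)
    hits = sum(1 for f in requests if f < cache_size)  # ranks 0..cache_size-1 cached
    return hits / n_requests

if __name__ == "__main__":
    # With a skewed popularity profile, caching just 5% of the catalog
    # absorbs a disproportionate share of the traffic (about half here;
    # the exact figure depends on the assumed Zipf exponent).
    print(f"local hit rate: {cache_hit_rate():.1%}")
```

The steeper the popularity skew, the larger the share of requests a small cache can absorb, which is the premise behind caching at small cells and, with D2D, at the devices themselves.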