Volatile multi-armed bandits for guaranteed targeted social crawling

Zahy Bnaya, Rami Puzis, Roni Stern, Ariel Felner

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

17 Scopus citations

Abstract

We introduce a new variant of the multi-armed bandit problem, called the Volatile Multi-Armed Bandit (VMAB). We present a general policy for the VMAB with proven regret bounds. We then model the problem of collecting intelligence on profiles in social networks as a VMAB, and experimental results show the superiority of our proposed policy.
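
The abstract does not spell out the policy itself. As a rough illustration of the volatile setting only, the Python sketch below applies a standard UCB1-style index rule to a set of arms that changes between rounds, here modeling social-network profiles that become reachable or disappear as the crawl proceeds. The class name VolatileUCB, the reward convention, and the handling of newly appearing arms are illustrative assumptions, not the paper's algorithm.

import math
import random


class VolatileUCB:
    """Illustrative UCB1-style index policy over a volatile arm set.

    Arms may appear and disappear between rounds; statistics are kept
    per arm, and only currently available arms can be selected.
    """

    def __init__(self):
        self.counts = {}   # arm id -> number of times the arm was pulled
        self.rewards = {}  # arm id -> cumulative observed reward
        self.t = 0         # total number of selections made so far

    def select(self, available_arms):
        """Choose one arm from the arms available in the current round."""
        self.t += 1
        # Try any currently available arm that has never been pulled.
        untried = [a for a in available_arms if a not in self.counts]
        if untried:
            return random.choice(untried)

        # Otherwise pick the available arm with the largest UCB1 index.
        def index(arm):
            mean = self.rewards[arm] / self.counts[arm]
            return mean + math.sqrt(2.0 * math.log(self.t) / self.counts[arm])

        return max(available_arms, key=index)

    def update(self, arm, reward):
        """Record the reward observed after pulling the given arm."""
        self.counts[arm] = self.counts.get(arm, 0) + 1
        self.rewards[arm] = self.rewards.get(arm, 0.0) + reward


# Hypothetical crawling round: the arms are profile ids reachable right now,
# and the reward is 1.0 if acquiring the profile exposed a target, else 0.0.
policy = VolatileUCB()
available = ["profile_17", "profile_42", "profile_99"]
chosen = policy.select(available)
policy.update(chosen, reward=1.0)

Keeping per-arm statistics in dictionaries lets arms enter and leave the available set between rounds without resetting the policy's state.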

Original language: English
Title of host publication: Late-Breaking Developments in the Field of Artificial Intelligence - Papers Presented at the 27th AAAI Conference on Artificial Intelligence, Technical Report
Publisher: AI Access Foundation
Pages: 8-10
Number of pages: 3
ISBN (Print): 9781577356288
State: Published - 2013
Externally published: Yes
Event: 27th AAAI Conference on Artificial Intelligence, AAAI 2013 - Bellevue, WA, United States
Duration: 14 Jul 2013 - 18 Jul 2013

Publication series

Name: AAAI Workshop - Technical Report
Volume: WS-13-17

Conference

Conference: 27th AAAI Conference on Artificial Intelligence, AAAI 2013
Country/Territory: United States
City: Bellevue, WA
Period: 14/07/13 - 18/07/13
