Towards privacy preserving social recommendation under personalized privacy settings

  • File type: Book
  • Language: English
  • Publisher: Springer
  • Year / country of publication: 2018

Details

Related fields: Computer Engineering, Information Technology
Related specializations: Information Security, Internet and Wide Area Networks
Journal: World Wide Web
University: Institute of Computing Technology – Chinese Academy of Sciences – China
DOI: https://doi.org/10.1007/s11280-018-0620-z
Published by Springer
Keywords: Differential privacy, Social recommendation, Ranking, Personalized privacy settings

Description

1 Introduction

A recommender system has become an imperative component of myriad online commercial platforms. With the increasing popularity of social networks, recommender systems can now take advantage of rich social relationships to improve recommendation effectiveness [34, 37, 43]. This new type of recommender system based on social relationships (social recommendation, for short), however, suffers from a new source of privacy leakage. For example, by observing a victim user's feedback on products such as adult or medical items, an adversary may infer the victim's private sexual orientation or health condition [8], and may further abuse this private information for financial gain [29]. In practice, a privacy-preserving social recommender system, which can utilize social relationships to produce more accurate recommendation results without sacrificing the privacy of the users involved, is therefore highly desirable.

A few mechanisms have been designed for this purpose; however, they are all problematic, as analyzed in the following. First, some existing efforts [13, 22] heavily rely on the assumption that the recommender is fully trusted. They neglect the fact that the recommender itself may be untrusted and may behave maliciously, causing serious privacy leakage. Second, some works [11, 38] rely on cryptography to prevent users' exact inputs from being leaked to the untrusted recommender. Nonetheless, it has been shown that attackers can still infer sensitive information about victim users from their influence on the final results [25]. In addition, the cryptographic process is usually expensive and may bring large computational overhead. Third, some works [12, 13, 24] rely on friends' historical feedback to make recommendations, but do not differentiate sensitive from non-sensitive feedback and simply treat them equally. In practice, social media sites such as IMDB and Facebook (Figure 1) allow users to specify the visibility of their feedback on products. Treating all feedback as equally sensitive, and withholding non-sensitive feedback for the sake of security, makes it difficult to attract common-interest friends and to make effective recommendations, sacrificing user experience in the long run.

Resolving all the aforementioned defects is necessary for building an effective privacy-preserving social recommender system, which is nevertheless a very challenging task for the following reasons. First, to relax the assumption that the recommender is fully trusted, we need to change the recommender system from a fully centralized to a semi-centralized architecture. In other words, instead of fully relying on the recommender, we now allow users and the recommender to collaborate with each other for recommendation. Specifically, users have access to both their sensitive and non-sensitive feedback, while the recommender has access only to the non-sensitive feedback, and the two sides interact to make the final recommendation. In such a semi-centralized setting, private information may still be leaked during each interaction between the recommender and the user, and eliminating such leakage is necessary yet challenging. Second, to avoid expensive cryptographic techniques, differential privacy [5] can be used to provide a provable privacy guarantee with small computational overhead. However, differential privacy requires adding noise, which may degrade recommendation effectiveness.
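The semi-centralized interaction and the differential-privacy step described above can be illustrated with a short sketch. The snippet below is a minimal illustration under assumed details, not the authors' actual algorithm: it assumes a user keeps per-item ratings together with personalized visibility flags, shares only the non-sensitive items with the recommender, and perturbs each shared rating with Laplace noise calibrated to a privacy budget epsilon (the function name `share_feedback`, the visibility labels, and the sensitivity value are hypothetical).

```python
import numpy as np

def share_feedback(ratings, visibility, epsilon, sensitivity=1.0):
    """Return only the feedback the user marked non-sensitive, each rating
    perturbed with Laplace noise calibrated to the privacy budget epsilon."""
    scale = sensitivity / epsilon  # Laplace scale b = sensitivity / epsilon
    shared = {}
    for item, rating in ratings.items():
        if visibility.get(item) == "public":  # private feedback never leaves the user side
            shared[item] = rating + np.random.laplace(loc=0.0, scale=scale)
    return shared

# Example: the recommender only ever receives the noisy, non-sensitive subset.
user_ratings = {"movie_a": 4.0, "movie_b": 5.0, "medical_item": 2.0}
user_visibility = {"movie_a": "public", "movie_b": "public", "medical_item": "private"}
print(share_feedback(user_ratings, user_visibility, epsilon=1.0))
```

In this sketch, a smaller epsilon yields larger noise and stronger privacy, which reflects the trade-off noted above: the added noise protects the user's shared feedback but may degrade recommendation accuracy.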