PaddleRec
(简体中文|English)

What is a recommendation system?

  • A recommendation system helps users quickly find useful and interesting information in massive amounts of data.

  • A recommendation system is also a silver bullet for attracting users, retaining them, and increasing their stickiness and conversion.

    Whoever makes better use of recommendation systems gains the greater advantage in this fierce competition.

    At the same time, building and operating a recommendation system raises many problems, such as huge data volumes, complex models, and inefficient distributed training.

What is PaddleRec?

  • A quick-start tool for search & recommendation algorithms, based on PaddlePaddle
  • A complete recommendation-system solution for beginners, developers, and researchers
  • A library of recommendation algorithms covering content understanding, matching, recall, ranking, multi-task learning, re-ranking, etc.

Getting Started

Environmental requirements

  • Python 2.7 / 3.5 / 3.6 / 3.7; Python 3.7 is recommended. python in the examples refers to Python 3.7 by default.

  • PaddlePaddle >=2.0

  • Operating system: Windows / macOS / Linux

    Linux is recommended for distributed training

Installation

  • Install via pip in a GPU environment
    python -m pip install paddlepaddle-gpu==2.0.0
  • Install via pip in a CPU environment
    python -m pip install paddlepaddle # gcc8

To download other versions, please refer to the installation tutorial: Installation Manuals
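After installing, it is worth confirming that PaddlePaddle imports cleanly and meets the >=2.0 requirement. A minimal check, assuming a standard PaddlePaddle install (run_check is PaddlePaddle's built-in installation self-test):

# Sanity-check the PaddlePaddle installation (independent of PaddleRec).
import paddle

print(paddle.__version__)   # PaddleRec expects 2.0.0 or later
paddle.utils.run_check()    # runs PaddlePaddle's built-in install self-test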

Download PaddleRec

git clone https://github.com/PaddlePaddle/PaddleRec/
cd PaddleRec

Quick Start

We take the dnn algorithm as an example to get started with PaddleRec, using 100 samples of training data from the Criteo dataset:

python -u tools/trainer.py -m models/rank/dnn/config.yaml # training with the dynamic graph (dygraph) model
python -u tools/static_trainer.py -m models/rank/dnn/config.yaml # training with the static graph model
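For readers new to PaddlePaddle, the toy sketch below illustrates what "dygraph" (dynamic graph) training means: the model is an ordinary Python object and the training loop runs eagerly. It is only an illustration with made-up names and random stand-in data (TinyDNN is hypothetical), not PaddleRec's actual trainer or the dnn model defined in config.yaml:

import paddle
import paddle.nn as nn

# Hypothetical toy network; the real dnn model is configured via config.yaml.
class TinyDNN(nn.Layer):
    def __init__(self, in_dim=13, hidden=64):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, x):
        return paddle.nn.functional.sigmoid(self.body(x))

model = TinyDNN()
optimizer = paddle.optimizer.Adam(learning_rate=1e-3, parameters=model.parameters())
loss_fn = nn.BCELoss()

# Random stand-in for 100 samples of dense features and click labels.
features = paddle.rand([100, 13])
labels = paddle.randint(0, 2, [100, 1]).astype("float32")

for epoch in range(3):
    pred = model(features)       # forward pass runs eagerly (dynamic graph)
    loss = loss_fn(pred, labels)
    loss.backward()              # autograd over the recorded dynamic graph
    optimizer.step()
    optimizer.clear_grad()
    print(f"epoch {epoch}, loss {float(loss):.4f}")

Static-graph training (tools/static_trainer.py) builds the whole computation graph first and then executes it, which generally trades flexibility for deployment- and performance-oriented optimization.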

Documentation

Background

Introductory tutorial

Advanced tutorial

FAQ

Support model list

| Type | Algorithm | Version | Paper |
| --- | --- | --- | --- |
| Content-Understanding | TextCnn | >=2.1.0 | [EMNLP 2014] Convolutional Neural Networks for Sentence Classification |
| Content-Understanding | TagSpace | >=2.1.0 | [EMNLP 2014] TagSpace: Semantic Embeddings from Hashtags |
| Match | DSSM | >=2.1.0 | [CIKM 2013] Learning Deep Structured Semantic Models for Web Search using Clickthrough Data |
| Match | MultiView-Simnet | >=2.1.0 | [WWW 2015] A Multi-View Deep Learning Approach for Cross Domain User Modeling in Recommendation Systems |
| Match | Match-Pyramid | >=2.1.0 | [2016] Text Matching as Image Recognition |
| Recall | TDM | 1.8.5 | [KDD 2018] Learning Tree-based Deep Model for Recommender Systems |
| Recall | fasttext | 1.8.5 | [EACL 2017] Bag of Tricks for Efficient Text Classification |
| Recall | MIND | >=2.1.0 | [2019] Multi-Interest Network with Dynamic Routing for Recommendation at Tmall |
| Recall | Word2Vec | >=2.1.0 | [NIPS 2013] Distributed Representations of Words and Phrases and their Compositionality |
| Recall | DeepWalk | >=2.1.0 | [SIGKDD 2014] DeepWalk: Online Learning of Social Representations |
| Recall | SSR | 1.8.5 | [SIGIR 2016] Multi-Rate Deep Learning for Temporal Recommendation |
| Recall | Gru4Rec | 1.8.5 | [2015] Session-based Recommendations with Recurrent Neural Networks |
| Recall | Youtube_dnn | 1.8.5 | [RecSys 2016] Deep Neural Networks for YouTube Recommendations |
| Recall | NCF | >=2.1.0 | [WWW 2017] Neural Collaborative Filtering |
| Recall | GNN | 1.8.5 | [AAAI 2019] Session-based Recommendation with Graph Neural Networks |
| Recall | RALM | 1.8.5 | [KDD 2019] Real-time Attention Based Look-alike Model for Recommender System |
| Rank | Logistic Regression | >=2.1.0 | / |
| Rank | Dnn | >=2.1.0 | / |
| Rank | FM | >=2.1.0 | [IEEE Data Mining 2010] Factorization Machines |
| Rank | FFM | >=2.1.0 | [RecSys 2016] Field-aware Factorization Machines for CTR Prediction |
| Rank | FNN | 1.8.5 | [ECIR 2016] Deep Learning over Multi-field Categorical Data |
| Rank | Deep Crossing | 1.8.5 | [ACM 2016] Deep Crossing: Web-Scale Modeling without Manually Crafted Combinatorial Features |
| Rank | Pnn | 1.8.5 | [ICDM 2016] Product-based Neural Networks for User Response Prediction |
| Rank | DCN | >=2.1.0 | [KDD 2017] Deep & Cross Network for Ad Click Predictions |
| Rank | NFM | 1.8.5 | [SIGIR 2017] Neural Factorization Machines for Sparse Predictive Analytics |
| Rank | AFM | 1.8.5 | [IJCAI 2017] Attentional Factorization Machines: Learning the Weight of Feature Interactions via Attention Networks |
| Rank | DMR | >=2.1.0 | [AAAI 2020] Deep Match to Rank Model for Personalized Click-Through Rate Prediction |
| Rank | DeepFM | >=2.1.0 | [IJCAI 2017] DeepFM: A Factorization-Machine based Neural Network for CTR Prediction |
| Rank | xDeepFM | >=2.1.0 | [KDD 2018] xDeepFM: Combining Explicit and Implicit Feature Interactions for Recommender Systems |
| Rank | DIN | >=2.1.0 | [KDD 2018] Deep Interest Network for Click-Through Rate Prediction |
| Rank | DIEN | >=2.1.0 | [AAAI 2019] Deep Interest Evolution Network for Click-Through Rate Prediction |
| Rank | dlrm | >=2.1.0 | [CoRR 2019] Deep Learning Recommendation Model for Personalization and Recommendation Systems |
| Rank | DeepFEFM | >=2.1.0 | [arXiv 2020] Field-Embedded Factorization Machines for Click-Through Rate Prediction |
| Rank | BST | 1.8.5 | [DLP-KDD 2019] Behavior Sequence Transformer for E-commerce Recommendation in Alibaba |
| Rank | AutoInt | 1.8.5 | [CIKM 2019] AutoInt: Automatic Feature Interaction Learning via Self-Attentive Neural Networks |
| Rank | Wide&Deep | >=2.1.0 | [DLRS 2016] Wide & Deep Learning for Recommender Systems |
| Rank | FGCNN | 1.8.5 | [WWW 2019] Feature Generation by Convolutional Neural Network for Click-Through Rate Prediction |
| Rank | Fibinet | 1.8.5 | [RecSys 2019] FiBiNET: Combining Feature Importance and Bilinear Feature Interaction for Click-Through Rate Prediction |
| Rank | Flen | 1.8.5 | [2019] FLEN: Leveraging Field for Scalable CTR Prediction |
| Multi-Task | PLE | >=2.1.0 | [RecSys 2020] Progressive Layered Extraction (PLE): A Novel Multi-Task Learning (MTL) Model for Personalized Recommendations |
| Multi-Task | ESMM | >=2.1.0 | [SIGIR 2018] Entire Space Multi-Task Model: An Effective Approach for Estimating Post-Click Conversion Rate |
| Multi-Task | MMOE | >=2.1.0 | [KDD 2018] Modeling Task Relationships in Multi-task Learning with Multi-gate Mixture-of-Experts |
| Multi-Task | ShareBottom | >=2.1.0 | [1998] Multitask Learning |
| Multi-Task | Maml | >=2.1.0 | [PMLR 2017] Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks |
| Re-Rank | Listwise | 1.8.5 | [2019] Sequential Evaluation and Generation Framework for Combinatorial Recommender System |

Community



Version history

  • 2021.11.19 - PaddleRec v2.2.0
  • 2021.05.19 - PaddleRec v2.1.0
  • 2021.01.29 - PaddleRec v2.0.0
  • 2020.10.12 - PaddleRec v1.8.5
  • 2020.06.17 - PaddleRec v0.1.0
  • 2020.06.03 - PaddleRec v0.0.2
  • 2020.05.14 - PaddleRec v0.0.1

License

Apache 2.0 license

Contact us

For any feedback, please open a GitHub Issue

You can also communicate with us in the following ways:

  • QQ group ID: 861717190
  • WeChat account: wxid_0xksppzk5p7f22
  • Add the remark "REC" and you will be added to the group automatically

     

(QR codes: PaddleRec QQ group | PaddleRec WeChat account)
