This article shows how to change a training program into an EDL distillation training program, and how to run the student with either a fixed teacher or a dynamic teacher.
```python
# 1. define an input to represent the teacher prediction
soft_label = fluid.data(name='soft_label', shape=[None, 10], dtype='float32')
inputs.append(soft_label)

# 2. define the DistillReader
dr = DistillReader(ins=['img', 'label'], predicts=['fc_0.tmp_2'])
train_reader = dr.set_sample_list_generator(train_reader)

# 3. define the distill loss
distill_loss = fluid.layers.cross_entropy(
    input=prediction, label=soft_label, soft_label=True)
distill_loss = fluid.layers.mean(distill_loss)
loss = distill_loss

# Start the distill train.
# Each `data` includes the original reader input plus the prediction
# obtained from the teacher, that is (img, label, soft_label).
for data in train_reader():
    metrics = exe.run(main_program, feed=data, fetch_list=[loss, acc])
```
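To make the distill loss concrete, here is a minimal NumPy sketch of soft-label cross entropy, independent of Paddle: instead of a one-hot label, the target is the teacher's probability distribution over the 10 classes. The function name and toy data are illustrative, not part of the EDL API.

```python
import numpy as np

def soft_cross_entropy(student_probs, teacher_probs, eps=1e-8):
    # -sum_k p_teacher(k) * log p_student(k), averaged over the batch
    per_sample = -np.sum(teacher_probs * np.log(student_probs + eps), axis=1)
    return float(np.mean(per_sample))

# toy batch: 2 samples, 10 classes
student = np.full((2, 10), 0.1)      # uniform student predictions
teacher = np.zeros((2, 10))
teacher[:, 0] = 1.0                  # fully confident teacher
loss = soft_cross_entropy(student, teacher)  # about -log(0.1) ~ 2.30
```

When the teacher is fully confident, this reduces to ordinary cross entropy against a hard label; softer teacher distributions carry extra information about class similarity, which is the point of distillation.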
Start the teacher with Paddle Serving:

```shell
python -m paddle_serving_server_gpu.serve \
    --model TEACHER_MODEL \
    --port TEACHER_PORT \
    --gpu_ids 0
```
Use `set_fixed_teacher` to set a fixed teacher:

```python
# see example/distill/mnist_distill/train_with_fleet.py
dr = DistillReader(ins=reader_ins, predicts=teacher_predicts)
dr.set_fixed_teacher(args.distill_teachers)
train_reader = dr.set_sample_list_generator(train_reader)
```
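Conceptually, the DistillReader wraps the original sample generator and appends the teacher's prediction to every sample. The following standalone sketch mimics that behavior; `fake_teacher`, `original_reader`, and `distill_reader` are hypothetical stand-ins, not the real EDL implementation (which calls a remote serving teacher).

```python
def fake_teacher(img):
    # stand-in for a remote teacher: returns a prediction over 10 classes
    return [0.1] * 10

def original_reader():
    # original samples are (img, label) pairs
    yield ([0.0] * 784, 3)

def distill_reader(reader, teacher):
    # wrap the reader so each sample becomes (img, label, soft_label)
    def wrapped():
        for img, label in reader():
            yield (img, label, teacher(img))
    return wrapped

train_reader = distill_reader(original_reader, fake_teacher)
sample = next(train_reader())  # (img, label, soft_label)
```

This is why the training loop above can feed `data` directly: each sample already carries the `soft_label` slot that the distill loss consumes.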
Run the student:

```shell
python train_with_fleet.py \
    --use_distill_service True \
    --distill_teachers TEACHER_IP:TEACHER_PORT
```
In addition to the teacher and the student, a discovery service and a database are required.
Once the database and discovery service are deployed, they can be reused permanently by different students and teachers.
Start the database:

```shell
redis-server
```

Start the discovery service:

```shell
python -m paddle_edl.distill.redis.balance_server \
    --db_endpoints REDIS_HOST:REDIS_PORT \
    --server DISCOVERY_IP:DISCOVERY_PORT
```

Register the teacher with the discovery service:

```shell
python -m paddle_edl.distill.redis.server_register \
    --db_endpoints REDIS_HOST:REDIS_PORT \
    --service_name TEACHER_SERVICE_NAME \
    --server TEACHER_IP:TEACHER_PORT
```
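The register/discovery pattern above can be sketched in a few lines of plain Python. This is a hypothetical in-memory mock for illustration only; the real implementation stores teacher endpoints in Redis, and the balance server hands them out to students.

```python
# service_name -> list of registered teacher endpoints
registry = {}

def register(service_name, endpoint):
    # what server_register does: record a teacher endpoint under a service name
    registry.setdefault(service_name, []).append(endpoint)

def discover(service_name, student_id):
    # what the balance server does: spread students across teachers
    teachers = registry[service_name]
    return teachers[student_id % len(teachers)]

register("mnist_teacher", "10.0.0.1:9292")
register("mnist_teacher", "10.0.0.2:9292")
ep0 = discover("mnist_teacher", 0)  # first student gets the first teacher
ep1 = discover("mnist_teacher", 1)  # second student gets the second
```

Because teachers register themselves by service name, they can join or leave at any time without students needing to know individual endpoints.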
Use `set_dynamic_teacher` to get a dynamic teacher from the discovery service:

```python
dr = DistillReader(ins=reader_ins, predicts=teacher_predicts)
dr.set_dynamic_teacher('DISCOVERY_IP:DISCOVERY_PORT', 'TEACHER_SERVICE_NAME')
train_reader = dr.set_sample_list_generator(train_reader)
```
Run the student:

```shell
python train_with_fleet.py --use_distill_service True
```
We have built Docker images for you, so you can start a demo on Kubernetes immediately: TBD