> Are Pre-trained Models and Our Own Training Data Allowed?

Hi,

Is it possible to work with pre-trained classification models? I also wonder whether it's okay to use our own training data together with the training data provided by the competition, or must we work with only the training data provided by the competition?

Best regards

Posted by: go @ Jan. 28, 2017, 11:55 a.m.

Hi
Thanks for your question. You must indeed use only the provided data. However, you can use your own training data, evaluate the results on the data provided via this challenge, and then submit your method and new results to the workshop. I am sure that would be pretty interesting.
Best
Shahab

Posted by: icv @ Jan. 28, 2017, 8:54 p.m.

Thank you for your quick reply. So, how about pre-trained models? Let's say I have a pre-trained model, such as AlexNet or a custom network, which I previously trained on some emotion dataset for an emotion estimation task. Is it possible to re-train such models with just the training data of the competition and to join the competition with the resulting models?

Posted by: go @ Jan. 29, 2017, 1:13 p.m.
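For illustration, a minimal sketch of what this kind of re-training (fine-tuning) could look like, assuming PyTorch and torchvision, which the thread does not specify; the class count and data loader are hypothetical placeholders:

```python
import torch
import torch.nn as nn
from torchvision import models

# Load an ImageNet pre-trained AlexNet (framework choice is an assumption, not from the thread)
model = models.alexnet(pretrained=True)

# Replace the final classifier layer to match the competition's label set
num_classes = 7  # hypothetical; set to the number of classes in the competition data
model.classifier[6] = nn.Linear(model.classifier[6].in_features, num_classes)

# Fine-tune using only the competition's provided training data
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

# for images, labels in competition_train_loader:  # hypothetical DataLoader over the provided train set
#     optimizer.zero_grad()
#     loss = criterion(model(images), labels)
#     loss.backward()
#     optimizer.step()
```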

Hi!

Once you submit your code to us, we will run it using only our own data. This, however, might yield results different from those in your submitted files, so it would be better to stick to the provided data.

Posted by: icv @ Jan. 31, 2017, 6:41 p.m.

Hi,
It's not very clear whether you require the code used to train the model.
Let's assume I upload and make publicly available a model that recognizes 50 emotions, along with the code to use this model for inference (i.e., on new test images), but WITHOUT the code to train this model. I can only submit a description of how and on which data it was trained, so that the training can be reproduced by others, but for some reason I am not able to provide the training code.
Will my entry be considered?

Best,
Boris

Posted by: bknyaz @ Feb. 13, 2017, 12:30 p.m.

Hi
You can submit the code privately to us and we will run it. However, if you do not agree to this, you cannot participate.
Best
iCV

Posted by: icv @ Feb. 13, 2017, 1:53 p.m.