
Relative entropy







It has been my impression that, in the immense space of Artificial Intelligence (AI) concepts and tools, Generative Adversarial Networks (GANs) stand aside as an untamed beast. Everybody realizes how powerful and cool they are, few know how to train them, and even fewer can actually find any use for them for a practical task. I might be wrong, so feel free to correct me. Meanwhile, I would like to take another look at this wonderful machinery and investigate its possible use for classification and embedding. For this purpose, we will stick to the following plan:

- Refresh our knowledge of GANs and the arguments for using WGANs as far as the training strategy is concerned.
- Train the WGAN on a set of objects with designed properties.
- Use the trained model to classify another set of objects and see if we can interpret this classification.

GANs consist of two parts - a discriminator and a generator. A discriminator is a function that takes in an object and converts it into a number. I purposely use the word “object” instead of “image” because I want us to think bigger. It could be any digital representation of a physical entity, a process, or a piece of information. Of course, the key here is that we are interested in a particular type of objects, and the discriminator and the generator deal with the same type of objects.

As in the case of the discriminator, the generator function had better have some complexity. Usually, a generator and a discriminator have a similar architecture, as we shall see in an example shortly.

(Figure: schematic representation of the generator part of a GAN. Image by the author.)
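To make the two parts concrete, here is a minimal sketch of a discriminator and a generator with mirrored architectures. It assumes PyTorch and flat-vector objects; the framework, layer sizes, and names such as OBJECT_DIM and LATENT_DIM are illustrative choices of mine, not taken from the article.

```python
# A minimal sketch, assuming PyTorch and flat-vector "objects"; the article's
# actual framework, layer sizes, and variable names are not shown in this excerpt.
import torch
import torch.nn as nn

OBJECT_DIM = 64   # hypothetical size of one flattened object
LATENT_DIM = 16   # hypothetical size of the generator's noise input

# Discriminator: takes in an object and converts it into a single number.
discriminator = nn.Sequential(
    nn.Linear(OBJECT_DIM, 128),
    nn.ReLU(),
    nn.Linear(128, 1),
)

# Generator: takes random noise and produces an object shaped like the data;
# it roughly mirrors the discriminator, as remarked above.
generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 128),
    nn.ReLU(),
    nn.Linear(128, OBJECT_DIM),
)

noise = torch.randn(8, LATENT_DIM)    # a batch of 8 noise vectors
fake_objects = generator(noise)       # 8 generated objects, shape (8, OBJECT_DIM)
scores = discriminator(fake_objects)  # one number per object, shape (8, 1)
```

The mirrored Linear-ReLU-Linear structure is only meant to illustrate the "similar architecture" remark; any pair of networks mapping noise to objects and objects to a single score fits the description above.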

So, apart from our model, generator + discriminator, we must have the data. The data gives examples of the objects that we want to work with. The objective is to teach the model to recognize these objects. The generator starts generating objects with the goal of making something similar to our data set.
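With the data in hand, the sketch below shows what one WGAN-style training step could look like under the same assumptions (PyTorch, flat-vector objects, illustrative sizes and learning rates): the critic is pushed to score real objects above generated ones, while the generator is pushed to produce objects the critic scores like the real data. The article's actual training loop, data pipeline, and constraints such as weight clipping or a gradient penalty are not shown in this excerpt and are omitted here.

```python
# A minimal sketch of one WGAN-style training step; all sizes, optimizers, and
# learning rates are illustrative assumptions, and the Lipschitz constraint
# (weight clipping or gradient penalty) is left out for brevity.
import torch
import torch.nn as nn

OBJECT_DIM, LATENT_DIM, BATCH = 64, 16, 32   # hypothetical sizes

critic = nn.Sequential(nn.Linear(OBJECT_DIM, 128), nn.ReLU(), nn.Linear(128, 1))
generator = nn.Sequential(nn.Linear(LATENT_DIM, 128), nn.ReLU(), nn.Linear(128, OBJECT_DIM))
opt_c = torch.optim.RMSprop(critic.parameters(), lr=5e-5)
opt_g = torch.optim.RMSprop(generator.parameters(), lr=5e-5)

real = torch.randn(BATCH, OBJECT_DIM)   # stand-in for a batch from the data set

# Critic step: push scores of real objects up and scores of generated objects down.
fake = generator(torch.randn(BATCH, LATENT_DIM)).detach()
critic_loss = critic(fake).mean() - critic(real).mean()
opt_c.zero_grad()
critic_loss.backward()
opt_c.step()

# Generator step: produce objects that the critic scores like the real data.
fake = generator(torch.randn(BATCH, LATENT_DIM))
gen_loss = -critic(fake).mean()
opt_g.zero_grad()
gen_loss.backward()
opt_g.step()
```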






