
Cross Entropy #26

Open · wants to merge 17 commits into base: master

Conversation

CeriAnneLaureyssens

First check-up file
Some layout work still has to be done: inserting more descriptive comments into the code itself, adding some examples of how and when to use the code, and (maybe) adding some visuals.
@MichielStock
Owner

Looks nice!

  • can you rename your notebook to just crossentropy?
  • to keep in tune with the course, I prefer 'objective' to 'loss'
  • Int64(floor(0.1*N)) => floor(Int, 0.1*N)
  • you might look to 'Algorithms for Optimization' for some additional pointers
  • explain it in words and with some formulas, and add a couple of plots
  • why not use a fixed number of samples? Similarly, I would just use a fixed number of elites: keep it simple!
  • @assert length(losses) == N looks quite unnecessary
  • all(weights .≈ 0.) won't work (check the isapprox docs); better use all(abs.(weights) .< 1e-8)
  • you can test them on some of the toy examples we used in the course. Would be fun to visualize the path!

Good luck!
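To illustrate two of the points above (a minimal sketch; `weights` and `N` are placeholder names, not the project's actual variables): `isapprox` (`≈`) uses a relative tolerance by default, so comparing against exactly `0.0` is essentially never true for nonzero values, and an explicit absolute tolerance is needed.

```julia
# Placeholder data, just to show the behavior being discussed.
weights = [1e-12, -2e-12]

# ≈ against 0.0 fails: the default tolerance is relative, which is
# useless when one side is exactly zero.
all(weights .≈ 0.0)          # false

# An explicit absolute tolerance does what was intended:
all(abs.(weights) .< 1e-8)   # true

# The suggested index conversion: same result, but idiomatic Julia.
N = 100
floor(Int, 0.1 * N)          # 10, same as Int64(floor(0.1 * N))
```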

@MichielStock
Owner

Something is wrong with the notebook file. Better fix this!

@CeriAnneLaureyssens
Author

Should be okay now; I made a mistake by deleting my old file and removing the folder completely. My bad!

@MichielStock
Owner

Take care of the mathematics: \text{argmin}, \log, etc. should be correctly typeset in math mode. If you use images, just copy the link, ![](link to figure), straight from the GitHub repo.
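As an illustration of the suggested typesetting (the objective $f$ here is only a placeholder, not the notebook's actual function):

```latex
% plain text:   theta_g star = argmin log f(theta)
% math mode:
$\theta_g^\ast = \underset{\theta}{\text{argmin}} \, \log f(\theta)$
```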

@menvdamm

Hi!

I just took a look at your notebook and I'm very impressed with your coding skills, nice job! The subject you chose definitely doesn't seem like one of the easiest, so I think the project might benefit from a little more explanation or some examples of what this technique is concretely used for.

Some other small remarks:

  • where you describe the implementation of the CE method: maybe put things like 'θgstar' and 'θiteration' in math mode with subscripts to make them look a little nicer
  • in the importance sampling example, in the code chunk of the loss function: a small typo, 'of intereste' instead of 'of interest'
  • as Michiel already said, changing 'loss function' to 'objective function' might make it a little easier to understand

@Triana-Forment

  • Use $f$ and $g$ for the names of the probability distributions, as well as for each variable used in a formula and then described in the text (D, E, X…)

  • Write $θ_{g}^*$ (in math mode) instead of plain 'θ gstar'

  • When you talk about code functions, I would use `Random.seed!(1234)` (code formatting) instead of plain-text Random.seed!(1234)

  • These are just minor aesthetic corrections that facilitate the reading.

  • The concept of 'batched' is not clear to me, so I tried to add batched = true as an argument when running the cross_entropy_method function (is_dist_opt = cross_entropy_method(l, is_dist_0; max_iter = 4, N=10000, weight_fn = w, verbose = true, batched = true)) and I obtained the error:

ArgumentError: invalid index: :x of type Symbol

to_index(::Symbol) @ indices.jl:300
to_index(::Vector{Dict{Symbol, Vector{Int64}}}, ::Symbol) @ indices.jl:277
…:333 [inlined]
…:325 [inlined]
…:1170 [inlined]
l(::Dict{Symbol, Vector{Distributions.Sampleable}}, ::Vector{Dict{Symbol, Vector{Int64}}}) @ Other: 3
var"#cross_entropy_method#1"(::Int64, ::Int64, ::Float64, ::Int64, ::Int64, ::typeof(Main.workspace6.w), ::Random._GLOBAL_RNG, ::Bool, ::Bool, ::Bool, ::Main.workspace4.var"#7#13", ::typeof(Main.workspace4.cross_entropy_method), ::typeof(Main.workspace5.l), ::Dict{Symbol, Vector{Distributions.Sampleable}}) @ Other: 29
top-level scope @ Local: 5 [inlined]

So maybe you could provide an example for running the function with the argument “batched = true”

  • I would add the second and third paragraphs of the README.md file to the notebook, to clarify what the dictionary that represents the distributions contains.

  • References are missing.

For the rest, good job; it doesn't seem like an easy algorithm to implement (e.g. the Distributions package is totally new to me…).
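On the batched = true question above: judging from the stack trace (the loss function l being handed a Vector of sample dictionaries and indexing it with :x), a plausible convention is that a batched objective must accept the whole vector of samples at once and return one loss per sample. A minimal sketch of such a function, where the name `l_batched` and the sample layout are assumptions for illustration, not the project's actual API:

```julia
# Hypothetical batched objective: receives ALL samples at once as a
# Vector of Dicts and returns a Vector of losses, one per sample.
# (A per-sample objective would instead index a single Dict, e.g. d[:x].)
function l_batched(d, samples::Vector)
    return [sum(abs2, s[:x]) for s in samples]
end

samples = [Dict(:x => [1, 2]), Dict(:x => [3, 4])]
l_batched(nothing, samples)   # [5, 25]
```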

@CeriAnneLaureyssens
Author

@MichielStock I have uploaded my final version on my GitHub repo and on Ufora

4 participants