* Kile and a LaTeX system;
* gcc 3.3 or 4.0, and KDevelop;
* the Boost development libraries (don't forget that one);
* Matlab 7.0.4 with all Toolboxes;
* Dia;
* a way to get the results on the Internet (CD writing, modem).
Tuesday, August 23, 2005
Monday, August 22, 2005
Last stuff
* Taxes (send requested info)
* Taxes (phone)
* Forwarding address
* Account Deletion
* Phone Gruber
* Deposit
* Bank ?
Wednesday, August 17, 2005
Tuesday, August 16, 2005
to do tuesday
* go to the bank, figure out what is going wrong...
* ubc keys
* phone medical care
* phone taxes, fax
* phone rto
* see nando
Friday, June 24, 2005
CV update
* Write the CV in English.
* Find a better way to specify potential tables... in Matlab
* Recode the MP algorithm with the new way of doing messages (??)
Thursday, June 23, 2005
At work again
For tomorrow:
* debug the loopy belief implementation
* Implement and test Gibbs, Tree Gibbs
* finish documentation
Thursday, May 05, 2005
Tree Gibbs working with FFBS
* Corrected the last bugs and errors. It now works correctly, without any initialization problems.
* Read the Doucet and West statistics paper on non-linear state models. Interesting, but their framework seems significantly different from ours. For one, their densities represent exact conditional probabilities rather than potentials, etc... They use particle filters to compute a single trajectory as a single particle.
-> speak with Nando, see what comes next
Wednesday, May 04, 2005
Implementation challenges
* Tried to correct the initialization bug found yesterday. The limits of my current design and my limited knowledge of C++ are becoming more evident with each passing day.
* In particular, templates don't mix very well with polymorphism. I now think templates should be used much more sparingly, in containers for example. If you know in advance which type an object will use, you don't need templates. Templates should rather be added later than sooner.
* Learned that copying via a pointer to a base class is not possible; you have to write your own virtual Copy or Clone function (see the sketch at the end of this entry).
-> finish the implementation of the workarounds for the bugs discovered
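To fix ideas, a minimal sketch of the virtual Clone idiom (the class names here are hypothetical, not the ones from my actual code):

```cpp
#include <vector>

// Hypothetical base class: copying through a Potential* would slice,
// so the copy is delegated to a virtual function.
class Potential {
public:
    virtual ~Potential() {}
    virtual Potential* Clone() const = 0;   // each subclass copies itself
};

class DiscretePotential : public Potential {
public:
    virtual Potential* Clone() const { return new DiscretePotential(*this); }
};

// Usage: deep-copy a container of base-class pointers.
std::vector<Potential*> clone_all(const std::vector<Potential*>& in) {
    std::vector<Potential*> out;
    for (std::vector<Potential*>::const_iterator it = in.begin(); it != in.end(); ++it)
        out.push_back((*it)->Clone());
    return out;
}
```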
Monday, May 02, 2005
Implementation work started again
* Read the PAMPAS paper for the reading group. It is interesting; essentially it is just loopy belief propagation with some particle filters.
* Found some issues apparently related to initialization, since the Gibbs sampler produces different results if it is run after a Tree_Gibbs_Sampler. This is probably because the potentials on the nodes, instead of being Constant, are left equal to something...
* Implemented FFBS for the Discrete and pairwise cases. Still has to be tested.
* Verified that Peter's explanation of FFBS has some errors.
-> correct the initialization issues
-> lots of code clean-ups?
Wednesday, April 27, 2005
Forward Filtering Backward Sampling
* Talked with Nando. Learned about FFBS: this allows one to sample from the joint distribution, not only from the marginals, in chains and trees. This is really important and has to be understood and implemented (a minimal sketch follows at the end of this entry).
-> learn in depth FFBS
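To make the idea concrete, here is a rough sketch of FFBS on a discrete chain with pairwise potentials. Everything here (the Table/Pair typedefs and the draw helper) is hypothetical and just for illustration; the real implementation will live in the graph code:

```cpp
#include <vector>
#include <cstdlib>

typedef std::vector<double> Table;                // unary potential / filtered message
typedef std::vector<std::vector<double> > Pair;   // pairwise potential psi[i][j]

// Hypothetical helper: draw an index proportionally to the (unnormalized) weights w.
int draw(const Table& w) {
    double total = 0.0;
    for (unsigned i = 0; i < w.size(); ++i) total += w[i];
    double u = total * std::rand() / (double)RAND_MAX;
    for (unsigned i = 0; i < w.size(); ++i) { u -= w[i]; if (u <= 0.0) return (int)i; }
    return (int)w.size() - 1;
}

// FFBS on a chain x_0 - x_1 - ... - x_{T-1}, with phi[t] the unary potential of x_t
// and psi[t] the pairwise potential coupling x_t and x_{t+1} (so psi has T-1 entries).
std::vector<int> ffbs(const std::vector<Table>& phi, const std::vector<Pair>& psi) {
    const unsigned T = phi.size(), K = phi[0].size();

    // Forward filtering: alpha[t][j] sums over all configurations of x_0..x_{t-1}.
    std::vector<Table> alpha(T, Table(K, 0.0));
    alpha[0] = phi[0];
    for (unsigned t = 1; t < T; ++t)
        for (unsigned j = 0; j < K; ++j)
            for (unsigned i = 0; i < K; ++i)
                alpha[t][j] += alpha[t-1][i] * psi[t-1][i][j] * phi[t][j];

    // Backward sampling: draw x_{T-1} from the last filtered table, then walk back.
    std::vector<int> x(T);
    x[T-1] = draw(alpha[T-1]);
    for (int t = (int)T - 2; t >= 0; --t) {
        Table w(K);
        for (unsigned i = 0; i < K; ++i)
            w[i] = alpha[t][i] * psi[t][i][x[t+1]];
        x[t] = draw(w);
    }
    return x;   // a sample from the joint, not just the marginals
}
```

On a tree the same two-pass idea applies, with the forward filtering replaced by message passing towards a root and the sampling done on the way back down.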
Tuesday, April 26, 2005
Tree Gibbs sampler ready for review
* More tests on the Tree Gibbs Sampler
* Spent some time on Design Changes
-> Talk to Nando to verify the work
Saturday, April 23, 2005
Tree Gibbs Sampler and Gibbs Sampler agree!!
* A big milestone was reached today: after some more bugs were discovered and fixed, the Tree Gibbs Sampler seems to work (at least on a simple 10-node graph)! Hurray!
* I didn't have time to read some C++, but I spent time learning more about C++ debuggers (on Linux). DDD sucks, but seems to be the best; you have the option to stick the command tool to the top, which improves the situation (a bit). There are still many things that bother me. I tried the GPS debugger, which seems more modern but, at first glance, is completely worthless (you can't even break on a variable...)
* I developed some self-made C++ practices (see the sketch at the end of this entry):
* Try to avoid loops over integer indices; use iterators whenever you can, the type system is much more likely to catch errors!
* If you use more than one for loop in a function, give names to the looping variables, don't just use i, j, etc.!
-> implement random graph generation for a single connected component
-> cosmetic changes...
-> speak with Nando and see what's up next
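A tiny illustration of the first practice (hypothetical function, just for the shape of the loop):

```cpp
#include <vector>
#include <string>

void print_lengths(const std::vector<std::string>& words) {
    // Iterator instead of a raw integer index: the type system checks more for us.
    for (std::vector<std::string>::const_iterator word_it = words.begin();
         word_it != words.end(); ++word_it) {
        // Named loop variable (word_it) instead of a bare i,
        // so nested loops elsewhere can't silently reuse it.
        std::size_t length = word_it->size();
        (void)length;   // placeholder for real work
    }
}
```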
Friday, April 22, 2005
More debugging
* Cosmetic changes, implemented a lot of debugging functions (display_subgraph, etc...)
* Found and fixed the crucial bug affecting message passing (which was induced by yesterday's fix). MP and Simple Gibbs now agree for MRFs!
* Started referring to MRFs as "pairwise graphs" since an MRF isn't (apparently) restricted to pairwise potentials
* Still bugs in Tree Gibbs. I found a *major one*: we need to allow the Potential on the nodes to be SingleDiscrete and not Constant!! Or we should find a better way of implementing that...
* New ideas for a better interface in RandomVariable (less friends, implementation better encapsulated). We could use each sample as "prior knowledge". There is still the problem of translating that prior knowledge into probabilities (that would have to be a public function). Maybe I'll spend some time thinking about it.
-> correct the Tree Gibbs Sampler
-> implement a random graph creation algorithm that creates connected graphs (use Boost algorithms)
-> if time allows, learn more about C++ (Parashift FAQ)
Thursday, April 21, 2005
Debugging
* Implemented the Boost random number system; it is nice, and it wasn't too difficult (a usage sketch follows at the end of this entry).
* Found a critical bug in the message passing algorithm for MRFs (messages were sent when they shouldn't have been). Fixed, but we still probably have bugs. Gibbs and message passing still don't agree...
* Cosmetic changes, almost every template is manually instantiated now, which is good. No more #include "xxx.cc" directives
-> continue debugging...
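Roughly the kind of usage involved, sketched from the Boost.Random documentation of that time (the distributions actually used in the code may differ):

```cpp
#include <boost/random.hpp>
#include <iostream>

int main() {
    boost::mt19937 rng(42u);   // Mersenne Twister engine, fixed seed for reproducibility

    // Uniform integer in [1, 6], e.g. to pick a state or a neighbour at random.
    boost::uniform_int<> six(1, 6);
    boost::variate_generator<boost::mt19937&, boost::uniform_int<> > die(rng, six);

    // Uniform real in [0, 1), e.g. for the acceptance step of a sampler.
    boost::uniform_real<> unit(0.0, 1.0);
    boost::variate_generator<boost::mt19937&, boost::uniform_real<> > coin(rng, unit);

    std::cout << die() << " " << coin() << std::endl;
    return 0;
}
```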
Wednesday, April 20, 2005
Tree Gibbs sampler finished
* finished the redesign of the RandomVariable class; it now has a lot of friends. Not ideal, but I must concentrate on the implementation for now...
* as always, small cosmetic changes, template instantiations, etc...
* The executable is now 20 MB (!) in size! I don't understand how it can get that big...
* Everything is up and running, but I *feel* unknown bugs lurk in the code. There are big differences between the Gibbs and Tree Gibbs results, and there shouldn't be... Some bug searching is now needed.
-> bug tracking, bug tracking, bug tracking
-> implement the Boost random number generator
Monday, April 18, 2005
Tree Gibbs Sampler running
* Bugs in Boost CVS fixed, now we have all that is needed (I think)
* Learned about friends and partial specialization of classes; you can't partially specialize just the member functions if you haven't partially specialized the class (though those functions can be defined outside the partially specialized class definition). A small sketch follows at the end of this entry.
* Manually instantiated the Node class
* Cosmetic changes throughout the code
* Started redesigning the RandomVariable class / interface
* I have the Tree Gibbs sampler running end to end. Now I need some comparisons to squash the remaining bugs...
-> finish the redesign of the RV class
-> finish completely at last the Tree Gibbs sampler
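A small sketch of the rule above (the Storage class and Dense tag are made up for the example):

```cpp
// Primary template: the general case.
template <typename T, typename Tag>
struct Storage {
    void store(const T&) { /* generic path */ }
};

struct Dense {};   // hypothetical tag type, just for this example

// Partial specialization of the *class*: you can't specialize just a member function.
template <typename T>
struct Storage<T, Dense> {
    void store(const T&);   // declared here...
};

// ...and defined outside the partially specialized class definition.
template <typename T>
void Storage<T, Dense>::store(const T&) { /* dense path */ }
```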
Saturday, April 16, 2005
Bugs in Boost CVS
* Some bugs were sorted out, but there are still bugs in the bundled properties implementation for subgraphs. Requested fixes on boost-users.
* Changes to the tree partition algorithm, related to memory control/management. It now takes a subgraph as an argument.
* Cosmetic changes regarding *::vertex_bundled, etc... Boost CVS allowed the use of traits
* Continued to manually instantiate templates
-> finish the Tree Gibbs sampler (debug, etc...)
-> redesign the Random Variable implementation and interface
-> update to Boost CVS at UBC
Friday, April 15, 2005
Update to Boost CVS
* Started the implementation of Tree Gibbs sampler, updated to Boost CVS because I needed bundled properties with subgraphs.
* Problems with type_traits in Boost CVS, that's a showstopper
* Thought about the possible reimplementation of large parts of the code to better encapsulate; need to redesign interfaces...
-> sort out the Boost issue to complete implementation
Thursday, April 14, 2005
Tree Gibbs sampler
* Worked on the design of the Gibbs tree sampler
* We will use subgraphs; it is problematic that bundled properties are not implemented for them
-> implement Tree Gibbs Sampling
Wednesday, April 13, 2005
Read of Chapter 22
* started reading chapter 22 of Jordan's book on variational methods. Seems very interesting
-> finish reading of Chapter 22
Tuesday, April 12, 2005
Tree partition finished
* Finished writing and debugging the C++ input code using streams
* Finished debugging the tree partition algorithm. It works fine with a 5x5 MRF, and with the test graph written for the draft paper back in Russia.
* Cosmetic code changes, creation of new files, etc...
-> write a Rao Blackwell function (or generalized Gibbs sampler) to work with trees, in the pairwise potential case
Saturday, April 09, 2005
Standard C++ Library: IO Streams
* spent the day essentially learning C++ streams. Many things to keep in mind:
* I/O and parsing in C++ are tedious tasks!! Almost everything has to be done by hand, and the resulting code can be very buggy and hard to maintain...
* 2 basic levels of functionality: the >> operators are for very simple parsing tasks, for common situations. But should you need to write complex or even non-trivial I/O, you need to learn about the member functions (get, getline, etc... and a ton of other things). A small sketch follows at the end of this entry.
* It may thus be worth it to learn the Boost parsing library or the Boost regexp library
* Only practice can help remember the many issues associated with I/O streams, I think.
* the Parashift C++ FAQ is really well documented; bookmarked it.
* Learned about friends; inline functions should be declared in the implementation
* Headers are not meant to prevent access to your classes; they are first of all a tool for the programmer, not a security measure
* downloaded a C++ Standard Library book
* wrote a test function to read a graph from a file using a simple language; needs to be finished and debugged
-> finish debugging the I/O code
-> write a random potential and / or random variable generator
-> the remaining goals from yesterday
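A small sketch of the two levels, with a made-up "name value" line format and a hypothetical graph.txt file:

```cpp
#include <fstream>
#include <sstream>
#include <string>
#include <iostream>

int main() {
    std::ifstream in("graph.txt");   // hypothetical input file

    // Level 1: operator>> for the simple, whitespace-separated case.
    std::string name;
    double value;
    while (in >> name >> value)
        std::cout << name << " = " << value << "\n";

    // Level 2: member functions for anything less regular; read a whole
    // line with getline, then parse it by hand (here via a stringstream).
    in.clear();
    in.seekg(0);
    std::string line;
    while (std::getline(in, line)) {
        if (line.empty() || line[0] == '#') continue;   // skip comments
        std::istringstream fields(line);
        fields >> name >> value;
    }
    return 0;
}
```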
Friday, April 08, 2005
Template Specialization
* tried explicit template instantiation. Seems to work fine, we should maybe do that throughout the code
* Learned a lot about C++. A templated function can't be partially specialized, nor can it have default template arguments. A workaround is just to do partial specialization on a class that can be used as a function (see the sketch at the end of this entry).
* finished debugging tree partition. It is now able to produce the correct result in a simple test case
* BGL: be very careful with add_edge and remove_edge; they invalidate iterators
* Made make_random_graph truly generic
* cleaned up the code (minor, mainly converting i++ to ++i)
-> continue to test tree partition on more complex cases, clean up code and comments
-> continue to work on genericity
-> reorganize current files, continue clean-up, maybe try to do explicit instantiation everywhere
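A sketch of the workaround: the logic goes into a class template, which can be partially specialized, and a thin function template just forwards to it (the Normalize name is made up):

```cpp
#include <vector>

// A free function template cannot be partially specialized, so the logic
// goes into a class template ("a class that can be used as a function").
template <typename T>
struct Normalize {
    static void run(std::vector<T>&) { /* generic normalization */ }
};

// Partial specialization: legal on the class template.
template <typename T>
struct Normalize<std::vector<T> > {
    static void run(std::vector<std::vector<T> >&) { /* nested case */ }
};

// Thin user-facing function template that simply dispatches to the class.
template <typename T>
void normalize(std::vector<T>& values) { Normalize<T>::run(values); }
```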
Wednesday, April 06, 2005
Template compilation, test of Tree Partitioning
* lots of things learned about template compilation. GCC is not yet optimal for this; one way out of the problem would be to explicitly instantiate the templates...
* C++: learned about for loops (seems funny to write that!!), and about the difference between ++i and i++; ++i is preferred.
* Tested and fixed lots of bugs in the tree partitioning. Much work still needs to be done
* Made make_random_graph generic
-> finish testing the tree partition, and make some generic changes
Tree partition implementation (end)
* Learned about typedefs in more detail; an iterator behaves like a pointer, so you can recover the type it points to; fixed/improved things in the code.
* BGL: learned a lot about subgraphs; the root graph MUST itself be a subgraph. Mailed boost-users with a question about subgraphs. (A small sketch follows at the end of this entry.)
* First attempt at cleaning up the code with respect to templates; learned about forward-declaring a templated class; many linker errors encountered, better understanding of template instantiation and compilation
* Completed the first version of tree partitioning, needs testing.
-> pursue work on template compilation models
-> test the tree partition implementation
-> try to render things more generic (changing vecS to listS)
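Roughly what that looks like, sketched from the BGL subgraph documentation (the graph typedef here is not the one from my code):

```cpp
#include <boost/graph/adjacency_list.hpp>
#include <boost/graph/subgraph.hpp>

int main() {
    using namespace boost;

    // The subgraph machinery requires an edge_index property, and the root
    // graph itself must be wrapped in subgraph<> (the point noted above).
    typedef subgraph< adjacency_list<vecS, vecS, undirectedS,
                                     no_property,
                                     property<edge_index_t, int> > > Graph;

    Graph root(4);                 // root graph with 4 vertices
    add_edge(0, 1, root);
    add_edge(1, 2, root);
    add_edge(2, 3, root);

    // Children are created from the root and reference its vertices.
    Graph& tree = root.create_subgraph();
    add_vertex(0, tree);           // global vertex 0 becomes local to 'tree'
    add_vertex(1, tree);

    return 0;
}
```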
Tuesday, April 05, 2005
Tree partition implementation
* Learned about the subgraph feature of BGL. Seems OK, but we will definitely need bundled properties...
* Learned more about property maps (again). You can access them, in an adjacency_list, with the get() function. It suddenly makes more sense. I think I am beginning to have a good understanding of the BGL. (See the sketch at the end of this entry.)
* Made a first attempt at an implementation of graph partitioning. It seems to have a backtrack size of 1 (the backtrack is not cleanly coded, though). It should have all the functionality of the previous work, except the simplification of one- and two-connectivity (which is important).
-> finish the first implementation (shouldn't require long). Compile and test
-> review Gibbs sampler, and MP code (optimize, etc...)
-> still need to learn about (nested) typedefs
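The kind of access meant above, sketched with a hypothetical bundled vertex property:

```cpp
#include <boost/graph/adjacency_list.hpp>
#include <boost/tuple/tuple.hpp>
#include <iostream>

// Hypothetical bundled property; the real code uses its own node structure.
struct NodeInfo { int state; };

int main() {
    using namespace boost;
    typedef adjacency_list<vecS, vecS, undirectedS, NodeInfo> Graph;

    Graph g(3);
    add_edge(0, 1, g);

    // get() returns a property map; here the built-in vertex index map.
    property_map<Graph, vertex_index_t>::type index = get(vertex_index, g);

    graph_traits<Graph>::vertex_iterator vi, vi_end;
    for (tie(vi, vi_end) = vertices(g); vi != vi_end; ++vi) {
        g[*vi].state = 0;                           // bundled properties: g[v].member
        std::cout << "vertex " << get(index, *vi) << "\n";
    }
    return 0;
}
```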
Friday, April 01, 2005
Tree partition design
* Learned more about BGL: interior properties effectively means they are stored by value inside the graph; exterior properties are not stored in the graph.
* To be more generic, our MP (and probably Gibbs too) code should sometimes take a vertex index map. As it is, it works only with a vector implementation for the adjacency list, which seems to contain a built-in index map. (A sketch of passing an explicit index map is at the end of this entry.)
* Two bugs in the BGL, which could be a problem: remove_edge sometimes fails, and bundled properties are not implemented for subgraphs.
-> Mail the BGL list about how to access the built-in index map for a vector-based graph
-> work on the implementation of graph partitioning
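A sketch of what passing an explicit vertex index map could look like with a listS-based graph (names are hypothetical, and the header paths are those of 2005-era Boost):

```cpp
#include <boost/graph/adjacency_list.hpp>
#include <boost/property_map.hpp>
#include <boost/tuple/tuple.hpp>
#include <map>

using namespace boost;

// With listS there is no built-in vertex_index, so algorithms that need one
// must receive an index map explicitly, which keeps them generic.
typedef adjacency_list<listS, listS, undirectedS> Graph;
typedef graph_traits<Graph>::vertex_descriptor Vertex;
typedef std::map<Vertex, int> IndexStorage;
typedef associative_property_map<IndexStorage> IndexMap;

template <typename G, typename VertexIndexMap>
void message_passing(const G& g, VertexIndexMap index) {
    // hypothetical algorithm body: use get(index, v) instead of assuming vecS
}

int main() {
    Graph g;
    for (int i = 0; i < 4; ++i) add_vertex(g);

    IndexStorage storage;
    IndexMap index(storage);

    int next_index = 0;
    graph_traits<Graph>::vertex_iterator vi, vi_end;
    for (tie(vi, vi_end) = vertices(g); vi != vi_end; ++vi)
        put(index, *vi, next_index++);

    message_passing(g, index);
    return 0;
}
```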
Last bugs fixed on Message Passing
* Fixed the bug discovered yesterday. Found a lot more (in the code implementing the sum-product for a Potential node, and the code storing the product_messages of a Potential node).
* Gibbs sampler and MP code are probably bug-free at last. At least, their results seem to converge to the same values at last...
* General clean-ups of a lot of stuff in the code.
* Thinking about the implementation of the graph partition algorithm
-> start implementing the graph partition algorithm
-> optimize the Gibbs sampler (don't create and destroy the functors every time)
Thursday, March 31, 2005
Reading old Graph partition code
* Completed the simple generic Gibbs sampler code; it should be working
* Minor fixes throughout the code
* Detected a bug in the Message Passing code for FGs; specifically, when a Potential node sends a message. The product of the messages is wrong in the second phase (when messages come back from the root): the whole product of messages is used, whereas we need to remove from that product the message from the Node we are sending to. Needs to be fixed (urgent). (A sketch of the corrected update is at the end of this entry.)
* Read the old code partitioning a graph into trees for Rao-Blackwellisation (code dating back to Summer 2004). The code base is very ugly (I really code better now, which is a good thing to see :-), but understandable. Basically, we do a partition by choosing the node with the lowest connectivity (ties are resolved in favor of the most recently "selected" node). We have a backtrack of depth 1, in case we are about to hit a "dead end" (previously called a full-looping vertex, but the terminology needs to be changed). More on this algorithm soon...
* Started a code clean-up (cosmetic changes), on every file.
-> fix the MP bug;
-> continue the code clean up;
-> think about the possible implementation of the old Partitioning code
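A small sketch of the corrected update, in a simplified discrete, pairwise setting (the real code works on Potential objects; the types here are made up):

```cpp
#include <vector>

typedef std::vector<double> Message;                       // one entry per state
typedef std::vector<std::vector<double> > PairPotential;   // psi[i][j]

// Message from a pairwise Potential node to its target variable j:
// sum over the other variable i of psi(i, j) times the *incoming* message
// from i. The message coming back from the target itself must NOT be
// included in the product (this was the bug described above).
Message potential_to_variable(const PairPotential& psi,
                              const Message& msg_from_other_neighbour) {
    const unsigned K = psi.size();
    Message out(psi[0].size(), 0.0);
    for (unsigned j = 0; j < out.size(); ++j)
        for (unsigned i = 0; i < K; ++i)
            out[j] += psi[i][j] * msg_from_other_neighbour[i];
    return out;
}
```

With more than two neighbours, the outgoing message uses the product of the incoming messages from all neighbours except the target.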
Wednesday, March 30, 2005
Starting on Gibbs sampler Generic work
* C++: if you declare a function virtual and don't override it in one of the subclasses, that subclass will use the definition from the base class. If the function is pure virtual, of course, that won't work. Also, on this matter the hierarchy works as expected: the most derived definition in the hierarchy is used. (A small example follows at the end of this entry.)
* Tested generic implementation of Message Passing, seems fine now
* Worked on the generic implementation of the Gibbs sampler. The same problems as for the MP algorithm were there, and the same ugly solution was used (a switch on types); to do it elegantly, template specialization would really be needed.
* Problems with headers and templates are getting worse and worse. I have to spend time on that...
* In the Gibbs sampler, you need to propagate the value chosen by a variable to its Potentials.
-> test and finish Gibbs sampler
-> learn about typedefs, and if it is possible to obtain the base type from a pointer type
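The behaviour described in the first bullet, in a minimal example (hypothetical class names):

```cpp
#include <iostream>

struct Sampler {
    virtual ~Sampler() {}
    virtual void step() { std::cout << "base step\n"; }   // non-pure: subclasses inherit it
};

struct PlainGibbs : Sampler {};                            // no override: uses Sampler::step

struct TreeGibbs : Sampler {
    virtual void step() { std::cout << "tree step\n"; }    // deepest override wins
};

int main() {
    Sampler* samplers[] = { new PlainGibbs, new TreeGibbs };
    samplers[0]->step();   // prints "base step"
    samplers[1]->step();   // prints "tree step"
    delete samplers[0];
    delete samplers[1];
    return 0;
}
```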
Tuesday, March 29, 2005
Implementation, continued
* Finished the implementation of generic Message Passing. But the result is ugly, with a switch on type, etc... The implementation could be way better with a rewrite. For this we would need to do template specialization, and mark the Potential and Variable MessageNodes as subclasses of FactorGraphMessageNode (for example). We could then rewrite a lot of stuff more nicely.
* Implemented a sum over a Potential. It is not general, but it is well implemented, I believe, for Discrete Potentials, RVs, and Factor Graphs. It allows us to actually send the messages from the Potential Node.
-> test our implementation of Message Passing
-> write a generic version of the Gibbs Sampler (simple)
Saturday, March 26, 2005
Work on Generic Implementation of Message Passing...
* Learned a lot about C++. Implementation went on, but at a slow pace
* learned that polymorphism can only be used, in essence, through pointers (probably through references too, have to check). It is also very important to note that containers are not meant to hold objects of different classes (for use with polymorphism); this is because a vector stores each element in a fixed amount of memory. If a container holds objects and not pointers, the objects will be sliced down to the base class!! Which is of course a disaster at run time. (See the sketch at the end of this entry.)
* learned about RTTI. There is a way to do a dynamic downcast at runtime, if you absolutely can't use virtual functions (which is my case with the visitors, for differentiating between a Factor Graph and an MRF). For now I implemented that myself "poorly" (with a switch statement and integers).
* Since objects in a container must generally be assignable, classes with const members are not a good choice if they must be stored in containers...
* Lots of other small C++ things learnt.
* As far as implementation goes:
* To support different types of Graphs in the algorithms, there are basically two solutions. One is to maintain, in the Graph containers, pointers to the base class and use polymorphism. This is not ideal since in many cases polymorphism can't really work and you have to do type checking yourself... The other solution is to store objects in the Graph, meaning you can't use polymorphism, and you have to explicitly specialize your visitor for the different types of Graph you can have. That seems cleaner... but maybe less generic in the long run...
* For now I have still chosen solution 1.
* Fixed an important bug in the initialization of a Potential (when adding variable). This should be ok now.
* Code for message passing on an MRF is again functional via polymorphism. The Gibbs sampler is not ready
* I should really start to add typedefs in my code (and learn about them). Other question: can you get the type of something if you only have the type of the pointer?? For now there are ugly types (added by hand) throughout the Message Passing visitors...
-> continue towards the implementation of message passing for Factor Graphs (using the currently chosen solution with pointers). Now we should be *really* close; only a little more testing, implementation, and the routines to sum recursively over something are needed.
-> reimplement a generic Gibbs sampler.
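A minimal illustration of the slicing problem and the pointer-based alternative (hypothetical class names):

```cpp
#include <vector>
#include <iostream>

struct Node {
    virtual ~Node() {}
    virtual const char* kind() const { return "Node"; }
};

struct PotentialNode : Node {
    virtual const char* kind() const { return "PotentialNode"; }
};

int main() {
    // Storing objects by value slices them down to the base class.
    std::vector<Node> by_value;
    by_value.push_back(PotentialNode());
    std::cout << by_value[0].kind() << "\n";      // prints "Node": the derived part is gone

    // Storing pointers preserves the dynamic type, so polymorphism works.
    std::vector<Node*> by_pointer;
    by_pointer.push_back(new PotentialNode());
    std::cout << by_pointer[0]->kind() << "\n";   // prints "PotentialNode"

    // RTTI: a checked downcast when virtual functions can't express the dispatch.
    if (dynamic_cast<PotentialNode*>(by_pointer[0]) != 0)
        std::cout << "downcast ok\n";

    delete by_pointer[0];
    return 0;
}
```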
Friday, March 25, 2005
Factor Graphs, and message passing
* learned about template specialization (explicit). Seems useful, but not ideal. Note that this serves a different purpose than polymorphism, since it is done at compile-time vs. run-time.
* this template specialization can even be done with traits, and template parameters can have default arguments (this is how nice things can be done). See the sketch at the end of this entry.
* learned that virtual templated functions don't exist (CAN'T exist). However, templated member functions are fine. Virtual functions inside a templated class are also fine
* heard of the book "Modern C++ design"
* began work towards an implementation of message passing for Factor Graphs. The goal is to create an algorithm that will work independently of whether the graph is a Factor Graph or an MRF. For that I derived the base class MessageNode; I use a virtual send_message. However, due to problems, much of the implementation is somewhat ugly, hacked into the visitor, which checks the type of the node and then does the message passing accordingly.
* created new files "GraphAlgorithms.h" and "GraphAlgorithms.cc"
* some thinking about the implementation of Loopy Belief Propagation; seems not that easy with our current way of doing things
-> finish the implementation of Message passing for Factor graphs (compile; lots of work still)
-> run it, test
-> begin implementing Gibbs sampler for factor graphs.
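A small sketch of the traits / default-template-argument pattern (all names here are made up; this is not the actual design):

```cpp
#include <iostream>

// Hypothetical traits class: maps a graph type to a category tag.
struct mrf_tag {};
struct factor_graph_tag {};

template <typename Graph>
struct graph_category { typedef mrf_tag type; };        // default: treat as an MRF

struct MyFactorGraph {};
template <>
struct graph_category<MyFactorGraph> { typedef factor_graph_tag type; };

// The algorithm class takes the category as a defaulted template parameter,
// and specializations select the right behaviour at compile time.
template <typename Graph, typename Category = typename graph_category<Graph>::type>
struct SendMessage {
    static void run(const Graph&) { std::cout << "MRF message\n"; }
};

template <typename Graph>
struct SendMessage<Graph, factor_graph_tag> {
    static void run(const Graph&) { std::cout << "factor-graph message\n"; }
};

struct MyMRF {};

int main() {
    SendMessage<MyMRF>::run(MyMRF());                 // picks the primary template
    SendMessage<MyFactorGraph>::run(MyFactorGraph()); // picks the specialization
    return 0;
}
```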
Thursday, March 24, 2005
things to do
* send an email about the restaurant (Malek et al.)
* apply for tuition deferral
* do taxes 2005
* check taxes 2004 if check not received at end of April
* update my web site
* write letters to a lot of people
* write emails to a lot of people
* write Jean-Yves Defay about Marina