13 Comments:
Your algorithm reminds me of my own life. It's stuck in a local minimum too :(
By Anonymous, At 7/20/2005 10:05 AM
[I happened by thanks to a Technorati feed I read...]
I tune evolutionary algorithms for a living (and as my academic research), and I'd be happy to offer what little advice I can, if you want it.
By Anonymous, At 7/20/2005 11:50 AM
Omid, speaking of life, I think we should be after a global maximum, not a minimum! As you know, these days I am a stick in the mud. The good things come and go, while I sit here at my laptop trying to write a few paragraphs of my thesis.
Bill, thanks for dropping by. Technorati.com is an interesting site. Unfortunately, I cannot afford to pay you to tune my multiobjective GA; I am using MATLAB and its "gatool" for now. Cheers.
By Azad, At 7/20/2005 3:22 PM
No, not for money! It's just an offer to help.
By Anonymous, At 7/20/2005 3:23 PM
Have you blogged anywhere about what you're doing? I'm just curious.
By Anonymous, At 7/20/2005 3:24 PM
Thanks Bill, it's very kind of you. I have not blogged anywhere on what I am doing. As you may have noticed, the current blog is only a reflection of my daily personal life for my friends and family. I am glad that I am finding new contacts.
As for what I am doing: I am working on my MASc thesis in Electrical Engineering at Ryerson University, Toronto, ON. My interests are robotics and vision; more specifically, I am working on design issues in robotic visual servoing (feeding back visual information to control a robot).
Somewhere in my analysis, I obtain a 2x3 matrix, a(X). I am interested in the two singular values (SVs) of this matrix, and I want to find the X for which the two SVs are equal. So far I am doing well with the GA: I have defined a fitness function and found the desired X. Now, suppose that I stack three of these 2x3 matrices on top of each other to form a 6x3 matrix:
A(X,Y,Z)=[a(X);a(Y);a(Z)]
This matrix has rank 3 and thus three SVs. I still want the three SVs to be as close to each other as possible (by convention, s1 >= s2 >= s3). Here is where I have problems: I get stuck at a local minimum each time I run the algorithm. I thought of introducing multiple objectives, but I am still studying this. Let me know if this is sufficient, or if you need more detailed info to comment on the problem. Cheers.
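To make the setup concrete, here is a minimal NumPy sketch of the kind of fitness I mean. The function `a_of` is a made-up placeholder (my real a(X) comes from the visual-servoing analysis); the fitness penalizes the spread of the stacked matrix's singular values and is zero exactly when they are all equal:

```python
import numpy as np

def a_of(p):
    """Hypothetical stand-in for a(X); the real 2x3 matrix
    comes from the visual-servoing analysis."""
    x, y, z = p
    return np.array([[1.0, 0.0, x],
                     [0.0, 1.0, y]]) * (1.0 + z ** 2)

def fitness(points):
    """points: the three 3-vectors X, Y, Z.
    Stack the three 2x3 matrices into a 6x3 matrix and
    penalize the spread of its three singular values."""
    A = np.vstack([a_of(p) for p in points])   # 6x3 matrix A(X, Y, Z)
    s = np.linalg.svd(A, compute_uv=False)     # s[0] >= s[1] >= s[2]
    return s[0] - s[2]                         # zero iff all SVs are equal

pts = [np.array([0.1, 0.2, 0.3]),
       np.array([0.4, 0.5, 0.6]),
       np.array([0.7, 0.8, 0.9])]
print(fitness(pts))
```

The GA then minimizes this scalar over the nine coordinates of X, Y, Z.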
By Azad, At 7/20/2005 4:00 PM
Hmmmmm,
It made me think....
By Farid, At 7/20/2005 4:05 PM
You have a solution, Farid?
By Azad, At 7/20/2005 4:31 PM
Three things come to mind, which might be of help:
First, are the Xs in a continuous space? If so, the recombination-biased search operators used by most off-the-shelf genetic algorithm packages might not be appropriate; you may want to explore real-valued Gaussian-mutation search (or even hillclimbing with that operator), or at least shift the usual emphasis away from recombination and towards mutation.
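By hillclimbing with that operator I mean something as bare-bones as this sketch; the `sphere` objective is just a stand-in for your SV-spread fitness:

```python
import random

def hillclimb(fitness, x0, sigma=0.1, iters=2000, seed=1):
    """(1+1) hillclimber: perturb every coordinate with Gaussian
    noise of std-dev sigma, keep the child only if it is no worse."""
    rng = random.Random(seed)
    x, fx = list(x0), fitness(x0)
    for _ in range(iters):
        child = [xi + rng.gauss(0.0, sigma) for xi in x]
        fc = fitness(child)
        if fc <= fx:            # minimizing: accept improvements and ties
            x, fx = child, fc
    return x, fx

# stand-in objective: squared distance from the point (1, 2, 3)
sphere = lambda v: sum((vi - ti) ** 2 for vi, ti in zip(v, (1.0, 2.0, 3.0)))
best, cost = hillclimb(sphere, [0.0, 0.0, 0.0])
print(cost)
```

Tuning sigma is the whole game here, which is part of why I asked whether the space is continuous.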
Also, if there are lots of local minima, something like simulated annealing may do better, especially with such a low-dimensional constrained problem. GAs are generally better if the search space is structured in a certain way, and it sounds like SA in this case might speed up the process of converging on a good solution.
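The annealing loop is the same kind of Gaussian step with a temperature-controlled acceptance rule; again, the multimodal toy objective below just stands in for your real fitness:

```python
import math
import random

def anneal(cost, x0, t0=1.0, alpha=0.995, sigma=0.1, iters=5000, seed=2):
    """Simulated annealing with geometric cooling: accept worse
    moves with probability exp(-delta / T), shrinking T each step."""
    rng = random.Random(seed)
    x, fx = list(x0), cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        cand = [xi + rng.gauss(0.0, sigma) for xi in x]
        fc = cost(cand)
        delta = fc - fx
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= alpha               # geometric cooling schedule
    return best, fbest

# toy multimodal objective (many local minima, global minimum 0 at the origin)
rastrigin = lambda v: 10 * len(v) + sum(
    vi * vi - 10 * math.cos(2 * math.pi * vi) for vi in v)
best, fbest = anneal(rastrigin, [2.5, -2.5, 2.5])
print(fbest)
```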
You're just trying to find a satisficing solution, right? Not a global optimum?
Actually, the thing I'm wondering is whether you've tried searching in Mathematica, which has a new SVD module that might do the trick deterministically.
[You can email me if this made any sense (or if not) -- no need to clutter up your blog if you don't want!]
By Anonymous, At 7/20/2005 7:18 PM
Bill, I tried Maple to solve the SVD analytically. For the 2x3 matrix it is straightforward, and there is no need for a GA or any other optimization method. However, even for two points, i.e., a 4x3 matrix, the analytical solution expands to several lines, and I couldn't relate the analytical form to the GA's local minima. I am still working on it.
As for the type of X, it is continuous, and I used Gaussian mutation (found by trial and error with MATLAB's gatool). I have a feeling that the search space is structured in a particular way, which is why I insisted on using a GA; I cannot prove it, it is mere intuition. Besides, I do not have a low-dimensional problem: the variables X, Y, Z, ... are points in Cartesian space, so each one has three parameters. Given that I need to work with at least 4 points (another constraint of my problem), I have at least 12 variables to play with. Furthermore, this is only the first step of my algorithm. The next step is to combine those A matrices from multiple instances, say A_1 to A_20, and find the optimized variables using some multiobjective optimization (MOO) algorithm, such as a genetic multiobjective optimization algorithm.
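To make the dimensionality concrete, this is roughly how I encode the problem; `a_of` is again a made-up placeholder for my actual a(X), and the per-instance perturbations are just a stand-in for however A_1 to A_20 differ in practice. Each instance contributes one objective value, which is where the multiobjective formulation comes in:

```python
import numpy as np

def a_of(p):
    # hypothetical placeholder for the real 2x3 matrix a(X)
    x, y, z = p
    return np.array([[1.0,  z, -y],
                     [-z, 1.0,  x]])

def decode(v):
    """v: flat vector of 12 variables -> four 3-D points."""
    return np.asarray(v, dtype=float).reshape(4, 3)

def sv_spread(points):
    """Stack four 2x3 blocks into an 8x3 matrix; the objective is
    the spread of its three singular values."""
    A = np.vstack([a_of(p) for p in points])
    s = np.linalg.svd(A, compute_uv=False)
    return s[0] - s[-1]

def objectives(v, instances):
    """One objective value per instance A_i; 'instances' maps the
    decoded points to each instance's configuration."""
    pts = decode(v)
    return [sv_spread(f(pts)) for f in instances]

v0 = np.linspace(-1.0, 1.0, 12)          # the 12 decision variables
instances = [lambda p, k=k: p + 0.01 * k for k in range(3)]
print(objectives(v0, instances))
```

The MOO algorithm would then search the 12-dimensional space for solutions that make the whole objective vector small at once.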
Sometimes I wonder whether I am headed in the right direction. I could be groping in the dark!
By Azad, At 7/22/2005 1:49 PM
Drop me an email.
If it's still converging prematurely, it may just be that the mutation rate is too low -- or too high. I would still suggest simulated annealing as a useful way to overcome the local minima, since it provides a single tunable temperature parameter that makes it easier to nudge your search away from the local minima.
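Concretely, the temperature is the one knob: for a given uphill move of size delta (a made-up value below), the Metropolis rule accepts it often when T is high and almost never when T is low:

```python
import math

def accept_prob(delta, t):
    """Metropolis acceptance probability for an uphill move of size delta."""
    return math.exp(-delta / t)

delta = 0.5                     # hypothetical cost increase
hot = accept_prob(delta, 2.0)   # early in the run: exploratory
cold = accept_prob(delta, 0.05) # late in the run: effectively greedy
print(hot, cold)
```

Sweeping T from hot to cold is what lets the search climb out of a local minimum early on and still settle down at the end.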
By Anonymous, At 7/30/2005 10:21 AM