The optim() function

Post your questions, answers, comments, or suggestions here - topics will later be sorted into the archives by the moderators

Moderator: Moderators group

naima oumouhou
Posts: 34
Joined: 04 Mar 2008, 13:39

The optim() function

Post by naima oumouhou » 16 Oct 2008, 15:12

Hello everyone,

Does anyone know how to display the parameter values at each iteration of the optim() function? I only know how to display the value of the objective function at each iteration.

Many thanks in advance

Matthieu Stigler
Posts: 141
Joined: 07 Sep 2007, 11:30

Post by Matthieu Stigler » 22 Oct 2008, 03:51

Have you tried the trace parameter, giving it its maximum value?

naima oumouhou
Posts: 34
Joined: 04 Mar 2008, 13:39

Post by naima oumouhou » 22 Oct 2008, 17:08

Hello,

I set the option trace = TRUE, but it does not print the parameter values at each iteration:
result <- optim(par = parametreStart, fn = fonctionCalculCase3, control = list(trace = TRUE))

Renaud Lancelot
Posts: 2484
Joined: 16 Dec 2004, 08:01
Contact:

Post by Renaud Lancelot » 22 Oct 2008, 17:47

Yes, but read the optim help carefully, in particular what it says about the trace argument of the control options:

trace
Non-negative integer. If positive, tracing information on the progress of the optimization is produced. Higher values may produce more tracing information: for method "L-BFGS-B" there are six levels of tracing. (To understand exactly what these do see the source code: higher levels give more detail.)


An application, using one of the examples from the help page:

Code: Select all

> fr <- function(x) {   ## Rosenbrock Banana function
+     x1 <- x[1]
+     x2 <- x[2]
+     100 * (x2 - x1 * x1)^2 + (1 - x1)^2
+ }
> grr <- function(x) { ## Gradient of 'fr'
+     x1 <- x[1]
+     x2 <- x[2]
+     c(-400 * x1 * (x2 - x1 * x1) - 2 * (1 - x1),
+        200 *      (x2 - x1 * x1))
+ }
> optim(c(-1.2, 1), fr, NULL, method = "L-BFGS-B",
+       control = list(trace = 1))
iter    0 value 4.225211
iter   10 value 2.314679
iter   20 value 0.498517
iter   30 value 0.011230
final  value 0.000000
converged
$par
[1] 0.9998000 0.9996001

$value
[1] 3.998487e-08

$counts
function gradient
      49       49

$convergence
[1] 0

$message
[1] "CONVERGENCE: REL_REDUCTION_OF_F <= FACTR*EPSMCH"

>       
> optim(c(-1.2, 1), fr, NULL, method = "L-BFGS-B",
+       control = list(trace = 2))
N = 2, M = 5 machine precision = 2.22045e-16
This problem is unconstrained.

iterations 39
function evaluations 49
segments explored during Cauchy searches 1
BFGS updates skipped 0
active bounds at final generalized Cauchy point 0
norm of the final projected gradient 6.24646e-08
final function value 3.99849e-08

final  value 0.000000
converged
$par
[1] 0.9998000 0.9996001

$value
[1] 3.998487e-08

$counts
function gradient
      49       49

$convergence
[1] 0

$message
[1] "CONVERGENCE: REL_REDUCTION_OF_F <= FACTR*EPSMCH"

>
> optim(c(-1.2, 1), fr, NULL, method = "L-BFGS-B",
+       control = list(trace = 6))
N = 2, M = 5 machine precision = 2.22045e-16
L = -inf -inf
X0 = -1.2 1
U = inf inf
This problem is unconstrained.
At X0, 0 variables are exactly at the bounds
At iterate     0  f=         24.2  |proj g|=        215.6
Iteration     0

---------------- CAUCHY entered-------------------

There are 0  breakpoints

GCP found in this segment
Piece      1 f1, f2 at start point -5.4228e+04  5.4228e+04
Distance to the stationary point =   1.0000e+00
Cauchy X =  214.4 89

---------------- exit CAUCHY----------------------

2  variables are free at GCP on iteration 1
LINE SEARCH 1 times; norm of step = 0.197214
X = -1.01741 1.07453
G = 12.0009 7.88085
Iteration     1
LINE SEARCH 0 times; norm of step = 0.0114189
X = -1.02702 1.06836
G = 1.52837 2.71797
Iteration     2
LINE SEARCH 0 times; norm of step = 0.00337627
X = -1.02854 1.06534
G = -0.989624 1.49137
Iteration     3
LINE SEARCH 0 times; norm of step = 0.00343672
X = -1.02807 1.06194
G = -1.99474 1.00276
Iteration     4
LINE SEARCH 0 times; norm of step = 0.0236104
X = -1.02138 1.03929
G = -5.65025 -0.786711
Iteration     5
LINE SEARCH 0 times; norm of step = 0.0542601
X = -1.00203 0.988602
G = -10.1995 -3.09123
Iteration     6
LINE SEARCH 0 times; norm of step = 0.132557
X = -0.949359 0.866957
G = -16.9342 -6.86521
Iteration     7
LINE SEARCH 0 times; norm of step = 0.139968
X = -0.887445 0.741428
G = -20.1507 -9.2262
Iteration     8
LINE SEARCH 0 times; norm of step = 0.207257
X = -0.783601 0.562063
G = -19.8561 -10.3934
Iteration     9
LINE SEARCH 0 times; norm of step = 0.19843
X = -0.635448 0.430058
G = 3.40452 5.25273
Iteration    10
LINE SEARCH 1 times; norm of step = 0.206745
X = -0.518538 0.259542
G = -4.97436 -1.86782
Iteration    11
LINE SEARCH 1 times; norm of step = 0.100835
X = -0.456905 0.179736
G = -8.21888 -5.80525
Iteration    12
LINE SEARCH 0 times; norm of step = 0.0873079
X = -0.395235 0.117933
G = -8.84224 -7.6557
Iteration    13
LINE SEARCH 0 times; norm of step = 0.127589
X = -0.2913 0.0439307
G = -7.35129 -8.18499
Iteration    14
LINE SEARCH 0 times; norm of step = 0.100561
X = -0.193729 0.0195923
G = -3.77762 -3.58771
Iteration    15
LINE SEARCH 1 times; norm of step = 0.0982764
X = -0.103207 -0.0186719
G = -3.41702 -5.86472
Iteration    16
LINE SEARCH 0 times; norm of step = 0.183555
X = 0.0801619 -0.0269205
G = -0.770399 -6.66928
Iteration    17
LINE SEARCH 0 times; norm of step = 0.100377
X = 0.137746 0.0552961
G = -3.72575 7.26443
Iteration    18
LINE SEARCH 1 times; norm of step = 0.0888163
X = 0.226308 0.0620141
G = -2.52484 2.15977
Iteration    19
LINE SEARCH 2 times; norm of step = 0.0601001
X = 0.285354 0.0732209
G = -0.49254 -1.64119
Iteration    20
LINE SEARCH 0 times; norm of step = 0.0632231
X = 0.345449 0.0928622
G = 2.349 -5.29452
Iteration    21
LINE SEARCH 0 times; norm of step = 0.00931787
X = 0.336789 0.0963011
G = 0.980766 -3.42508
Iteration    22
LINE SEARCH 0 times; norm of step = 0.0394223
X = 0.362639 0.126065
G = -0.485186 -1.0884
Iteration    23
LINE SEARCH 0 times; norm of step = 0.124247
X = 0.458322 0.205324
G = -0.215187 -0.946918
Iteration    24
LINE SEARCH 1 times; norm of step = 0.110134
X = 0.540669 0.278458
G = 2.07999 -2.77289
Iteration    25
LINE SEARCH 0 times; norm of step = 0.163668
X = 0.650753 0.399572
G = 5.52495 -4.78153
Iteration    26
LINE SEARCH 0 times; norm of step = 0.0559231
X = 0.672376 0.451146
G = -0.401124 -0.188775
Iteration    27
LINE SEARCH 0 times; norm of step = 0.144842
X = 0.759149 0.567118
G = 2.30898 -1.83783
Iteration    28
LINE SEARCH 0 times; norm of step = 0.114746
X = 0.822201 0.662989
G = 3.92858 -2.60511
Iteration    29
LINE SEARCH 0 times; norm of step = 0.0777121
X = 0.856502 0.732721
G = 0.0127702 -0.174795
Iteration    30
LINE SEARCH 0 times; norm of step = 0.131
X = 0.923403 0.84535
G = 2.55204 -1.46462
Iteration    31
LINE SEARCH 0 times; norm of step = 0.0159
X = 0.928737 0.860329
G = 0.683602 -0.44456
Iteration    32
LINE SEARCH 0 times; norm of step = 0.0875154
X = 0.969425 0.937811
G = 0.704454 -0.394676
Iteration    33
LINE SEARCH 0 times; norm of step = 0.0355232
X = 0.985004 0.969735
G = 0.166934 -0.0997614
Iteration    34
LINE SEARCH 0 times; norm of step = 0.0283741
X = 0.99781 0.995055
G = 0.223439 -0.113959
Iteration    35
LINE SEARCH 1 times; norm of step = 0.00175543
X = 0.998353 0.996725
G = -0.00915797 0.00313689
Iteration    36
LINE SEARCH 0 times; norm of step = 0.00298218
X = 0.999695 0.999388
G = 0.000683926 -0.000447215
Iteration    37
LINE SEARCH 0 times; norm of step = 0.000236388
X = 0.9998 0.9996
G = 6.79018e-05 -3.4142e-05
Iteration    38
LINE SEARCH 0 times; norm of step = 5.6501e-07
X = 0.9998 0.9996
G = -6.24646e-08 2.90795e-08

iterations 39
function evaluations 49
segments explored during Cauchy searches 1
BFGS updates skipped 0
active bounds at final generalized Cauchy point 0
norm of the final projected gradient 6.24646e-08
final function value 3.99849e-08

X = 0.9998 0.9996
F = 3.99849e-08
final  value 0.000000
converged
$par
[1] 0.9998000 0.9996001

$value
[1] 3.998487e-08

$counts
function gradient
      49       49

$convergence
[1] 0

$message
[1] "CONVERGENCE: REL_REDUCTION_OF_F <= FACTR*EPSMCH"


Renaud

naima oumouhou
Posts: 34
Joined: 04 Mar 2008, 13:39

Post by naima oumouhou » 24 Oct 2008, 08:24

I have looked at the help.
But I do not know what value to give trace when I use the "BFGS" method.
For the "L-BFGS-B" method you have to set trace = 6 to get the parameter values at each iteration, but for "BFGS" I do not know.

I tried trace = 6 and trace = 2, but I do not get output that detailed:
result <- optim(par = parametreStart, fn = fonctionCalculCase3, method = "BFGS", control = list(trace = 6))
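As the post above observes, trace does not seem to print the parameter vector for "BFGS" the way trace = 6 does for "L-BFGS-B". A method-independent workaround is to print the parameters from inside the objective function itself. A minimal sketch, using the Rosenbrock function from the help page as a stand-in for the poster's own fonctionCalculCase3 (fr_traced is a hypothetical wrapper name); note that this prints once per function evaluation, not once per iteration, since line searches and numerical gradients call fn several times per step:

```r
## Rosenbrock Banana function, as in ?optim
fr <- function(x) {
  x1 <- x[1]
  x2 <- x[2]
  100 * (x2 - x1 * x1)^2 + (1 - x1)^2
}

## Hypothetical wrapper: report the current parameters at every
## evaluation, then return the objective value unchanged.
fr_traced <- function(x) {
  cat("par =", format(x, digits = 6), "\n")
  fr(x)
}

## Works with any method, including "BFGS".
result <- optim(c(-1.2, 1), fr_traced, method = "BFGS")
result$par  # should end up close to c(1, 1)
```

The same wrapper idea applies to any objective: replace fr with your own function and pass the wrapper as fn.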

