This file demonstrates fitting three types of discounting models to one group of subjects. We begin by loading the necessary R libraries.

The Basics - Loading data, summarizing, plotting, and basic nonlinear fit

library(nlme)     # loads the library used for nonlinear multilevel modeling commands
library(lattice)  # loads the library that includes xyplot

R will look for the data file in the working directory. If your file isn’t there, either move it there (use “getwd()” to see what the working directory is) or change the working directory (it’s in your menu).

I’m assuming that your data are in long format, comma-delimited, and have no missing values.
If you have missing values, you may need to use the na.strings argument in the read.csv call.
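For instance, if missing values are coded as a period, na.strings converts them to NA on the way in. The inline data below are made up purely for illustration; in practice you’d pass the same argument to read.csv on your own file.

```r
# Made-up CSV text standing in for a file with a missing Value coded as "."
txt <- "Subject,Delay,Value
1,3,480
1,18,.
1,72,250"
myd_demo <- read.csv(text = txt, na.strings = ".")
sum(is.na(myd_demo$Value))  # 1
```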

Read the file, look at a summary of the data, plot the individual subject data.

myd<-read.csv("Example one group data.dat") 
summary(myd)
##     Subject          Delay           Value    
##  Min.   : 1.00   Min.   :  3.0   Min.   :  3  
##  1st Qu.:22.00   1st Qu.: 18.0   1st Qu.:102  
##  Median :43.00   Median : 72.0   Median :242  
##  Mean   :44.13   Mean   :117.5   Mean   :258  
##  3rd Qu.:66.00   3rd Qu.:180.0   3rd Qu.:445  
##  Max.   :89.00   Max.   :360.0   Max.   :492

For the plot command below, as.factor treats the variable as categorical; the “|” creates a separate panel for each subject; type=“l” means that it plots a connected line.

xyplot(Value~Delay | as.factor(Subject), data=myd, type="l") 

We’ll use a simple nls to estimate a good starting value for logk. This should not be used as a repeated-measures analysis because nls assumes each observation is independent.

simp<-nls(Value~500/(1+exp(logk)*Delay), data=myd, start=c(logk=-2))
summary(simp) # summarizes the model we just ran; I'll use -4 as the starting value in the multilevel model
## 
## Formula: Value ~ 500/(1 + exp(logk) * Delay)
## 
## Parameters:
##      Estimate Std. Error t value Pr(>|t|)    
## logk -4.08510    0.08419  -48.52   <2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 169.7 on 509 degrees of freedom
## 
## Number of iterations to convergence: 10 
## Achieved convergence tolerance: 7.637e-06
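Estimating logk rather than k is a reparameterization that keeps k = exp(logk) strictly positive and tends to make the estimate’s sampling distribution more symmetric. A minimal sketch of the hyperbolic function used above:

```r
# Hyperbolic discounting, with k on the log scale; A is the immediate amount.
hyp <- function(Delay, logk, A = 500) A / (1 + exp(logk) * Delay)

hyp(0, logk = -4)    # 500: no discounting at zero delay
hyp(360, logk = -4)  # about 66: steep discounting at the longest delay
```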

Multilevel modeling

Let’s dive right in and fit a multilevel hyperbolic model to the data. Note that the immediate amount was 500.

myr<-nlme(Value~500/(1+exp(logk)*Delay), fixed=logk~1, random=logk~1|Subject, data=myd, start=c(logk=-4))

Note: if you have convergence issues, you might try the following use of ‘tol’, choosing larger values as needed.

myr<-nlme(Value~500/(1+exp(logk)*Delay), fixed=logk~1, random=logk~1|Subject, data=myd, start=c(logk=-4), control=nlmeControl(tol=.1))
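nlmeControl has several other knobs besides the tolerance; the values below are illustrative, not recommendations. (The short name ‘tol’ works because R partially matches it to the ‘tolerance’ argument.)

```r
library(nlme)  # nlmeControl lives here

# Build a control object; ‘tol’ partially matches the ‘tolerance’ argument.
ctrl <- nlmeControl(tol = 0.1, maxIter = 200, msMaxIter = 200)
ctrl$tolerance  # 0.1
```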

Let’s look at the results multiple ways.

summary(myr)
## Nonlinear mixed-effects model fit by maximum likelihood
##   Model: Value ~ 500/(1 + exp(logk) * Delay) 
##  Data: myd 
##        AIC      BIC    logLik
##   6197.683 6210.386 -3095.842
## 
## Random effects:
##  Formula: logk ~ 1 | Subject
##             logk Residual
## StdDev: 2.058452 81.64945
## 
## Fixed effects: logk ~ 1 
##          Value Std.Error  DF   t-value p-value
## logk -4.024759 0.2308113 425 -17.43744       0
## 
## Standardized Within-Group Residuals:
##         Min          Q1         Med          Q3         Max 
## -2.90685037 -0.34922262  0.02338134  0.47971535  4.39598302 
## 
## Number of Observations: 510
## Number of Groups: 85
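The fixed effect is on the log scale, so exponentiating the estimate shown above recovers the group-level discounting rate k:

```r
logk_hat <- -4.024759      # fixed-effect estimate from the summary above
k_hat <- exp(logk_hat)     # back on the original k scale
round(k_hat, 4)            # 0.0179
```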
coef(myr)                     # This shows the best fitting logk for each subject
##          logk
## 1   0.2583167
## 2  -5.0013646
## 3  -1.7740583
## 4  -7.4830493
## 5  -5.2774513
## 6  -6.1501888
## 7  -1.9628801
## 8  -4.2130128
## 9  -2.2402013
## 10 -3.3357858
## 11 -7.6499880
## 12 -1.3630143
## 13 -3.3242617
## 14 -2.2774708
## 15 -6.0526142
## 16 -6.6786813
## 17 -2.7045613
## 18 -1.0425035
## 19 -1.9679254
## 20 -3.2879761
## 21 -6.1381723
## 22 -2.2184014
## 23 -5.9689411
## 24 -5.0013646
## 25 -5.7618720
## 26 -4.2672961
## 27 -6.4881286
## 28 -0.5012084
## 29 -2.6295417
## 30 -3.5823611
## 31 -2.4187947
## 32 -6.4211386
## 33 -4.5402362
## 34 -7.6176150
## 35 -3.5253807
## 36 -2.5234560
## 37 -2.5592035
## 38 -7.4830493
## 39 -3.7138254
## 40 -3.8093491
## 41 -1.3314481
## 42 -5.0358334
## 43 -4.0772990
## 44 -3.9977956
## 45 -5.0790149
## 46 -3.2227304
## 47 -3.8567054
## 49 -4.2161699
## 50 -7.3004058
## 51 -6.1947059
## 52 -5.0696320
## 53 -2.2171271
## 55 -2.0007432
## 56 -7.6499880
## 57 -4.7319720
## 58 -7.6499880
## 59 -4.1041443
## 60 -4.7823267
## 61 -1.1951539
## 62 -2.8507873
## 63 -6.7025872
## 64  0.2489106
## 65 -3.6344933
## 66 -3.4271654
## 67 -7.6381883
## 68 -4.3853778
## 69 -4.3657286
## 70 -2.7753457
## 71 -4.1474123
## 72 -0.8554330
## 74 -4.5210751
## 75 -5.1236530
## 76 -2.8742796
## 77 -4.9742463
## 78 -5.2496890
## 80 -1.6930857
## 81 -7.6499880
## 82 -2.3302751
## 83 -2.1633561
## 84 -1.7656673
## 85 -5.7309051
## 86 -3.0690636
## 87 -2.8584003
## 88 -3.0871187
## 89 -4.0689088
hist(coef(myr)$logk)          # This plots a histogram of those values

hist(resid(myr))              # This plots a histogram of the residuals; should be normal-ish

Now, let’s look at our fits superimposed on the data. Note that xyplot won’t handle missing data well, so if data are missing, you’ll need to exclude them using something like “data=subset(myd, !is.na(Value))” in place of “data=myd”. (“!” means NOT.)

xyplot(fitted(myr)+Value~Delay|as.factor(Subject), data=myd, type=c("a", "p"), distribute.type=T)

Hyperboloids - two-parameter discounting functions

Okay, let’s try the Green-Myerson and Rachlin models and allow k and s to vary across subjects

myrGM<-nlme(Value~500/(1+exp(logk)*Delay)^s, fixed=logk+s~1, random=logk+s~1|Subject, data=myd, start=c(logk=-4, s=1))
myrRach<-nlme(Value~500/(1+exp(logk)*Delay^s), fixed=logk+s~1, random=logk+s~1|Subject, data=myd, start=c(logk=-4, s=1))

You can look at summaries of both using the “summary” command. I’m going to look at the AIC for each of them and then plot all three using different line types (“lty”).

AIC(myr, myrGM, myrRach)  # Remember, lower is better.
##         df      AIC
## myr      3 6197.683
## myrGM    6 6591.800
## myrRach  6 6111.179
xyplot(fitted(myr)+fitted(myrGM)+fitted(myrRach)+Value~Delay|as.factor(Subject), data=myd, type=c("a", "a", "a", "p"), distribute.type=T, lty=c(1, 2, 3, 4), auto.key=T)
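As a reminder of the bookkeeping, the AIC reported above is -2·logLik + 2·df, where df counts the estimated parameters (for myr: logk, its random-effect SD, and the residual SD). A toy check on a simple lm fit, with made-up data:

```r
set.seed(1)
d <- data.frame(x = 1:20)
d$y <- d$x + rnorm(20)
fit <- lm(y ~ x, data = d)

df <- attr(logLik(fit), "df")                  # intercept, slope, residual SD
manual <- -2 * as.numeric(logLik(fit)) + 2 * df
all.equal(manual, AIC(fit))                    # TRUE
```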

When you look at coef(myrGM), you’ll discover a problem (at least, that was the case in R v. 3.3). We could try different starting values, but I tried that and it didn’t help. Instead, I didn’t let s vary across subjects and got a much better fit. This was accomplished by removing s from the random-effects specification.

Note: In version 3.5, the fit of myrGM automatically holds the s parameter to a relatively constant value, evidence that the algorithm has improved. Regardless, I’m going to proceed as I did before because the fit is mildly better when s is purposely held constant.

myrGM<-nlme(Value~500/(1+exp(logk)*Delay)^s, fixed=logk+s~1, random=logk~1|Subject, data=myd, start=c(logk=-4, s=1))

AIC(myr, myrGM, myrRach)  # Little change in myrGM under version 3.5 of R.
##         df      AIC
## myr      3 6197.683
## myrGM    4 6148.259
## myrRach  6 6111.179
xyplot(fitted(myr)+fitted(myrGM)+fitted(myrRach)+Value~Delay|as.factor(Subject), data=myd, type=c("a", "a", "a", "p"), distribute.type=T, lty=c(1, 2, 3, 4), auto.key=T) 

Model Comparison

Let’s visually compare the fits; note how well-behaved Green-Myerson is if s isn’t allowed to vary.

par(mfrow=c(1,2))
plot(coef(myrGM)$logk~coef(myr)$logk, ylab="GM Hyperboloid log(k)", xlab="Hyperbolic log(k)", ylim=c(-7,3))
plot(function(x) x, -10,10, add=T, lty=2)
plot(coef(myrRach)$logk~coef(myr)$logk, ylab="Rachlin Hyperboloid log(k)", xlab="Hyperbolic log(k)", ylim=c(-7,3))
plot(function(x) x, -10,10, add=T, lty=2)

The following does the same plot but with the size of each circle indicating the value of s. Note that the Green-Myerson k values are slightly higher than those from the model without the (constant) s parameter. This occurs because an s less than 1 flattens the curve, so k increases to compensate.

symbols(coef(myr)$logk, coef(myrGM)$logk,circles=coef(myrGM)$s, inches=1/10, ylab="GM Hyperboloid log(k)", xlab="Hyperbolic log(k)", ylim=c(-7,3))
plot(function(x) x, -10,10, add=T, lty=2)

# I had to add .25 to the circle size for Rachlin because this fit produced a negative s for one subject
symbols(coef(myr)$logk, coef(myrRach)$logk,circles=coef(myrRach)$s+.25, inches=1/10, ylab="Rachlin Hyperboloid log(k)", xlab="Hyperbolic log(k)", ylim=c(-7,3))
plot(function(x) x, -10,10, add=T, lty=2)
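The compensation between k and s described above can be seen numerically: with s < 1 the Green-Myerson curve is flatter, so the same k discounts less, and a larger k is needed to reach the same discounted value. The parameter values here are made up for illustration.

```r
A <- 500
gm <- function(Delay, k, s) A / (1 + k * Delay)^s  # Green-Myerson hyperboloid

gm(180, k = 0.02, s = 1)    # about 109
gm(180, k = 0.02, s = 0.5)  # about 233: same k, much less discounting
gm(180, k = 0.11, s = 0.5)  # larger k restores roughly the s = 1 value
```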

# (For R version 3.3) BTW, after a careful search, I could get GM to fit with a starting value for k of 3.7 or 3.8, but the fit was worse than when s was fixed. I could also get variation when k started at 1.4, s was .5, and I relaxed the error tolerance. The fit was good, but s was mimicking k. Bottom line: Green-Myerson was proving hard to fit for this data set when k and s were both allowed to vary.

Fitting subjects individually without influence from the group

To do individual fits using nlsList, see the following example for the hyperbolic; it’s not recommended for hyperboloids.

myrList<-nlsList(Value~500/(1+exp(logk)*Delay) | Subject, data=myd, start=c(logk=-4))

summary(myrList)    # shows the individual estimates, the SE, t, and p-value
## Call:
##   Model: Value ~ 500/(1 + exp(logk) * Delay) | Subject 
##    Data: myd 
## 
## Coefficients:
##    logk 
##       Estimate Std. Error      t value     Pr(>|t|)
## 1   2.16797343  4.4882068   0.48303777 8.425583e-01
## 2  -5.06323520  0.3653517 -13.85852487 1.369524e-03
## 3  -1.66533161  0.5134840  -3.24320032 1.781395e-03
## 4  -8.57707228  2.2955113  -3.73645396 4.528418e-07
## 5  -5.31616600  0.3730427 -14.25082478 5.853465e-07
## 6  -6.30842920  0.4779043 -13.20019312 8.408679e-04
## 7  -1.80092768  0.5035351  -3.57656834 9.275050e-03
## 8  -4.21908743  0.3679260 -11.46721823 2.258038e-05
## 9  -2.16738463  0.4745463  -4.56727751 1.499248e-03
## 10 -3.31135017  0.3941888  -8.40041707 1.860494e-04
## 11 -9.62353669  6.0885983  -1.58058328 5.515320e-06
## 12 -1.17212312  0.5529998  -2.11957229 5.507459e-03
## 13 -3.29425711  0.3949473  -8.34100463 1.214299e-04
## 14 -2.19383375  0.4723364  -4.64464225 8.664893e-04
## 15 -6.15423768  0.4510391 -13.64457685 1.279133e-05
## 16 -6.90682398  0.6412509 -10.77086026 8.198125e-07
## 17 -2.64521485  0.4349931  -6.08105063 9.398802e-05
## 18 -0.83236392  0.5947616  -1.39949168 7.993872e-03
## 19 -1.85978866  0.4991113  -3.72620040 1.989130e-03
## 20 -3.25901254  0.3965558  -8.21829455 1.300731e-06
## 21 -6.23693149  0.4648235 -13.41784815 4.109526e-06
## 22 -2.13037372  0.4776196  -4.46039818 4.498362e-02
## 23 -6.06015821  0.4369842 -13.86814071 5.830094e-07
## 24 -5.06323520  0.3653517 -13.85852487 1.369524e-03
## 25 -5.83346356  0.4094372 -14.24751722 6.559234e-07
## 26 -4.27504032  0.3669349 -11.65067824 9.262681e-05
## 27 -6.67766776  0.5656913 -11.80443883 5.938610e-05
## 28 -0.07105473  0.7960559  -0.08925847 8.124370e-01
## 29 -2.56716630  0.4411678  -5.81902508 2.087807e-03
## 30 -3.56671973  0.3843669  -9.27946647 7.898980e-06
## 31 -2.35323313  0.4588961  -5.12803065 1.190802e-03
## 32 -6.58344575  0.5397070 -12.19818439 1.373258e-06
## 33 -4.55690758  0.3632287 -12.54556045 6.956399e-07
## 34 -9.35744383  4.7223085  -1.98154015 2.099926e-05
## 35 -3.50496375  0.3865042  -9.06837273 2.955517e-05
## 36 -2.46012390  0.4499314  -5.46777602 2.512056e-03
## 37 -2.49384141  0.4471409  -5.57730559 1.269905e-04
## 38 -8.57707228  2.2955113  -3.73645396 4.528418e-07
## 39 -3.70005118  0.3801813  -9.73233324 1.216910e-04
## 40 -3.80157892  0.3773295 -10.07495712 1.801562e-03
## 41 -1.11289017  0.5589991  -1.99086228 1.932203e-03
## 42 -5.09061227  0.3659290 -13.91147444 1.874553e-04
## 43 -4.07990211  0.3706785 -11.00657815 1.161393e-03
## 44 -3.99687875  0.3725012 -10.72984206 3.761183e-06
## 45 -5.11578133  0.3665110 -13.95805673 1.186122e-06
## 46 -3.18947017  0.3999117  -7.97543668 2.746479e-05
## 47 -3.84972264  0.3760643 -10.23687440 6.224982e-05
## 49 -4.23488201  0.3676391 -11.51912923 1.791132e-03
## 50 -7.99817302  1.3953925  -5.73184447 8.676219e-07
## 51 -6.33288211  0.4826393 -13.12135656 4.187616e-05
## 52 -5.10777526  0.3663205 -13.94346080 1.856298e-06
## 53 -2.13288062  0.4774122  -4.46758682 3.178017e-03
## 55 -1.90059974  0.4959937  -3.83190329 5.369954e-04
## 56 -9.62353669  6.0885983  -1.58058328 5.515320e-06
## 57 -4.75270531  0.3623798 -13.11525846 8.479679e-07
## 58 -9.62353669  6.0885983  -1.58058328 5.515320e-06
## 59 -4.10656709  0.3701212 -11.09519434 1.804390e-05
## 60 -4.81008119  0.3624850 -13.26973795 1.039556e-04
## 61 -1.00412155  0.5712742  -1.75768745 7.550342e-04
## 62 -2.80713192  0.4229725  -6.63667700 3.433200e-04
## 63 -6.94642495  0.6563007 -10.58421156 9.822706e-06
## 64  1.80753398  3.2270489   0.56011981 9.281545e-03
## 65 -3.62205442  0.3825630  -9.46786373 4.275581e-04
## 66 -3.37189463  0.3916117  -8.61030000 4.203286e-03
## 67 -9.52328681  5.5307374  -1.72188375 5.715041e-04
## 68 -4.39636899  0.3650493 -12.04322033 3.819116e-05
## 69 -4.37903915  0.3652945 -11.98769473 2.550775e-04
## 70 -2.72257604  0.4291035  -6.34479967 5.165430e-04
## 71 -4.15488996  0.3691469 -11.25538476 3.361460e-04
## 72  0.02832963  0.8389505   0.03376794 9.875663e-01
## 74 -4.54092725  0.3633698 -12.49671074 1.590258e-03
## 75 -5.16842551  0.3678924 -14.04874358 2.074537e-04
## 76 -2.81927589  0.4221204  -6.67884293 1.220455e-04
## 77 -5.00719399  0.3643442 -13.74303078 7.942095e-05
## 78 -5.29742420  0.3722795 -14.22969786 4.892650e-05
## 80 -1.47798037  0.5272160  -2.80336805 2.073971e-02
## 81 -9.62353669  6.0885983  -1.58058328 5.515320e-06
## 82 -2.25677549  0.4670441  -4.83203963 4.300321e-03
## 83 -2.06874562  0.4826770  -4.28598356 1.877323e-03
## 84 -1.63403783  0.5157558  -3.16823956 1.421809e-04
## 85 -5.81312956  0.4073612 -14.27020936 4.042925e-05
## 86 -3.03929013  0.4080263  -7.44875977 2.944036e-03
## 87 -2.80163200  0.4233608  -6.61759833 3.749210e-05
## 88 -3.02766200  0.4087056  -7.40792869 2.271827e-03
## 89 -4.07107643  0.3708660 -10.97721678 9.828506e-04
## 
## Residual standard error: 81.01023 on 425 degrees of freedom
sapply(myrList,AIC) # if you want the individual AIC values; not surprisingly, Subject 1 had the worst fit.
##        1        2        3        4        5        6        7        8 
## 82.71066 81.91843 65.19020 46.38619 63.06880 80.05513 71.00362 69.31237 
##        9       10       11       12       13       14       15       16 
## 68.83863 70.78858 42.10117 63.20185 69.63795 67.59901 70.01197 60.52134 
##       17       18       19       20       21       22       23       24 
## 65.20948 59.30258 67.15218 58.38935 67.05355 78.87952 62.73251 81.91843 
##       25       26       27       28       29       30       31       32 
## 63.34042 72.97555 72.03378 60.30277 72.63154 64.21312 69.61917 63.25943 
##       33       34       35       36       37       38       39       40 
## 61.95556 48.06753 67.15468 72.38568 64.91969 46.38619 71.49454 78.82225 
##       41       42       43       44       45       46       47       49 
## 59.55225 76.86064 78.71863 64.15634 64.52303 65.43392 70.44020 80.41418 
##       50       51       52       53       55       56       57       58 
## 53.08836 72.44300 65.59209 70.60578 64.05479 42.10117 62.96596 42.10117 
##       59       60       61       62       63       64       65       66 
## 68.36878 74.82332 55.58088 69.50485 66.32159 48.75761 74.32643 79.25705 
##       67       68       69       70       71       72       74       75 
## 54.61538 71.18787 75.84883 70.00652 75.79012 81.35112 81.07398 77.23275 
##       76       77       78       80       81       82       83       84 
## 66.98367 74.57656 73.79874 70.59622 42.10117 72.38888 68.67626 58.41497 
##       85       86       87       88       89 
## 73.36380 76.52967 63.95728 75.75687 78.24895

As an aside, the exp(logk) trick wasn’t necessary for the last analysis; you could estimate k directly with nlsList.
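The equivalence of the two parameterizations can be checked on simulated data (made up here; nls comes from base R’s stats package):

```r
set.seed(42)
d <- data.frame(Delay = rep(c(3, 18, 72, 180, 360), each = 4))
d$Value <- 500 / (1 + 0.02 * d$Delay) + rnorm(nrow(d), sd = 5)

# Same model, two parameterizations:
fit_log <- nls(Value ~ 500 / (1 + exp(logk) * Delay), data = d, start = c(logk = -4))
fit_k   <- nls(Value ~ 500 / (1 + k * Delay),         data = d, start = c(k = 0.02))

exp(coef(fit_log)[["logk"]])  # essentially identical to coef(fit_k)[["k"]]
```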

  • Updated: 9/22/23