Optimization interval hard coded in maximize_posterior function #43

@dfsp-spirit

Description

The maximize_posterior function in main/adaptivetesting/math/estimators/__functions/__bayes.py seems to ignore the optimization_interval parameter that can be passed to the TestAssembler constructor. For example:

est_args = {
    "prior": NormalPrior(0.0, 1.0),  # normal prior with mean 0, standard deviation 1
    "optimization_interval": (-4, 4),
}

adaptive_test: AdaptiveTest = TestAssembler(
    item_pool=item_pool,
    simulation_id=f"{user_id}",
    participant_id=f"user_{user_id}",
    ability_estimator=BayesModal,
    estimator_args=est_args,  # <======== here we pass them...
    item_selector=maximum_information_criterion,
    simulation=True,
    debug=debug,
    true_ability_level=theta
)

... but maximize_posterior() never uses it; the optimization interval is hard-coded to (-10, 10), see:

https://github.com/condecon/adaptivetesting/blob/main/adaptivetesting/math/estimators/__functions/__bayes.py#L38
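To illustrate the expected fix: a sketch of how maximize_posterior could accept the interval as a parameter (defaulting to the current hard-coded bounds) rather than fixing it internally. This is a hypothetical, stdlib-only sketch using a golden-section search, not the library's actual implementation or optimizer:

```python
import math

def maximize_posterior(posterior, optimization_interval=(-10.0, 10.0)):
    """Hypothetical sketch: maximize `posterior` over a caller-supplied
    interval instead of a hard-coded (-10, 10).

    Golden-section search on a unimodal posterior; the real function
    at the linked path may use a different optimizer entirely.
    """
    lo, hi = optimization_interval
    invphi = (math.sqrt(5) - 1) / 2  # 1 / golden ratio
    while hi - lo > 1e-6:
        a = hi - invphi * (hi - lo)
        b = lo + invphi * (hi - lo)
        if posterior(a) < posterior(b):
            lo = a  # maximum lies in [a, hi]
        else:
            hi = b  # maximum lies in [lo, b]
    return (lo + hi) / 2

# With the (-4, 4) interval from est_args above, the estimate is
# confined to those bounds:
theta_hat = maximize_posterior(lambda t: -(t - 0.7) ** 2, (-4, 4))
```

The key point is simply that the bounds come from the caller (e.g., threaded through from estimator_args), so "optimization_interval": (-4, 4) actually takes effect.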
