MGO alternatives and similar packages
Based on the "Science and Data Analysis" category.
Alternatively, view MGO alternatives based on common mentions on social networks and blogs.
- Zeppelin: Web-based notebook that enables data-driven, interactive data analytics and collaborative documents with SQL, Scala and more.
- BigDL: Accelerate local LLM inference and finetuning (LLaMA, Mistral, ChatGLM, Qwen, Mixtral, Gemma, Phi, MiniCPM, Qwen-VL, MiniCPM-V, etc.) on Intel XPU (e.g., local PC with iGPU and NPU, discrete GPU such as Arc, Flex and Max); seamlessly integrates with llama.cpp, Ollama, HuggingFace, LangChain, LlamaIndex, vLLM, GraphRAG, DeepSpeed, Axolotl, etc.
- FACTORIE: A toolkit for deployable probabilistic modeling, implemented as a software library in Scala. It provides its users with a succinct language for creating relational factor graphs, estimating parameters and performing inference.
- ND4S: DISCONTINUED. N-Dimensional Arrays for Scala. Scientific computing a la Numpy. Based on ND4J.
- Clustering4Ever: C4E, a JVM-friendly library written in Scala for both local and distributed (Spark) clustering.
- rscala: The Scala interpreter is embedded in R and callbacks to R from the embedded interpreter are supported. Conversely, the R interpreter is embedded in Scala.
README
MGO
MGO is a purely functional Scala library for evolutionary / genetic algorithms:
- enforces immutability,
- exposes a modular and extensible architecture,
- implements state-of-the-art algorithms,
- handles noisy (stochastic) fitness functions,
- implements auto-adaptative algorithms,
- implements algorithms with distributed computing in mind, for integration with OpenMOLE.
MGO implements NSGAII, NSGA3, CP (Calibration Profile), PSE (Pattern Search Experiment), OSE (Antecedent research), Niched Evolution, ABC (Bayesian Calibration).
Licence
MGO is licensed under the GNU GPLv3 software licence.
Example
Define a problem, for instance the multi-modal multi-objective ZDT4 benchmark:
import mgo.evolution._ // provides the C(low, high) continuous bound constructor
import scala.math._    // pow, cos, sqrt, Pi

object zdt4 {

  // Each gene is a continuous value in [0, 5]
  def continuous(size: Int) = Vector.fill(size)(C(0.0, 5.0))

  // Bi-objective fitness; the second argument (discrete genome values) is unused here
  def compute(genome: Vector[Double], d: Vector[Int]): Vector[Double] = {
    val genomeSize = genome.size

    def g(x: Seq[Double]) = 1 + 10 * (genomeSize - 1) + x.map { i => pow(i, 2) - 10 * cos(4 * Pi * i) }.sum

    def f(x: Seq[Double]) = {
      val gx = g(x)
      gx * (1 - sqrt(genome(0) / gx))
    }

    Vector(genome(0), f(genome.tail))
  }
}
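For reference, the two objectives computed by this function are, with n the genome size and x_1 the first gene:

\[
f_1(x) = x_1
\]
\[
g(x) = 1 + 10\,(n - 1) + \sum_{i=2}^{n} \left( x_i^2 - 10 \cos(4 \pi x_i) \right)
\]
\[
f_2(x) = g(x) \left( 1 - \sqrt{x_1 / g(x)} \right)
\]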
Define the optimisation algorithm, for instance NSGAII:
import mgo.evolution._
import mgo.evolution.algorithm._

// For zdt4
import mgo.test._

// mu: size of the population, lambda: number of offspring produced at each generation
val nsga2 =
  NSGA2(
    mu = 100,
    lambda = 100,
    fitness = zdt4.compute,
    continuous = zdt4.continuous(10))
Run the optimisation:
// Run for 1000 generations, printing the generation number at each step
def evolution =
  nsga2.
    until(afterGeneration(1000)).
    trace((s, is) => println(s.generation))

val (finalState, finalPopulation) = evolution.eval(new util.Random(42))

println(NSGA2.result(nsga2, finalPopulation).mkString("\n"))
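If you want to keep the resulting front around (for plotting, for instance), the printed lines can simply be written to a file with the standard library. This is only a convenience sketch, not part of the MGO API; the output path is a placeholder:

import java.nio.file.{Files, Paths}

// Write the result lines produced above to a text file
val front = NSGA2.result(nsga2, finalPopulation).mkString("\n")
Files.write(Paths.get("zdt4-front.txt"), front.getBytes("UTF-8"))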
Noisy fitness functions
All algorithms in MGO have a variant designed for noisy fitness functions. MGO handles noise by resampling only the most promising individuals, and uses an aggregation function to combine the multiple samples when needed.
For instance, the noisy version of NSGA2 may be used as follows:
import mgo._
import algorithm.noisynsga2._
import context.implicits._

object sphere {
  def scale(s: Vector[Double]): Vector[Double] = s.map(_.scale(-2, 2))
  def compute(i: Vector[Double]): Double = i.map(x => x * x).sum
}

object noisySphere {
  def scale(s: Vector[Double]): Vector[Double] = sphere.scale(s)
  // Sphere fitness perturbed by Gaussian noise
  def compute(rng: util.Random, v: Vector[Double]) =
    sphere.compute(v) + rng.nextGaussian() * 0.5 * math.sqrt(sphere.compute(v))
}

// Aggregate the history of noisy samples by averaging each objective
def aggregation(history: Vector[Vector[Double]]) = history.transpose.map { o => o.sum / o.size }

val nsga2 =
  NoisyNSGA2(
    mu = 100,
    lambda = 100,
    fitness = (rng, v) => Vector(noisySphere.compute(rng, v)),
    aggregation = aggregation,
    genomeSize = 2)

val (finalState, finalPopulation) =
  run(nsga2).
    until(afterGeneration(1000)).
    trace((s, is) => println(s.generation)).
    eval(new util.Random(42))

println(result(finalPopulation, aggregation, noisySphere.scale).mkString("\n"))
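To make the role of the aggregation function concrete, here is a tiny illustration (not part of MGO) of what it computes on three noisy samples of a two-objective fitness:

// The aggregation above returns the component-wise mean of the samples recorded for one individual
val samples = Vector(Vector(1.0, 4.0), Vector(2.0, 6.0), Vector(3.0, 8.0))
aggregation(samples) // Vector(2.0, 6.0)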
Diversity only
MGO proposes the PSE algorithm, which aims at producing diverse solutions instead of optimising a function. The paper describing this algorithm can be found here.
import mgo._
import algorithm.pse._
import context.implicits._

val pse = PSE(
  lambda = 10,
  phenotype = zdt4.compute,
  pattern =
    // Discretise the two objective values into a 10 x 10 grid of niches
    boundedGrid(
      lowBound = Vector(0.0, 0.0),
      highBound = Vector(1.0, 200.0),
      definition = Vector(10, 10)),
  genomeSize = 10)

val (finalState, finalPopulation) =
  run(pse).
    until(afterGeneration(1000)).
    trace((s, is) => println(s.generation)).
    eval(new util.Random(42))

println(result(finalPopulation, zdt4.scale).mkString("\n"))
This program explores all the different combinations of values that can be produced by the multi-objective function of ZDT4.
For more examples, have a look at the main/scala/fr/iscpif/mgo/test directory in the repository.
Mixed optimisation and diversity
The calibration profile algorithm computes the best fitness for a set of niches. This algorithm is explained here.
In MGO you can compute profiles of a 10-dimensional hyper-sphere function using the following:
import algorithm.profile._
import context.implicits._

// Profile the first dimension of the genome
val algo = Profile(
  lambda = 100,
  fitness = sphere.compute,
  niche = genomeProfile(x = 0, nX = 10),
  genomeSize = 10)

val (finalState, finalPopulation) =
  run(algo).
    until(afterGeneration(1000)).
    trace((s, is) => println(s.generation)).
    eval(new util.Random(42))

println(result(finalPopulation, sphere.scale).mkString("\n"))
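Conceptually, the niche above assigns each individual to a bin of the first genome dimension, and the algorithm keeps the best individuals per bin. A rough sketch of such a binning (hypothetical, not MGO's actual implementation) is:

// Hypothetical sketch of a genome-profile niche: split the normalised range [0, 1]
// of one gene into nX bins and return the bin index
def profileBin(genomeValue: Double, nX: Int): Int =
  math.min((genomeValue * nX).toInt, nX - 1)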
Noisy profiles
All algorithms in MGO have a counterpart for noisy fitness functions. Here is an example of a profile computation for a sphere function with noise.
import algorithm.noisyprofile._
import context.implicits._

def aggregation(history: Vector[Double]) = history.sum / history.size
def niche = genomeProfile(x = 0, nX = 10)

val algo = NoisyProfile(
  muByNiche = 20,
  lambda = 100,
  fitness = noisySphere.compute,
  aggregation = aggregation,
  niche = niche,
  genomeSize = 5)

val (finalState, finalPopulation) =
  run(algo).
    until(afterGeneration(1000)).
    trace((s, is) => println(s.generation)).
    eval(new util.Random(42))

println(result(finalPopulation, aggregation, noisySphere.scale, niche).mkString("\n"))
Distributed computing
Algorithms implemented in MGO are also available in OpenMOLE, a workflow platform for distributed computing.
SBT dependency
libraryDependencies += "fr.iscpif" %% "mgo" % "2.45"
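In a complete build.sbt this line sits alongside the usual project settings. A minimal sketch follows; the project name and Scala version are placeholders, so pick a Scala version supported by the MGO release you depend on:

// Minimal build.sbt sketch; name and scalaVersion are placeholders
name := "mgo-example"
scalaVersion := "2.13.12" // adjust to the Scala version supported by the chosen MGO release

libraryDependencies += "fr.iscpif" %% "mgo" % "2.45"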