
About this mod

Lua library that includes a new logger, a way to automate the creation of Classes (supporting multiple inheritance, instance checking, and some dataclass-like functionality), and a way to generate polynomial splines. Requires MWSE-lua, but several parts of this should be compatible with OpenMW-lua.

Requires MWSE

Features

Note:
Nothing in this mod will do anything by itself. It's intended to be a modder's resource. This is a requirement for some of my other mods.


New Logger

Want to start logging? Type

local log = Herbert_Logger.new()

You can write debug/info/etc messages by typing log:info(message, ...), where the additional arguments get sent to `string.format`. Log messages are only shown at the appropriate log level.
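For example (a minimal sketch; `count` and `item_name` are hypothetical variables from your own code, and the other log levels are assumed to follow the same log:<level> pattern):

local log = Herbert_Logger.new()

log:debug("scanning inventory")
log:info("found %d copies of %s", count, item_name) -- extra arguments are passed to string.format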

Want to add log settings to your MCM? Type

log:add_to_MCM{component = my_page, config = my_config}

Nothing else is required; you can now start using the logger. (The add_to_MCM method will also update your current logging level when the MCM is created, so you don't have to set it manually. The logging level is also updated whenever the MCM setting is changed.)
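Here's a rough sketch of how that might fit into a standard MWSE MCM setup (the template/page creation below is ordinary MWSE MCM code, not part of this library; the mod name and config keys are placeholders):

local config = mwse.loadConfig("My Mod", { logLevel = "INFO" }) -- hypothetical defaults
local log = Herbert_Logger.new()

local function registerMCM()
    local template = mwse.mcm.createTemplate{ name = "My Mod" }
    template:saveOnClose("My Mod", config)

    local page = template:createSideBarPage{ label = "Settings" }
    log:add_to_MCM{ component = page, config = config } -- adds the log level setting to the page

    template:register()
end
event.register(tes3.event.modConfigReady, registerMCM)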

Log messages are printed as

[<MOD_NAME> | <FILE_PATH>:<LINE_NUMBER> | <LOG_LEVEL> | <TIMESTAMP>] <MESSAGE>

The <MOD_NAME> is taken from your mod's metadata file, if it has one; otherwise, it's the name of the mod's root folder. For example, "mods/herbert100/more quickloot" would get <MOD_NAME> = "more quickloot".

The <FILE_PATH> is taken relative to your mod's root folder. So, "mods/herbert100/more quickloot/managers/pickpocket.lua" would be written as "managers/pickpocket.lua".

Here's an example of what this looks like in practice: the line

log("found tool")

written at line 117 of the "mods/herbert100/quick select/get_options.lua" file would be printed as

[Quick Select Menu | get_options.lua:117 | DEBUG | 02:31.580] found tool




Here are some of the features of this logger:
  • Your mod name will be automatically retrieved when creating new loggers. No need to specify it.
  • Each file gets its own logger, to make tracing back log messages easier.
  • Updating the settings of one logger will automatically update the settings of all other loggers registered to your mod.
  • Line numbers are included in the output.
  • Adding loggers to the MCM is done automatically with a convenience function. This will also set the logging level whenever the game finishes initializing, using your current config settings.



Object Oriented Programming support (via Class.lua)

Features:

  • Multiple inheritance: you can specify any number of parent classes. Child classes will inherit fields, methods, and metamethods from parent classes in a depth-first manner.
  • Easy instance checking: you can use the "is_instance_of" method to check if an object is an instance of another class. This will also check all parent classes, in constant time. This will return false if called on a class instead of an object.
  • Customizable object creation: you can specify "init" and "post_init" methods, giving you control over how new objects should be initialized, and what should happen after they're initialized. You can also customize whether objects should be created by passing in a table, or multiple parameters, or both.
  • Specify fields and automate the creation of metamethods. You can specify a list of fields that each class should have. These fields support inheritance in all the ways you would expect. The following parameters can be specified for each field, and they are all optional:
  • default: this lets you specify the default value for this field.
  • converter: this should be a function that takes in one value and returns another. Example: setting "converter=tonumber" will ensure the field is converted to a number during object creation.
  • factory: this allows you to generate default values for objects if they weren't provided. Example: setting factory = function() return {} end will generate a new table on object creation, provided the corresponding field was nil when the object was created. In general, a factory can be any function that takes in the object being created and returns a value to use for that field.
  • comp: this should be either "true", "false", or a function. If true, this field will be included in premade "<" and "<=" metamethods. It can also be a function that takes in a value and returns a value that should be used in comparisons. Example: "comp=tonumber" will let strings be compared as if they were numbers.
  • eq: this works exactly like "comp", except that it's for the prebuilt "==" metamethod. Note: at least one field must set "eq" (to true or to a function) in order for the "==" metamethod to be created. Otherwise, the default "==" operator will be used.
  • tostring: this can be either "true", "false", or a function. If true, the field will be included in prebuilt "tostring" metamethods. If a function, it should take in a value and return a value to use in the tostring metamethod. Example: If your class has a field that stores a "tes3.skill", you can write
    tostring = function(v) return table.find(tes3.skill,v) end
    This will ensure that the "skill" field will be printed using the actual name of the skill (instead of just the numeric id). 

Note:
fields are passed in order, and the generated "comp" and "eq" metamethods will compare fields in the order they are defined. This makes it very easy to automatically generate a metamethod that compares objects first by one field, and then use subsequent fields as tie-breakers. This also means that "tostring" will print fields in the order they are defined.
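Here's a quick illustration of fields in action (a hedged sketch; the class and field names are made up, and the full Class.new syntax is documented further down this page):

local Item = Class.new{
    name = "Item",
    fields = {
        -- compared by price first, then by weight as a tie-breaker (declaration order matters)
        {"price",  eq = true, comp = true,     tostring = true, default = 0},
        {"weight", eq = true, comp = tonumber, tostring = true, default = "1"},
        {"owner",  tostring = true},
    },
}

local a = Item{ price = 10, weight = "2" }
local b = Item{ price = 10, weight = "5" }

print(a < b)       -- true: the prices tie, so the (converted) weights decide
print(tostring(a)) -- prints the class name followed by the field values, in declaration order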

Polynomial Splines

This lets you approximate a hard to compute (numerical) function with a much "simpler" approximation. Depending on how complicated the original function is, a polynomial spline can be orders of magnitude faster.

New splines can be created by passing in a function, and then specifying:
  • lower bound: the minimum value to approximate the function at. When evaluating the spline on any number smaller than this, the original function will be used.
  • upper bound: the maximum value to approximate the function at.
  • distance between points: smaller values will result in more accurate approximations, but use up more memory. Usually, something like "15" or "20" is enough for accuracy to 4 or 5 decimal places.



Verbose description of Object-oriented programming support

This mod makes object oriented programming easier by adding a robust method of creating classes, inspired by (but that still pales in comparison to) Python's dataclasses and attrs. Here are the features:


Classes and objects are allowed to behave differently


This is the main thing that inspired me to make my own system for creating new classes. The method of creating classes suggested by the online guide really blurs the line between classes and objects. (Following the instructions in the online guide, you create subclasses the same way you create objects, and it's not possible to define different metamethods for classes and objects (or at least, it wasn't clear to me).)

This is not the case here!

When creating a new class, it is now possible to specify different metamethods for classes and objects.
  • Object metamethods are defined in the "obj_metatable" parameter passed to the Class constructor.
  • Class metamethods are defined in the "cls_metatable" parameter passed to the Class constructor.

This allows classes and objects to use different metamethods.
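For example (a hedged sketch; the metamethod bodies are just placeholders to show the split):

local Widget = Class.new{
    name = "Widget",
    fields = { {"id", tostring = true} },

    -- metamethods used by instances of Widget
    obj_metatable = {
        __tostring = function(obj) return string.format("Widget #%s", obj.id) end,
    },

    -- metamethods used by the Widget class itself
    cls_metatable = {
        __tostring = function(cls) return "the Widget class" end,
    },
}

print(Widget)               -- "the Widget class"
print(Widget.new{ id = 7 }) -- "Widget #7"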

Multiple inheritance


When creating classes, you can specify a list of parent classes. New classes will inherit:
  • methods
  • metamethods (both for objects and classes themselves)
  • fields (and it's possible to individually override the converters, factories, etc.)
  • "new_obj_func", "init", and "post_init" functions.

Inheritance is done by performing a depth-first search, and pulls in things from all ancestors. Of course, things are only inherited if they weren't redefined in the new class.

Syntax for creating new classes:

You can create new classes by using

local cls = Class.new(class_params, base)

  • class_params: lets you specify various aspects of how the class should behave.
  • base: the table that the generated class will be stored in. If not provided, a new table will be made.
    Anything that would be specified in "base" can also be specified before or after creating a class.

Here's a complete example of what this could look like:

local cls
cls = Class.new(
    {
        name = name_of_the_class,  -- OPTIONAL: a string used in the default print method.
                                   -- It can also be accessed later; helpful when debugging.
        parents = {parent1, parent2, ...},  -- a list of all parent classes

        fields = {
            {"field1", eq=true, comp=tonumber, tostring=true, default="10"},
            {"field2", tostring=true},
            {"field3", converter=tonumber},
            ...
        },
        obj_metatable = {...},  -- metamethods used by objects
        cls_metatable = {...},  -- metamethods used by the class

        init = function(self, ...)
            -- do stuff with the passed parameters
            -- (this happens before `converters` and `post_init` are used)
        end,

        post_init = function(self)
            -- do stuff after initializing and converting
        end,

        new_obj_func = "obj_data_table",  -- or "no_obj_data_table", or a function that creates new objects
    },

    base  -- the table that will be modified by the `Class.new` method. If this isn't provided, a new table
          -- will be created. If it is provided (and it's a table), then `cls == base` will return true.
)


Default metamethods
All classes inherit from one base class, aptly named "Class".

NOTE: This inheritance is done after normal inheritance, meaning it will only happen if no parent classes (or any of their parent classes, etc) specified their own behavior for those metamethods.

The inherited object metamethods are:
  • __tostring: objects will print their class name, followed by all their object-specific field values, followed by default values they got from parent classes (in order of inheritance).
    Note: by default, fields beginning with an underscore "_" are not printed, and fields that are function values are also not printed (to avoid spam).
  • __concat: This one works as follows. Say we have a class (called "cls") and an object (called "obj1") of that class, and "obj1" has the following field values:
    obj1.a = 1; obj1.b = 2; obj1.c = 3; obj1.d = 4
    We can use the concat operator to create a copy of obj1 with a few fields changed. The syntax is:
    obj2 = obj1 .. {a = 4, c = 7}
    After writing the above code, we will have a new object, "obj2", with fields equal to
    obj2.a == 4; obj2.b == 2; obj2.c == 7; obj2.d == 4
    "obj2" will be an instance of the same class as "obj1", and will thus inherit all the functionality that "obj1" does.
    Note: this syntax also allows you to create clones of objects, by writing something like "obj3 = obj1 .. {}".

The inherited class metamethods are:
  • __tostring: This one does pretty much the same thing as the object "__tostring" metamethod (just skip the part about object-specific fields, because that doesn't apply here).
  • __call: This lets you create new objects by typing
    cls(NEW_OBJ_PARAMETERS)
    instead of
    cls.new(NEW_OBJ_PARAMETERS)
    The two ways of creating new objects do exactly the same thing; one is just shorthand for the other.


NOTE: The "__index" metamethod is defined for all classes and all objects, and it should not be overwritten. The "__index" metamethod is how most of this functionality is implemented, so it is extremely likely that things will break should it be altered.

Modifying object creation


By default, new objects are created by typing obj = cls.new(obj_data, ...)

The "obj_data" table then serves as the base for the newly created object, so that obj == obj_data  --> true


The "init" function: called before all converters/factories

The extra parameters ("...") are passed to the class's "init" method, if defined. This init method is called during object creation (just like in Python). For example, if "init" is defined as
cls = Class.new{other_params, ...
   init = function(self, length, width, height)
      self.volume = length * width * height
   end
}

then you can construct a new object by typing
obj = cls.new({other_stuff}, 5,10,2)

This provides a nice syntax for initializing fields that depend on values that you don't want to store in the object.


The "post_init" function: called after "init", and after all converters/factories

As you may expect, this is called after the init method.
This is a good place to check whether values were passed properly, initialize defaults, convert any fields, etc. The syntax for defining this is
cls = Class.new{other_params, ...
   post_init = function(self)
      -- do stuff here
   end
}
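For instance, here's a hedged sketch that uses "post_init" to validate fields and derive a new one (the class and field names are made up):

local Rect = Class.new{
    name = "Rect",
    fields = {
        {"width",  converter = tonumber},
        {"height", converter = tonumber},
    },
    post_init = function(self)
        -- runs after the converters, so width and height are already numbers here
        assert(self.width and self.height, "Rect needs a width and a height")
        self.area = self.width * self.height
    end,
}

local r = Rect.new{ width = "3", height = 4 }
print(r.area) -- 12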


The "new_obj_func" parameter: called before "init"

You probably won't need to use this one very often. (It's analogous to defining "__new__" in Python.)

There are two presets available:
  • "obj_data_table". this is the default value, and it allows the construction of objects using the following syntax:
    obj = cls.new(obj_data, param1, param2, ...)
  • "no_obj_data_table". This allows for a more "Pythonic" way of creating classes and objects. It will allow you construct new objects and classes by writingobj = cls.new(param1, param2, ...)
    or obj = cls(param1, param2, ...)
    Note: The passed parameters will do nothing if an "init" method is not defined. Since "converters" and "post_init" both run after "init", they will still be supported by this syntax.


Note: Subclasses will inherit the value chosen by these presets, so you don't need to specify this each time you create a subclass.

You can choose either of the two options above by writing
cls = Class.new{other_params, ...
   new_obj_func = "obj_data_table" -- or write "no_obj_data_table" for the second option
}

If you want something else, you can manually define "new_obj_func". It can take in any number of parameters, and should return a `table`, or `false`, or `nil`. If a table is returned, that table will pass through the normal object creation process. If false or nil is returned, no object will be created.
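Here's a hedged sketch of what a custom "new_obj_func" could look like (the guard condition is made up for illustration, and this assumes the arguments passed to cls.new are handed directly to "new_obj_func"):

local Account = Class.new{
    name = "Account",
    fields = { {"owner"}, {"balance", default = 0} },

    -- refuse to create an object if no owner was given;
    -- returning nil (or false) aborts object creation
    new_obj_func = function(obj_data)
        if type(obj_data) ~= "table" or obj_data.owner == nil then
            return nil
        end
        return obj_data -- this table goes through the normal init/converter/post_init process
    end,
}

local a = Account.new{ owner = "herbert" } -- a normal Account object
local b = Account.new{ balance = 100 }     -- no owner, so b is nil and no object is created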


Accessing "class_params" after object creation

Many of the values passed to `class_params` are available after objects are created. They are stored in a `__secrets` table, should you wish to access them later. For example, to find a list of parent classes, you can write
cls.__secrets.parents

Note: after a class is created, two entries will be added to `parents`:
  • `parents[0] = cls`, so that the class itself can be easily accessed (e.g. by objects of that class). This is helpful when writing `for` loops and the like.
  • `parents[#parents] = Class`. All classes inherit from `Class` itself, so this is a handy way of taking that into account when doing things like writing `for` loops.

You can also access the `obj_metatable`, `cls_metatable`, `name`, `init`, `converters`, `post_init`, `new_obj_func` parameters from the `__secrets` table after the class is created.

This is particularly useful when you want to define `init` and `post_init` methods in subclasses, and then call the implementation of those methods in the parent class as well. Doing so would look something like
post_init = function(self)
   -- do stuff before initializing the parent
   if cls.__secrets.parents[1].post_init ~= nil then -- you could also loop over all parent classes if you'd like
      cls.__secrets.parents[1].post_init(self)
   end
   -- do stuff after initializing the parent
end


Other methods defined by `Class`

`Class` has a few other methods that can be helpful when doing object-oriented programming:
  • Class.is_instance_of(obj, cls): returns true if `obj` is an instance of `cls`. (This will return false if `obj` is a subclass of `cls`.)
  • Class.is_subclass_of(cls1, cls2): returns true if `cls1` is a subclass of `cls2`. (This will return false if `cls1` is an instance of `cls2`).
  • Class.get_all_ancestors(cls): This returns a list of all ancestors of `cls`, obtained by a depth-first search. In other words, it will return a list of the form
    Class.get_all_ancestors(cls) == {[0]=cls, parent1, parent1_of_parent1, ..., parent2_of_parent1, ..., parent2, parent1_of_parent2, ..., parent2_of_parent2, ..., ..., Class}
    This is in contrast to `cls.__secrets.parents`, which only contains the direct parents of a class. More specifically,
    cls.__secrets.parents == {[0]=cls, parent1, parent2, parent3, ..., Class}
  • Class.get_class(obj): returns the class that an object belongs to. (If called on a class, it will return the class itself.)


Note: The methods above can also be accessed directly by created classes and objects (assuming they were not overwritten by the classes in question). This means the following syntax is also supported (a short example follows the list):
1) obj:is_instance_of(cls)
2) cls1:is_subclass_of(cls2)
3) cls:get_all_ancestors()
4) obj:get_class()
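For example (two made-up classes, just to show the calls):

local Animal = Class.new{ name = "Animal" }
local Dog    = Class.new{ name = "Dog", parents = { Animal } }

local rex = Dog.new{}

print(rex:is_instance_of(Dog))    -- true
print(rex:is_instance_of(Animal)) -- true (parent classes are checked too)
print(Dog:is_subclass_of(Animal)) -- true
print(rex:get_class() == Dog)     -- true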


This wraps up the documentation for creating new classes. Onto the next features.

Note: All subsequent features are implemented by using `Class`. So, they also serve as some nice examples of how `Class` can be used in practice.

Spline Interpolation

Credit to: these lecture notes on spline interpolation and these lecture notes on solving tridiagonal systems, which were used as a reference when programming spline interpolation.

For those unfamiliar with the topic, the basic idea of spline interpolation can be broken down as follows:

  • Some functions are complicated (or we don't know what they are, we only know the values of those functions at a few points).
  • Polynomials are not complicated (they are just addition and multiplication, after all).
  • It would be nice if we could approximate a complicated function with a polynomial.
  • If a function is really complicated, or if we want to approximate that function over a wide range of input values, then even the approximating polynomials get really complicated :(
  • What if we break the range of input values into a bunch of smaller intervals, then approximate each of those with a polynomial?
  • Hey, this works pretty well! We just need to make sure the different polynomials are "doing the same thing" whenever the intervals overlap.

In other words, they're a way of approximating complicated functions by using a bunch of simple functions. It can be useful if:
  • you want to use a very complicated real-valued function (i.e. takes in a number, spits out a number), but that function takes a long time to compute some values.
  • you want to plot a "smooth curve" between a bunch of points, but you don't know of a function that would do that.


Note: a lot of the builtin math functions (like `log`, `exp`, `sin`, etc) are already pretty optimized, so it's unlikely this would help (it may even be slower in some cases). But, on some particularly nasty functions (e.g. a really recursive function), a spline can be multiple orders of magnitude faster.

Creating a spline

There are two kinds of polynomial splines included in this mod: "Polynomial_Spline", and "Evenly_Spaced_Polynomial_Spline".
For most use-cases, I would recommend using the "Evenly_Spaced" one, since it can evaluate points in constant time, while the regular "Polynomial_Spline" evaluates points in linear time. (Although I may rewrite it to use a binary search algorithm in the future.)

To create a new Polynomial_Spline: write
local spline = Polynomial_Spline.new{f=function_to_interpolate, points={list_of_points}}

Then,
  • The function `f` is evaluated at each of the `points`, and these values are used to generate the approximation.
  • So, the closer together the `points` are, the better the approximation will be.
  • However, the closer together the `points` are, the more space the spline will take up in memory (probably still a negligible amount), and the longer it will take to compute values (this is the real tradeoff, but again, it only affects the normal Polynomial_Spline).

To create a new Evenly_Spaced_Polynomial_Spline:
write
local spline = Evenly_Spaced_Polynomial_Spline.new{
   f=function_to_interpolate,
   dist = some_number1,         -- the distance between the points being interpolated
   lower_bound = some_number2,  -- the smallest number to approximate the function at
   upper_bound = some_number3,  -- the highest  number to approximate the function at
}


Then,
  • A list of evenly spaced points will be created (the spacing is specified by `dist`).
  • The function `f` is evaluated at each of the `points`, and these values are used to generate the approximation.
  • So, the smaller `dist` is, the better the approximation will be.
  • However, smaller values of `dist` will take up more memory (even more so than in "Polynomial_Spline", because two arrays are used to keep track of points: one for the points themselves, and another that keeps track of which index a point is stored at).

Alternatively, you can create a new "Polynomial_Spline" or "Evenly_Spaced_Polynomial_Spline" by writing
local spline = Polynomial_Spline.new_from_xy_pairs{
      {x1,y1},
      {x2,y2},
      ...,
      {x_n,y_n},
}

(The same works by replacing "Polynomial_Spline" with "Evenly_Spaced_Polynomial_Spline", but to do this you must ensure the points are evenly spaced, i.e. consecutive points are the same fixed distance from each other.)

This will have the effect of creating a "smooth curve" that passes through the specified points.
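For example, a hedged sketch with some made-up, evenly spaced data points (the resulting spline can then be evaluated as described in the next section):

-- evenly spaced x values (0, 1, 2, 3), so either class works here
local curve = Evenly_Spaced_Polynomial_Spline.new_from_xy_pairs{
    {0, 0},
    {1, 2},
    {2, 1},
    {3, 3},
}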

Evaluating a spline

Newly made splines can be evaluated by typing
spline(number) -- this will compute the value of the function being approximated, at the specified number
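Putting it together, here's a hedged sketch that builds a spline for a deliberately slow, made-up function and then evaluates the spline instead of the original:

-- a contrived, expensive function
local function slow_f(x)
    local total = 0
    for i = 1, 100000 do
        total = total + math.sin(x + i) / i
    end
    return total
end

local spline = Evenly_Spaced_Polynomial_Spline.new{
    f = slow_f,
    dist = 0.5,
    lower_bound = 0,
    upper_bound = 100,
}

local approx = spline(42.3) -- close to slow_f(42.3), but much cheaper to compute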



Lastly, here are two other math utilities added by this library:

Polynomial

This is a class that allows you to create polynomials, add/multiply them, and evaluate them.

  • create polynomials: you can do this by writing
    p = Polynomial(1,2,3,4) -- will create the polynomial 4t^3 + 3t^2 + 2t + 1

    or by importing `Polynomial.t`, and adding/multiplying that. For example:
    local t = Polynomial.t
    local p = 4*t^3 + 3*t^2 + 2*t + 1
  • add/subtract/multiply polynomials: as seen in the example above
  • evaluate polynomials: to evaluate the polynomial `p` defined above: write
    p(some_number)
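For instance, evaluating the polynomial from the example above at t = 2 (assuming the library is loaded as `Polynomial`):

local t = Polynomial.t
local p = 4*t^3 + 3*t^2 + 2*t + 1

print(p(2)) -- 4*8 + 3*4 + 2*2 + 1 = 49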

Rational

This is a class that lets you store fractions of things, as well as add and multiply them. It also works on Polynomials, allowing them to be divided. For example:
local r = Rational(2,5)              -- creates the fraction 2/5
print(r / 10)                        --> 2/50
print( (t+1) / (t^2-1))              --> (t+1) / (t^2 - 1)
print( (t-1) / ( (t-1)^2 - 5*t^6 / (t^2-t+1) ) ) -- returns
--> (t^4 - t^3 + t^2 - 1) / (-1t^6 + t^5 - 2t^4 + 2t^3 - t^2 - t + 1)


It's also possible to `__call` a `Rational`, provided the numerator and denominator are both either numbers or things with a `__call` metamethod:
local r1 = (t-1) / ( (t-1)^2 - t^6/(t^3+t+1))
local r2 = 5/(t^3-2*t+1)

print( r1(2) ) --> -0.20754716981132
print( r2(3) ) --> 0.22727272727273