Commit b4e83e2

Release first version

1 parent 662895b commit b4e83e2

File tree

4 files changed: +71 -58 lines changed


MANIFEST.in

Lines changed: 1 addition & 0 deletions

@@ -0,0 +1 @@
+include LICENSE

README.md

Lines changed: 60 additions & 52 deletions

@@ -1,84 +1,92 @@
 
 # EmbML
+
 EmbML is a tool written in Python that automatically converts off-board-trained models into C++ source files that can be compiled and executed on low-power microcontrollers. The main goal of EmbML is to produce classifier source code that runs on resource-constrained hardware systems using bare-metal programming.
 
 This tool takes as input a classification model trained on a desktop or server computer with the WEKA or scikit-learn libraries. EmbML converts the input model into carefully crafted C++ code with support for embedded hardware, such as avoiding unnecessary use of main memory and implementing fixed-point operations for non-integer numbers.
 
-# Input Models
+## Input Models
+
 EmbML accepts a trained model through the file that contains its serialized object. For instance, a classification model built with WEKA shall be serialized to a file using the _ObjectOutputStream_ and _FileOutputStream_ classes (available in Java). Scikit-learn models shall be saved using the _dump_ function from the _pickle_ module.
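The scikit-learn half of the serialization step described above can be sketched as follows; the toy data, model choice, and file name are illustrative and not part of EmbML:

```python
import pickle
from sklearn.tree import DecisionTreeClassifier  # any supported scikit-learn classifier

# Train a tiny illustrative model: the label simply copies the first feature.
X = [[0, 0], [1, 1], [0, 1], [1, 0]]
y = [0, 1, 0, 1]
clf = DecisionTreeClassifier().fit(X, y)

# Serialize the trained model with pickle.dump, as EmbML expects.
with open("model.pkl", "wb") as f:
    pickle.dump(clf, f)

# The file written here is what would later be handed to EmbML.
with open("model.pkl", "rb") as f:
    restored = pickle.load(f)
print(restored.predict([[1, 1]])[0])  # 1
```

The pickle round trip is worth verifying once on the desktop side, since EmbML reads the serialized object rather than the live model.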

-# Supported Classification Models
+## Supported Classification Models
+
 `embml` supports off-board-trained classifiers from the following classes:
-* From WEKA:
-* _MultilayerPerceptron_ for MLP classifiers;
-* _Logistic_ for logistic regression classifiers;
-* _SMO_ for SVM classifiers -- with linear, polynomial, and RBF kernels;
-* _J48_ for decision tree classifier.
+
+* From WEKA:
+  * _MultilayerPerceptron_ for MLP classifiers;
+  * _Logistic_ for logistic regression classifiers;
+  * _SMO_ for SVM classifiers -- with linear, polynomial, and RBF kernels;
+  * _J48_ for decision tree classifiers.
 * From scikit-learn:
 * _MLPClassifier_ for MLP classifiers;
 * _LogisticRegression_ for logistic regression classifiers;
 * _LinearSVC_ for SVM classifiers with linear kernel;
 * _SVC_ for SVM classifiers -- with polynomial and RBF kernels;
 * _DecisionTreeClassifier_ for decision tree models.
 
-# Installation
+## Installation
+
 You can install `embml` from [PyPI](https://pypi.org/project/embml/):
+
 ```bash
-pip install embml
+pip install embml
 ```
+
 This tool is supported on Python 2.7 and Python 3.5, and depends on the `javaobj` library.

+## How To Use
 
-# How To Use
 ```python
 import embml
 
 # For scikit-learn models
 embml.sklearnModel(inputModel, outputFile, opts)
 
 # For WEKA models
 embml.wekaModel(inputModel, outputFile, opts)
 
 # opts can include:
 # -rules: generate a decision tree classifier code using the if-then-else format.
 # -fxp <n> <m>: generate a classifier code that uses a fixed-point format for real-number operations. Here, <n> is the number of integer bits and <m> the number of fractional bits in the Qn.m format. Note that n + m + 1 must equal 32, 16, or 8, since one bit is used to represent the sign.
 # -approx: generate an MLP classifier code that substitutes an approximation for the sigmoid activation function in the neurons.
 # -pwl <x>: generate an MLP classifier code that substitutes a piecewise-linear approximation for the sigmoid activation function. Here, <x> must be 2 (a 2-point PWL approximation) or 4 (a 4-point PWL approximation).
 
 # Examples of generating decision tree classifier codes using the if-then-else format.
 embml.wekaModel(inputDecisionTreeModel, outputFile, opts='-rules')
 embml.sklearnModel(inputDecisionTreeModel, outputFile, opts='-rules')
 
 # Examples of generating classifier codes using fixed-point formats.
 embml.wekaModel(inputModel, outputFile, opts='-fxp 21 10')  # Q21.10
 embml.sklearnModel(inputModel, outputFile, opts='-fxp 21 10')  # Q21.10
 embml.wekaModel(inputModel, outputFile, opts='-fxp 11 4')  # Q11.4
 embml.sklearnModel(inputModel, outputFile, opts='-fxp 11 4')  # Q11.4
 embml.wekaModel(inputModel, outputFile, opts='-fxp 5 2')  # Q5.2
 embml.sklearnModel(inputModel, outputFile, opts='-fxp 5 2')  # Q5.2
 
 # Examples of generating MLP classifier codes using an approximation function.
 embml.wekaModel(inputMlpModel, outputFile, opts='-approx')
 embml.sklearnModel(inputMlpModel, outputFile, opts='-approx')
 
 # Examples of generating MLP classifier codes using PWL approximations.
 embml.wekaModel(inputMlpModel, outputFile, opts='-pwl 2')
 embml.sklearnModel(inputMlpModel, outputFile, opts='-pwl 2')
 embml.wekaModel(inputMlpModel, outputFile, opts='-pwl 4')
 embml.sklearnModel(inputMlpModel, outputFile, opts='-pwl 4')
 
 # It is also possible to combine some options:
 embml.wekaModel(inputMlpModel, outputFile, opts='-fxp 21 10 -pwl 2')
 embml.sklearnModel(inputMlpModel, outputFile, opts='-fxp 21 10 -pwl 2')
 embml.wekaModel(inputDecisionTreeModel, outputFile, opts='-fxp 21 10 -rules')
 embml.sklearnModel(inputDecisionTreeModel, outputFile, opts='-fxp 21 10 -rules')
 ```
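For intuition about the Qn.m format behind the `-fxp` option, here is a minimal sketch of signed fixed-point encoding and decoding. These helpers are illustrative only; they are not part of embml, whose generated C++ code relies on the `FixedNum.h` library instead:

```python
def to_fixed(x, n, m):
    """Encode real number x as a signed Qn.m fixed-point integer.

    n integer bits + m fractional bits + 1 sign bit must fill 8, 16, or 32 bits.
    """
    assert n + m + 1 in (8, 16, 32), "one extra bit represents the sign"
    scaled = int(round(x * (1 << m)))   # shift the binary point left by m bits
    lo, hi = -(1 << (n + m)), (1 << (n + m)) - 1
    return max(lo, min(hi, scaled))     # saturate instead of overflowing

def from_fixed(q, m):
    """Decode a Qn.m integer back into a float."""
    return q / (1 << m)

# Q21.10 (32 bits total): roughly three decimal digits of fractional precision.
q = to_fixed(3.14159, 21, 10)
print(q, from_fixed(q, 10))  # 3217 3.1416015625
```

Smaller formats such as Q5.2 (8 bits) trade precision and range for memory, which is why the README shows several widths.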
 
-# Fixed-point library
+## Fixed-point library
+
 If you decide to generate a classifier code using a fixed-point format, you need to include the `FixedNum.h` library, available at [https://github.com/lucastsutsui/EmbML](https://github.com/lucastsutsui/EmbML).
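The `-pwl` option mentioned earlier swaps the sigmoid for straight line segments, which avoids computing exponentials on the microcontroller. A 2-point piecewise-linear sigmoid might look like the following sketch; the breakpoints at ±2.5 are illustrative, not embml's actual coefficients:

```python
def pwl_sigmoid_2(x):
    """2-point piecewise-linear approximation of the logistic sigmoid.

    Linear between the two breakpoints x = -2.5 and x = 2.5,
    clamped to 0 and 1 outside them (illustrative breakpoints).
    """
    if x <= -2.5:
        return 0.0
    if x >= 2.5:
        return 1.0
    return 0.2 * x + 0.5  # line through (-2.5, 0) and (2.5, 1)

print(pwl_sigmoid_2(0.0))  # 0.5
print(pwl_sigmoid_2(5.0))  # 1.0
```

A 4-point variant adds two more breakpoints for a tighter fit at the cost of extra comparisons per neuron.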

-# Citation
+## Citation
+
 If you use this tool in a scientific work, we kindly ask you to use the following reference:
 
 ```tex
@@ -91,4 +99,4 @@ If you use this tool on a scientific work, we kindly ask you to use the followin
   organization={IEEE}
 }
 ```
-
+

setup.cfg

Lines changed: 0 additions & 2 deletions

@@ -1,2 +0,0 @@
-[metadata]
-description-file = README.md

setup.py

Lines changed: 10 additions & 4 deletions

@@ -1,13 +1,18 @@
-import setuptools
+from setuptools import setup, find_packages
 
-setuptools.setup(
+with open("README.md", encoding='utf-8') as fh:
+    long_description = fh.read()
+
+setup(
     name="embml",
-    version="0.0.2",
+    version="0.0.4",
     author="Lucas Tsutsui da Silva",
     author_email="lucastsui@hotmail.com",
     description="A tool to support using classification models in low-power microcontroller-based hardware",
+    long_description=long_description,
+    long_description_content_type="text/markdown",
     url="https://github.com/lucastsutsui/embml",
-    packages=setuptools.find_packages(),
+    packages=find_packages(),
     classifiers=[
         "Programming Language :: Python :: 2.7",
         "Programming Language :: Python :: 3.5",
@@ -26,4 +31,5 @@
         'embedded'
     ],
     python_requires='>=2.7',
+    include_package_data=True,
 )
