Prepare 1.6.5 release. (#366)
* Prepare 1.6.5 release.

* add pytest in dependencies

* disable coverage in CI

* pytest path issue on Windows

* one more try on version.

* debug the CI build issue.

* using python -m pip

* installation

* restore the docs sub-folder.
wenbingl authored Jan 28, 2020
1 parent 89b2c2c commit 68a04ee
Showing 17 changed files with 138 additions and 251 deletions.
15 changes: 6 additions & 9 deletions .azure-pipelines/linux-CI-nightly.yml
@@ -35,15 +35,12 @@ jobs:
conda install -c conda-forge protobuf
conda install -c conda-forge numpy
conda install -c conda-forge cmake
pip install $(ONNX_PATH)
git clone https://github.com/microsoft/onnxconverter-common
cd onnxconverter-common
pip install -e .
cd ..
pip install -r requirements.txt
pip install -r requirements-dev.txt
pip install $(ORT_PATH)
pip install pytest
python -m pip install $(ONNX_PATH)
python -m pip install git+https://github.com/microsoft/onnxconverter-common
python -m pip install -r requirements.txt
python -m pip install -r requirements-dev.txt
python -m pip install $(ORT_PATH)
python -m pip install pytest
git clone --recursive https://github.com/cjlin1/libsvm libsvm
cd libsvm
make lib
9 changes: 0 additions & 9 deletions .azure-pipelines/linux-conda-CI.yml
@@ -71,15 +71,6 @@ jobs:
pytest tests --ignore=tests/sparkml --doctest-modules --junitxml=junit/test-results.xml
displayName: 'pytest - onnxmltools'
- script: |
export PYTHONPATH=$PYTHONPATH:libsvm/python
python -c "import onnxconverter_common"
python -c "import onnxmltools"
coverage run --include=onnxmltools/** tests/main.py
coverage report -m
coverage html
displayName: 'coverage'
- task: PublishTestResults@2
inputs:
testResultsFiles: '**/test-results.xml'
2 changes: 1 addition & 1 deletion .azure-pipelines/win32-CI-nightly.yml
@@ -58,7 +58,7 @@ jobs:
call activate py$(python.version)
set PYTHONPATH=libsvm\python;%PYTHONPATH%
pip install -e .
pytest tests --ignore=tests/sparkml --doctest-modules --junitxml=junit/test-results.xml
python -m pytest tests --ignore=tests/sparkml --doctest-modules --junitxml=junit/test-results.xml
displayName: 'pytest - onnxmltools'
- task: PublishTestResults@2
19 changes: 9 additions & 10 deletions .azure-pipelines/win32-conda-CI.yml
@@ -55,29 +55,28 @@ jobs:
call activate py$(python.version)
python -m pip install --upgrade pip numpy
echo Test numpy installation... && python -c "import numpy"
pip install %COREML_PATH% %ONNX_PATH%
git clone https://github.com/microsoft/onnxconverter-common
cd onnxconverter-common
pip install -e .
cd ..
python -m pip install %COREML_PATH% %ONNX_PATH%
python -m pip install git+https://github.com/microsoft/onnxconverter-common
echo Test onnxconverter-common installation... && python -c "import onnxconverter_common"
pip install -r requirements.txt
pip install -r requirements-dev.txt
pip install %ONNXRT_PATH%
python -m pip install -r requirements.txt
python -m pip install -r requirements-dev.txt
python -m pip install %ONNXRT_PATH%
echo Test onnxruntime installation... && python -c "import onnxruntime"
REM install libsvm from github
git clone --recursive https://github.com/cjlin1/libsvm libsvm
copy libsvm\windows\*.dll libsvm\python
set PYTHONPATH=libsvm\python;%PYTHONPATH%
dir libsvm\python
echo Test libsvm installation... && python -c "import svmutil"
echo "debug environment" && path
python -m pip show pytest
displayName: 'Install dependencies'
- script: |
call activate py$(python.version)
set PYTHONPATH=libsvm\python;%PYTHONPATH%
pip install -e .
pytest tests --ignore=tests/sparkml --doctest-modules --junitxml=junit/test-results.xml
python -m pip install -e .
python -m pytest tests --ignore=tests/sparkml --doctest-modules --junitxml=junit/test-results.xml
displayName: 'pytest - onnxmltools'
- task: PublishTestResults@2
1 change: 1 addition & 0 deletions .gitignore
@@ -25,6 +25,7 @@ build/
# test generated files
.pytest_cache
.cache
tests/temp
tests/utils/models/coreml_OneHotEncoder_BikeSharing_new.json
tests/utils/models/coreml_OneHotEncoder_BikeSharing2.onnx
tests/baseline/outmodels
17 changes: 4 additions & 13 deletions README.md
@@ -16,6 +16,7 @@ ONNXMLTools enables you to convert models from different machine learning toolkits
* libsvm
* XGBoost
* H2O
<p>PyTorch has its own built-in ONNX exporter; check <a href="https://pytorch.org/docs/stable/onnx.html">here</a> for details.</p>
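For illustration only, a minimal sketch of that built-in exporter (the torchvision model and input shape below are assumptions, not something this commit introduces):

```python
import torch
import torchvision

# Illustrative model and dummy input; any torch.nn.Module is exported the same way.
model = torchvision.models.resnet18(pretrained=True).eval()
dummy_input = torch.randn(1, 3, 224, 224)

# torch.onnx.export traces the model and writes the ONNX file directly,
# so onnxmltools is not involved for PyTorch models.
torch.onnx.export(model, dummy_input, "resnet18.onnx", opset_version=11)
```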

## Install
You can install the latest release of ONNXMLTools from [PyPi](https://pypi.org/project/onnxmltools/):
@@ -96,11 +97,6 @@ onnx_model = onnxmltools.convert_coreml(coreml_model, 'Example Model')
onnxmltools.utils.save_model(onnx_model, 'example.onnx')
```

## Spark ML to ONNX Conversion (experimental)
Please refer to the following documents:
* [Conversion Framework](onnxmltools/convert/README.md)
* [Spark ML to ONNX Model Conversion](onnxmltools/convert/sparkml/README.md)

## H2O to ONNX Conversion
Below is a code snippet to convert an H2O MOJO model into an ONNX model. The only prerequisite is to have a MOJO model saved on the local file system.
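The snippet itself is collapsed in this diff view; a minimal sketch, assuming a MOJO has already been exported to the hypothetical path `gbm_mojo.zip`, would look roughly like:

```python
import onnxmltools

# 'gbm_mojo.zip' is a hypothetical MOJO file exported from H2O beforehand.
onnx_model = onnxmltools.convert_h2o('gbm_mojo.zip')
onnxmltools.utils.save_model(onnx_model, 'h2o_gbm.onnx')
```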

@@ -136,17 +132,12 @@ Documentation for the [ONNX Model format](https://github.com/onnx/onnx) and more

## Test all existing converters

There exists a way
to automatically check every converter with
All converter unit tests can generate both the original model and the converted model, which are then automatically checked with
[onnxruntime](https://pypi.org/project/onnxruntime/) or
[onnxruntime-gpu](https://pypi.org/project/onnxruntime-gpu/).
This process requires the user to clone the *onnxmltools* repository.
The following command runs all unit tests and generates
dumps of models, inputs, expected outputs and converted models
in folder ``TESTDUMP``.

The unit test cases are all standard Python unit tests; you can run them from the pytest command line, for example:
```
python tests/main.py DUMP
python -m pytest --ignore .\tests\
```

It requires *onnxruntime*, *numpy* for most models,
2 changes: 1 addition & 1 deletion docs/index.rst
@@ -55,7 +55,7 @@ to automatically check every converter with
from onnxmltools import convert_sklearn
from onnxmltools.utils import save_model
from onnxconverter_common.data_types import FloatTensorType
initial_type = [('float_input', FloatTensorType([-1, 4]))]
initial_type = [('float_input', FloatTensorType([1, 4]))]
onx = convert_sklearn(clr, initial_types=initial_type)
save_model(onx, "rf_iris.onnx")

4 changes: 1 addition & 3 deletions docs/tutorial.rst
@@ -54,7 +54,7 @@ to convert other model formats into ONNX. Here we will use
from onnxmltools.utils import save_model
from onnxmltools.convert.common.data_types import FloatTensorType

initial_type = [('float_input', FloatTensorType([-1, 4]))]
initial_type = [('float_input', FloatTensorType([1, 4]))]
onx = convert_sklearn(clr, initial_types=initial_type)
save_model(onx, "logreg_iris.onnx")

@@ -67,9 +67,7 @@ for this machine learning model.
::

import onnxruntime as rt
import numpy
sess = rt.InferenceSession("logreg_iris.onnx")
label_name = sess.get_outputs()[0].name
input_name = sess.get_inputs()[0].name

pred_onx = sess.run([label_name], {input_name: X_test.astype(numpy.float32)})[0]
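Since the training cell is collapsed in this diff view, here is a self-contained sketch consistent with the two tutorial fragments above; the LogisticRegression settings and the train/test split are assumptions:

```python
import numpy
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

from onnxmltools import convert_sklearn
from onnxmltools.utils import save_model
from onnxmltools.convert.common.data_types import FloatTensorType

# Train a small classifier (assumed setup; the tutorial's collapsed cell does the equivalent).
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clr = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Convert with the fixed input shape used above and save the model.
initial_type = [('float_input', FloatTensorType([1, 4]))]
onx = convert_sklearn(clr, initial_types=initial_type)
save_model(onx, "logreg_iris.onnx")

# Score with onnxruntime; a single row keeps the batch size at the declared value of 1.
import onnxruntime as rt
sess = rt.InferenceSession("logreg_iris.onnx")
label_name = sess.get_outputs()[0].name
input_name = sess.get_inputs()[0].name
pred_onx = sess.run([label_name], {input_name: X_test[:1].astype(numpy.float32)})[0]
print(pred_onx)
```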
2 changes: 1 addition & 1 deletion onnxmltools/__init__.py
@@ -9,7 +9,7 @@
This framework converts any machine learned model into onnx format
which is a common language to describe any machine learned model.
"""
__version__ = "1.6.0"
__version__ = "1.6.5"
__author__ = "Microsoft"
__producer__ = "OnnxMLTools"
__producer_version__ = __version__
103 changes: 88 additions & 15 deletions onnxmltools/convert/main.py
@@ -7,9 +7,11 @@
from ..proto import onnx
from .common import utils
import warnings
import importlib


def convert_coreml(model, name=None, initial_types=None, doc_string='', target_opset=None,
targeted_onnx=onnx.__version__ , custom_conversion_functions=None, custom_shape_calculators=None):
targeted_onnx=onnx.__version__, custom_conversion_functions=None, custom_shape_calculators=None):
if not utils.coreml_installed():
raise RuntimeError('coremltools is not installed. Please install coremltools to use this feature.')

@@ -33,7 +35,7 @@ def convert_keras(model, name=None, initial_types=None, doc_string='',


def convert_libsvm(model, name=None, initial_types=None, doc_string='', target_opset=None,
targeted_onnx=onnx.__version__, custom_conversion_functions=None, custom_shape_calculators=None):
targeted_onnx=onnx.__version__, custom_conversion_functions=None, custom_shape_calculators=None):
if not utils.libsvm_installed():
raise RuntimeError('libsvm is not installed. Please install libsvm to use this feature.')

@@ -62,7 +64,8 @@ def convert_sklearn(model, name=None, initial_types=None, doc_string='', target_

from skl2onnx.convert import convert_sklearn as convert_skl2onnx
return convert_skl2onnx(model, name, initial_types, doc_string, target_opset,
custom_conversion_functions, custom_shape_calculators)
custom_conversion_functions, custom_shape_calculators)


def convert_sparkml(model, name=None, initial_types=None, doc_string='', target_opset=None,
targeted_onnx=onnx.__version__, custom_conversion_functions=None,
@@ -74,18 +77,6 @@ def convert_sparkml(model, name=None, initial_types=None, doc_string='', target_
return convert(model, name, initial_types, doc_string, target_opset, targeted_onnx,
custom_conversion_functions, custom_shape_calculators, spark_session)

def convert_tensorflow(frozen_graph_def,
name=None, input_names=None, output_names=None,
doc_string='',
target_opset=None,
channel_first_inputs=None,
debug_mode=False, custom_op_conversions=None):
if not utils.keras2onnx_installed():
raise RuntimeError('keras2onnx is not installed. Please install it to use this feature.')

from keras2onnx import convert_tensorflow as convert
return convert(frozen_graph_def, name, input_names, output_names, doc_string,
target_opset, channel_first_inputs, debug_mode, custom_op_conversions)

def convert_xgboost(*args, **kwargs):
if not utils.xgboost_installed():
@@ -94,9 +85,91 @@ def convert_xgboost(*args, **kwargs):
from .xgboost.convert import convert
return convert(*args, **kwargs)


def convert_h2o(*args, **kwargs):
if not utils.h2o_installed():
raise RuntimeError('h2o is not installed. Please install h2o to use this feature.')

from .h2o.convert import convert
return convert(*args, **kwargs)


def _collect_input_nodes(graph, outputs):
    nodes_to_keep = set()
    input_nodes = set()
    node_inputs = [graph.get_tensor_by_name(ts_).op for ts_ in outputs]
    while node_inputs:
        nd_ = node_inputs[0]
        del node_inputs[0]
        if nd_.type in ['Placeholder', "PlaceholderV2", 'PlaceholderWithDefault']:
            input_nodes.add(nd_)
        if nd_ in nodes_to_keep:
            continue

        nodes_to_keep.add(nd_)
        node_inputs.extend(in_.op for in_ in nd_.inputs)

    return input_nodes, nodes_to_keep


def _convert_tf_wrapper(frozen_graph_def,
                        name=None, input_names=None, output_names=None,
                        doc_string='',
                        target_opset=None,
                        channel_first_inputs=None,
                        debug_mode=False, custom_op_conversions=None):
    """
    convert a tensorflow graph def into a ONNX model proto, just like how keras does.
    :param graph_def: the frozen tensorflow graph
    :param name: the converted onnx model internal name
    :param input_names: the inputs name list of the model
    :param output_names: the output name list of the model
    :param doc_string: doc string
    :param target_opset: the targeted onnx model opset
    :param channel_first_inputs: A list of channel first input (not supported yet)
    :param debug_mode: will enable the log and try to convert as much as possible on conversion
    :return an ONNX ModelProto
    """
    import tensorflow as tf
    import tf2onnx

    if target_opset is None:
        target_opset = onnx.defs.onnx_opset_version()

    if not doc_string:
        doc_string = "converted from {}".format(name)

    tf_graph_def = tf2onnx.tfonnx.tf_optimize(input_names, output_names, frozen_graph_def, True)
    with tf.Graph().as_default() as tf_graph:
        tf.import_graph_def(tf_graph_def, name='')

    if not input_names:
        input_nodes = list(_collect_input_nodes(tf_graph, output_names)[0])
        input_names = [nd_.outputs[0].name for nd_ in input_nodes]
    g = tf2onnx.tfonnx.process_tf_graph(tf_graph,
                                        continue_on_error=debug_mode,
                                        opset=target_opset,
                                        custom_op_handlers=custom_op_conversions,
                                        inputs_as_nchw=channel_first_inputs,
                                        output_names=output_names,
                                        input_names=input_names)

    onnx_graph = tf2onnx.optimizer.optimize_graph(g)
    model_proto = onnx_graph.make_model(doc_string)

    return model_proto


def convert_tensorflow(frozen_graph_def,
                       name=None, input_names=None, output_names=None,
                       doc_string='',
                       target_opset=None,
                       channel_first_inputs=None,
                       debug_mode=False, custom_op_conversions=None):
    try:
        importlib.import_module('tf2onnx')
    except (ImportError, ModuleNotFoundError) as e:
        raise RuntimeError('tf2onnx is not installed, please install it before calling this function.')

    return _convert_tf_wrapper(frozen_graph_def, name, input_names, output_names, doc_string,
                               target_opset, channel_first_inputs, debug_mode, custom_op_conversions)
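For context, a hedged usage sketch of this new entry point, assuming it is re-exported at the package level as in previous releases; the frozen-graph path and output tensor name are hypothetical, and both tensorflow and tf2onnx must be installed:

```python
import tensorflow as tf
import onnxmltools

# Load a previously frozen graph; "frozen_model.pb" and "logits:0" are assumed names.
with tf.io.gfile.GFile("frozen_model.pb", "rb") as f:
    graph_def = tf.compat.v1.GraphDef()
    graph_def.ParseFromString(f.read())

# input_names is left out here; the wrapper can discover Placeholder inputs itself.
onnx_model = onnxmltools.convert_tensorflow(graph_def,
                                            name="frozen_model",
                                            output_names=["logits:0"])
onnxmltools.utils.save_model(onnx_model, "frozen_model.onnx")
```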
2 changes: 1 addition & 1 deletion onnxmltools/utils/tests_helper.py
@@ -70,7 +70,7 @@ def dump_data_and_model(data, model, onnx=None, basename="model", folder=None,
runtime_test = dict(model=model, data=data)

if folder is None:
folder = os.environ.get('ONNXTESTDUMP', 'tests')
folder = os.environ.get('ONNXTESTDUMP', 'tests/temp')
if not os.path.exists(folder):
os.makedirs(folder)

4 changes: 1 addition & 3 deletions requirements-dev.txt
@@ -7,9 +7,7 @@ numpy
openpyxl
pandas
protobuf
pyspark==2.3.2; sys_platform == 'win32'
pyspark; sys_platform == 'linux'
pyspark; python_version <= '2.7'
pytest
pytest-cov
scikit-learn
scipy
2 changes: 1 addition & 1 deletion requirements.txt
@@ -1,7 +1,7 @@
keras2onnx
numpy
onnx
onnxconverter-common>=1.6.1
onnxconverter-common>=1.6.5
protobuf
six
skl2onnx
Empty file removed tests/__init__.py
Empty file.