diff --git a/LICENSE b/LICENSE index 8973bf7..a6a8584 100644 --- a/LICENSE +++ b/LICENSE @@ -1,41 +1,217 @@ -SAP SAMPLE CODE LICENSE AGREEMENT - -Please scroll down and read the following SAP Sample Code License Agreement carefully ("Agreement"). By downloading, installing, or otherwise using the SAP sample code or any materials that accompany the sample code documentation (collectively, the "Sample Code"), You agree that this Agreement forms a legally binding agreement between You ("You" or "Your") and SAP SE, for and on behalf of itself and its subsidiaries and affiliates (as defined in Section 15 of the German Stock Corporation Act), and You agree to be bound by all of the terms and conditions stated in this Agreement. If You are trying to access or download the Sample Code on behalf of Your employer or as a consultant or agent of a third party (either "Your Company"), You represent and warrant that You have the authority to act on behalf of and bind Your Company to the terms of this Agreement and everywhere in this Agreement that refers to 'You' or 'Your' shall also include Your Company. If You do not agree to these terms, do not attempt to access or use the Sample Code. - -1. LICENSE: Subject to the terms of this Agreement, SAP grants You a non-exclusive, non-transferable, non-sublicensable, revocable, royalty-free, limited license to use, copy, and modify the Sample Code solely for Your internal business purposes. - -2. RESTRICTIONS: You must not use the Sample Code to: (a) impair, degrade or reduce the performance or security of any SAP products, services or related technology (collectively, "SAP Products"); (b) enable the bypassing or circumventing of SAP's license restrictions and/or provide users with access to the SAP Products to which such users are not licensed; or (c) permit mass data extraction from an SAP Product to a non-SAP Product, including use, modification, saving or other processing of such data in the non-SAP Product. 
Further, You must not: (i) provide or make the Sample Code available to any third party other than your authorized employees, contractors and agents (collectively, “Representatives”) and solely to be used by Your Representatives for Your own internal business purposes; ii) remove or modify any marks or proprietary notices from the Sample Code; iii) assign this Agreement, or any interest therein, to any third party; (iv) use any SAP name, trademark or logo without the prior written authorization of SAP; or (v) use the Sample Code to modify an SAP Product or decompile, disassemble or reverse engineer an SAP Product (except to the extent permitted by applicable law). You are responsible for any breach of the terms of this Agreement by You or Your Representatives. - -3. INTELLECTUAL PROPERTY: SAP or its licensors retain all ownership and intellectual property rights in and to the Sample Code and SAP Products. In exchange for the right to use, copy and modify the Sample Code provided under this Agreement, You covenant not to assert any intellectual property rights in or to any of Your products, services, or related technology that are based on or incorporate the Sample Code against any individual or entity in respect of any current or future SAP Products. - -4. SAP AND THIRD PARTY APIS: The Sample Code may include API (application programming interface) calls to SAP and third-party products or services. The access or use of the third-party products and services to which the API calls are directed may be subject to additional terms and conditions between you and SAP or such third parties. You (and not SAP) are solely responsible for understanding and complying with any additional terms and conditions that apply to the access or use of those APIs and/or third-party products and services. SAP does not grant You any rights in or to these APIs, products or services under this Agreement. - -5. 
FREE AND OPEN SOURCE COMPONENTS: The Sample Code may include third party free or open source components ("FOSS Components"). You may have additional rights in such FOSS Components that are provided by the third party licensors of those components. - -6. THIRD PARTY DEPENDENCIES: The Sample Code may require third party software dependencies ("Dependencies") for the use or operation of the Sample Code. These Dependencies may be identified by SAP in Maven POM files, documentation or by other means. SAP does not grant You any rights in or to such Dependencies under this Agreement. You are solely responsible for the acquisition, installation and use of such Dependencies. - -7. WARRANTY: -a) If You are located outside the US or Canada: AS THE SAMPLE CODE IS PROVIDED TO YOU FREE OF CHARGE, SAP DOES NOT GUARANTEE OR WARRANT ANY FEATURES OR QUALITIES OF THE SAMPLE CODE OR GIVE ANY UNDERTAKING WITH REGARD TO ANY OTHER QUALITY. NO SUCH WARRANTY OR UNDERTAKING SHALL BE IMPLIED BY YOU FROM ANY DESCRIPTION IN THE SAMPLE CODE OR ANY OTHER MATERIALS, COMMUNICATION OR ADVERTISEMENT. IN PARTICULAR, SAP DOES NOT WARRANT THAT THE SAMPLE CODE WILL BE AVAILABLE UNINTERRUPTED, ERROR FREE, OR PERMANENTLY AVAILABLE. ALL WARRANTY CLAIMS RESPECTING THE SAMPLE CODE ARE SUBJECT TO THE LIMITATION OF LIABILITY STIPULATED IN SECTION 8 BELOW. -b) If You are located in the US or Canada: THE SAMPLE CODE IS LICENSED TO YOU "AS IS", WITHOUT ANY WARRANTY, ESCROW, TRAINING, MAINTENANCE, OR SERVICE OBLIGATIONS WHATSOEVER ON THE PART OF SAP. SAP MAKES NO EXPRESS OR IMPLIED WARRANTIES OR CONDITIONS OF SALE OF ANY TYPE WHATSOEVER, INCLUDING BUT NOT LIMITED TO IMPLIED WARRANTIES OF MERCHANTABILITY AND OF FITNESS FOR A PARTICULAR PURPOSE. IN PARTICULAR, SAP DOES NOT WARRANT THAT THE SAMPLE CODE WILL BE AVAILABLE UNINTERRUPTED, ERROR FREE, OR PERMANENTLY AVAILABLE. 
YOU ASSUME ALL RISKS ASSOCIATED WITH THE USE OF THE SAMPLE CODE, INCLUDING WITHOUT LIMITATION RISKS RELATING TO QUALITY, AVAILABILITY, PERFORMANCE, DATA LOSS, AND UTILITY IN A PRODUCTION ENVIRONMENT. -c) For all locations: SAP DOES NOT MAKE ANY REPRESENTATIONS OR WARRANTIES IN RESPECT OF THIRD PARTY DEPENDENCIES, APIS, PRODUCTS AND SERVICES, INCLUDING BUT NOT LIMITED TO IMPLIED WARRANTIES OF MERCHANTABILITY AND OF FITNESS FOR A PARTICULAR PURPOSE. IN PARTICULAR, SAP DOES NOT WARRANT THAT THIRD-PARTY DEPENDENCIES, APIS, PRODUCTS AND SERVICES WILL BE AVAILABLE, ERROR FREE, INTEROPERABLE WITH THE SAMPLE CODE, SUITABLE FOR ANY PARTICULAR PURPOSE OR NON-INFRINGING. YOU ASSUME ALL RISKS ASSOCIATED WITH THE USE OF THIRD PARTY DEPENDENCIES, APIS, PRODUCTS AND SERVICES, INCLUDING WITHOUT LIMITATION RISKS RELATING TO QUALITY, AVAILABILITY, PERFORMANCE, DATA LOSS, UTILITY IN A PRODUCTION ENVIRONMENT, AND NON-INFRINGEMENT. IN NO EVENT WILL SAP BE LIABLE DIRECTLY OR INDIRECTLY IN RESPECT OF ANY USE OF THIRD PARTY DEPENDENCIES, APIS, PRODUCTS AND SERVICES BY YOU. - -8. LIMITATION OF LIABILITY: -a) If You are located outside the US or Canada: IRRESPECTIVE OF THE LEGAL REASONS, SAP SHALL ONLY BE LIABLE FOR DAMAGES UNDER THIS AGREEMENT IF SUCH DAMAGE (I) CAN BE CLAIMED UNDER THE GERMAN PRODUCT LIABILITY ACT OR (II) IS CAUSED BY INTENTIONAL MISCONDUCT OF SAP OR (III) CONSISTS OF PERSONAL INJURY. IN ALL OTHER CASES, NEITHER SAP NOR ITS EMPLOYEES, AGENTS AND SUBCONTRACTORS SHALL BE LIABLE FOR ANY KIND OF DAMAGE OR CLAIMS HEREUNDER. -b) If You are located in the US or Canada: IN NO EVENT SHALL SAP BE LIABLE TO YOU, YOUR COMPANY OR TO ANY THIRD PARTY FOR ANY DAMAGES IN AN AMOUNT IN EXCESS OF $100 ARISING IN CONNECTION WITH YOUR USE OF OR INABILITY TO USE THE SAMPLE CODE OR IN CONNECTION WITH SAP'S PROVISION OF OR FAILURE TO PROVIDE SERVICES PERTAINING TO THE SAMPLE CODE, OR AS A RESULT OF ANY DEFECT IN THE SAMPLE CODE. 
THIS DISCLAIMER OF LIABILITY SHALL APPLY REGARDLESS OF THE FORM OF ACTION THAT MAY BE BROUGHT AGAINST SAP, WHETHER IN CONTRACT OR TORT, INCLUDING WITHOUT LIMITATION ANY ACTION FOR NEGLIGENCE. YOUR SOLE REMEDY IN THE EVENT OF BREACH OF THIS AGREEMENT BY SAP OR FOR ANY OTHER CLAIM RELATED TO THE SAMPLE CODE SHALL BE TERMINATION OF THIS AGREEMENT. NOTWITHSTANDING ANYTHING TO THE CONTRARY HEREIN, UNDER NO CIRCUMSTANCES SHALL SAP OR ITS LICENSORS BE LIABLE TO YOU OR ANY OTHER PERSON OR ENTITY FOR ANY SPECIAL, INCIDENTAL, CONSEQUENTIAL, OR INDIRECT DAMAGES, LOSS OF GOOD WILL OR BUSINESS PROFITS, WORK STOPPAGE, DATA LOSS, COMPUTER FAILURE OR MALFUNCTION, ANY AND ALL OTHER COMMERCIAL DAMAGES OR LOSS, OR EXEMPLARY OR PUNITIVE DAMAGES. - -9. INDEMNITY: You will fully indemnify, hold harmless and defend SAP against law suits based on any claim: (a) that any of Your products, services or related technology that are based on or incorporate the Sample Code infringes or misappropriates any patent, copyright, trademark, trade secrets, or other proprietary rights of a third party, or (b) related to Your alleged violation of the terms of this Agreement. - -10. EXPORT: The Sample Code is subject to German, EU and US export control regulations. You confirm that: a) You will not use the Sample Code for, and will not allow the Sample Code to be used for, any purposes prohibited by German, EU and US law, including, without limitation, for the development, design, manufacture or production of nuclear, chemical or biological weapons of mass destruction; b) You are not located in Cuba, Iran, Sudan, Iraq, North Korea, Syria, nor any other country to which the United States has prohibited export or that has been designated by the U.S. 
Government as a "terrorist supporting" country (any, an "US Embargoed Country"); c) You are not a citizen, national or resident of, and are not under the control of, a US Embargoed Country; d) You will not download or otherwise export or re-export the Sample Code, directly or indirectly, to a US Embargoed Country nor to citizens, nationals or residents of a US Embargoed Country; e) You are not listed on the United States Department of Treasury lists of Specially Designated Nationals, Specially Designated Terrorists, and Specially Designated Narcotic Traffickers, nor listed on the United States Department of Commerce Table of Denial Orders or any other U.S. government list of prohibited or restricted parties and f) You will not download or otherwise export or re-export the Sample Code, directly or indirectly, to persons on the above-mentioned lists. - -11. SUPPORT: SAP does not offer support for the Sample Code. - -12. TERM AND TERMINATION: You may terminate this Agreement by destroying all copies of the Sample Code in Your possession or control. SAP may terminate Your license to use the Sample Code immediately if You fail to comply with any of the terms of this Agreement, or, for SAP's convenience by providing you with ten (10) days written notice of termination. In case of termination or expiration of this Agreement, You must immediately destroy all copies of the Sample Code in your possession or control. In the event Your Company is acquired (by merger, purchase of stock, assets or intellectual property or exclusive license), or You become employed, by a direct competitor of SAP, then this Agreement and all licenses granted to You in this Agreement shall immediately terminate upon the date of such acquisition or change of employment. - -13. LAW/VENUE: -a) If You are located outside the US or Canada: This Agreement is governed by and construed in accordance with the laws of Germany without reference to its conflicts of law principles. 
You and SAP agree to submit to the exclusive jurisdiction of, and venue in, the courts located in Karlsruhe, Germany in any dispute arising out of or relating to this Agreement or the Sample Code. The United Nations Convention on Contracts for the International Sale of Goods shall not apply to this Agreement. -b) If You are located in the US or Canada: This Agreement shall be governed by and construed in accordance with the laws of the State of New York, USA without reference to its conflicts of law principles. You and SAP agree to submit to the exclusive jurisdiction of, and venue in, the courts located in New York, New York, USA in any dispute arising out of or relating to this Agreement or the Sample Code. The United Nations Convention on Contracts for the International Sale of Goods shall not apply to this Agreement. - -14. MISCELLANEOUS: This Agreement is the complete agreement between the parties respecting the Sample Code. This Agreement supersedes all prior or contemporaneous agreements or representations with regards to the Sample Code. If any term of this Agreement is found to be invalid or unenforceable, the surviving provisions shall remain effective. SAP's failure to enforce any right or provisions stipulated in this Agreement will not constitute a waiver of such provision, or any other provision of this Agreement. - - -v1.0-071618 + Apache License + Version 2.0, January 2004 + http://www.apache.org/licenses/ + + TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + + 1. Definitions. + + "License" shall mean the terms and conditions for use, reproduction, + and distribution as defined by Sections 1 through 9 of this document. + + "Licensor" shall mean the copyright owner or entity authorized by + the copyright owner that is granting the License. + + "Legal Entity" shall mean the union of the acting entity and all + other entities that control, are controlled by, or are under common + control with that entity. 
For the purposes of this definition, + "control" means (i) the power, direct or indirect, to cause the + direction or management of such entity, whether by contract or + otherwise, or (ii) ownership of fifty percent (50%) or more of the + outstanding shares, or (iii) beneficial ownership of such entity. + + "You" (or "Your") shall mean an individual or Legal Entity + exercising permissions granted by this License. + + "Source" form shall mean the preferred form for making modifications, + including but not limited to software source code, documentation + source, and configuration files. + + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. + + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). + + "Derivative Works" shall mean any work, whether in Source or Object + form, that is based on (or derived from) the Work and for which the + editorial revisions, annotations, elaborations, or other modifications + represent, as a whole, an original work of authorship. For the purposes + of this License, Derivative Works shall not include works that remain + separable from, or merely link (or bind by name) to the interfaces of, + the Work and Derivative Works thereof. + + "Contribution" shall mean any work of authorship, including + the original version of the Work and any modifications or additions + to that Work or Derivative Works thereof, that is intentionally + submitted to Licensor for inclusion in the Work by the copyright owner + or by an individual or Legal Entity authorized to submit on behalf of + the copyright owner. 
For the purposes of this definition, "submitted" + means any form of electronic, verbal, or written communication sent + to the Licensor or its representatives, including but not limited to + communication on electronic mailing lists, source code control systems, + and issue tracking systems that are managed by, or on behalf of, the + Licensor for the purpose of discussing and improving the Work, but + excluding communication that is conspicuously marked or otherwise + designated in writing by the copyright owner as "Not a Contribution." + + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + + 2. Grant of Copyright License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + + 3. Grant of Patent License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. 
If You + institute patent litigation against any entity (including a + cross-claim or counterclaim in a lawsuit) alleging that the Work + or a Contribution incorporated within the Work constitutes direct + or contributory patent infringement, then any patent licenses + granted to You under this License for that Work shall terminate + as of the date such litigation is filed. + + 4. Redistribution. You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or + Derivative Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative Works, in at least one + of the following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. The contents + of the NOTICE file are for informational purposes only and + do not modify the License. 
You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. + + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + + 5. Submission of Contributions. Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + + 6. Trademarks. This License does not grant permission to use the trade + names, trademarks, service marks, or product names of the Licensor, + except as required for reasonable and customary use in describing the + origin of the Work and reproducing the content of the NOTICE file. + + 7. Disclaimer of Warranty. Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this License. + + 8. 
Limitation of Liability. In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + + 9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. + + END OF TERMS AND CONDITIONS + + APPENDIX: How to apply the Apache License to your work. + + To apply the Apache License to your work, attach the following + boilerplate notice, with the fields enclosed by brackets "[]" + replaced with your own identifying information. (Don't include + the brackets!) The text should be enclosed in the appropriate + comment syntax for the file format. We also recommend that a + file or class name and description of purpose be included on the + same "printed page" as the copyright notice for easier + identification within third-party archives. 
+ + Copyright [yyyy] [name of copyright owner] + + Licensed under the Apache License, Version 2.0 (the "License"); + you may not use this file except in compliance with the License. + You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, software + distributed under the License is distributed on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + See the License for the specific language governing permissions and + limitations under the License. + +-------------------------------------------------------------------------- +APIs + +This project may include APIs to SAP or third-party products or services. +The use of these APIs, products and services may be subject to additional +agreements. In no event shall the application of the Apache Software +License, v.2 to this project grant any rights in or to these APIs, +products or services that would alter, expand, be inconsistent with, +or supersede any terms of these additional agreements. “API” means +application programming interfaces, as well as their respective +specifications and implementing code that allows other software +products to communicate with or call on SAP or third party products +or services (for example, SAP Enterprise Services, BAPIs, Idocs, RFCs +and ABAP calls or other user exits) and may be made available through +SAP or third party products, SDKs, documentation or other media. diff --git a/NOTICE b/NOTICE index 850cc7d..8adf051 100644 --- a/NOTICE +++ b/NOTICE @@ -1 +1 @@ -Copyright (c) 2018 SAP SE or an SAP affiliate company. All rights reserved. \ No newline at end of file +Copyright (c) 2018 SAP SE or an SAP affiliate company. All rights reserved. diff --git a/README.md b/README.md index cf85ffd..807bded 100644 --- a/README.md +++ b/README.md @@ -12,6 +12,9 @@ Take a look at the list below for links to all the separate samples. 
| ------------- | ------------- | ------------- | | Edge Streaming Aggregation | This sample creates two types of streaming aggregations: sliding (streaming time window - aggregation calculated in every incoming event) and jumping (database like time bucket - aggregation calculated end of every time bucket). The sliding aggregations can be sent to the Streaming Service rule engine | [streaming-aggregation](https://github.com/SAP/iot-edge-services-samples/tree/master/streaming-aggregation) | | Edge Persistence Aggregation | This sample demonstrates use of the Persistence Service Java API. This sample queries the Persistence Service at an interval. The query aggregates data stored in the Persistence Service, and feeds this data back into Edge Services. | [persistence-aggregation-max-temp](https://github.com/SAP/iot-edge-services-samples/tree/master/persistence-aggregation-max-temp) | +| Edge Machine Learning | This sample demonstrates how a quality machine learning solution can be deployed on the SAP Edge Services platform, with an example of defective welding detection. | [edge-ml-welding-sound](https://github.com/SAP/iot-edge-services-samples/tree/master/edge-ml-welding-sound) | +| Edge Predictive Analytics 1 | This sample demonstrates how to implement a Predictive Analytics Service. The service uses the Persistence Service Java APIs to get the data, and the Edge Service Configuration object to support dynamic configurations. It uses an external (not provided) JPMML library to compute the prediction with the provided PMML model. | [predictive-pmml](https://github.com/SAP/iot-edge-services-samples/tree/master/predictive-pmml) | +| Edge Predictive Analytics 2 | This sample demonstrates how to implement a Predictive Analytics Service. The service uses the Persistence Service Java APIs to get the data, and the Edge Service Configuration object to support dynamic configurations. 
It uses an external Python module to compute the prediction with an existing Python model (optionally trained in the cloud). | [predictive-python](https://github.com/SAP/iot-edge-services-samples/tree/master/predictive-python) | ## How to obtain support @@ -29,6 +32,4 @@ Product Documentation for SAP Edge Services is available as follows: ## Copyright and License -Copyright (c) 2018 SAP SE or an SAP affiliate company. All rights reserved. - -License provided by [SAP SAMPLE CODE LICENSE AGREEMENT](https://github.com/SAP/iot-edge-services-samples/tree/master/LICENSE) +Copyright (c) 2018 SAP SE or an SAP affiliate company. All rights reserved. This file is licensed under the Apache Software License, version 2.0 except as noted otherwise in the [License](LICENSE) file. diff --git a/edge-ml-welding-sound/.gitignore b/edge-ml-welding-sound/.gitignore new file mode 100644 index 0000000..5509140 --- /dev/null +++ b/edge-ml-welding-sound/.gitignore @@ -0,0 +1 @@ +*.DS_Store diff --git a/edge-ml-welding-sound/LICENSE b/edge-ml-welding-sound/LICENSE new file mode 100644 index 0000000..8973bf7 --- /dev/null +++ b/edge-ml-welding-sound/LICENSE @@ -0,0 +1,41 @@ +SAP SAMPLE CODE LICENSE AGREEMENT +
+Please scroll down and read the following SAP Sample Code License Agreement carefully ("Agreement"). By downloading, installing, or otherwise using the SAP sample code or any materials that accompany the sample code documentation (collectively, the "Sample Code"), You agree that this Agreement forms a legally binding agreement between You ("You" or "Your") and SAP SE, for and on behalf of itself and its subsidiaries and affiliates (as defined in Section 15 of the German Stock Corporation Act), and You agree to be bound by all of the terms and conditions stated in this Agreement. 
If You are trying to access or download the Sample Code on behalf of Your employer or as a consultant or agent of a third party (either "Your Company"), You represent and warrant that You have the authority to act on behalf of and bind Your Company to the terms of this Agreement and everywhere in this Agreement that refers to 'You' or 'Your' shall also include Your Company. If You do not agree to these terms, do not attempt to access or use the Sample Code. + +1. LICENSE: Subject to the terms of this Agreement, SAP grants You a non-exclusive, non-transferable, non-sublicensable, revocable, royalty-free, limited license to use, copy, and modify the Sample Code solely for Your internal business purposes. + +2. RESTRICTIONS: You must not use the Sample Code to: (a) impair, degrade or reduce the performance or security of any SAP products, services or related technology (collectively, "SAP Products"); (b) enable the bypassing or circumventing of SAP's license restrictions and/or provide users with access to the SAP Products to which such users are not licensed; or (c) permit mass data extraction from an SAP Product to a non-SAP Product, including use, modification, saving or other processing of such data in the non-SAP Product. Further, You must not: (i) provide or make the Sample Code available to any third party other than your authorized employees, contractors and agents (collectively, “Representatives”) and solely to be used by Your Representatives for Your own internal business purposes; ii) remove or modify any marks or proprietary notices from the Sample Code; iii) assign this Agreement, or any interest therein, to any third party; (iv) use any SAP name, trademark or logo without the prior written authorization of SAP; or (v) use the Sample Code to modify an SAP Product or decompile, disassemble or reverse engineer an SAP Product (except to the extent permitted by applicable law). 
You are responsible for any breach of the terms of this Agreement by You or Your Representatives. + +3. INTELLECTUAL PROPERTY: SAP or its licensors retain all ownership and intellectual property rights in and to the Sample Code and SAP Products. In exchange for the right to use, copy and modify the Sample Code provided under this Agreement, You covenant not to assert any intellectual property rights in or to any of Your products, services, or related technology that are based on or incorporate the Sample Code against any individual or entity in respect of any current or future SAP Products. + +4. SAP AND THIRD PARTY APIS: The Sample Code may include API (application programming interface) calls to SAP and third-party products or services. The access or use of the third-party products and services to which the API calls are directed may be subject to additional terms and conditions between you and SAP or such third parties. You (and not SAP) are solely responsible for understanding and complying with any additional terms and conditions that apply to the access or use of those APIs and/or third-party products and services. SAP does not grant You any rights in or to these APIs, products or services under this Agreement. + +5. FREE AND OPEN SOURCE COMPONENTS: The Sample Code may include third party free or open source components ("FOSS Components"). You may have additional rights in such FOSS Components that are provided by the third party licensors of those components. + +6. THIRD PARTY DEPENDENCIES: The Sample Code may require third party software dependencies ("Dependencies") for the use or operation of the Sample Code. These Dependencies may be identified by SAP in Maven POM files, documentation or by other means. SAP does not grant You any rights in or to such Dependencies under this Agreement. You are solely responsible for the acquisition, installation and use of such Dependencies. + +7. 
WARRANTY: +a) If You are located outside the US or Canada: AS THE SAMPLE CODE IS PROVIDED TO YOU FREE OF CHARGE, SAP DOES NOT GUARANTEE OR WARRANT ANY FEATURES OR QUALITIES OF THE SAMPLE CODE OR GIVE ANY UNDERTAKING WITH REGARD TO ANY OTHER QUALITY. NO SUCH WARRANTY OR UNDERTAKING SHALL BE IMPLIED BY YOU FROM ANY DESCRIPTION IN THE SAMPLE CODE OR ANY OTHER MATERIALS, COMMUNICATION OR ADVERTISEMENT. IN PARTICULAR, SAP DOES NOT WARRANT THAT THE SAMPLE CODE WILL BE AVAILABLE UNINTERRUPTED, ERROR FREE, OR PERMANENTLY AVAILABLE. ALL WARRANTY CLAIMS RESPECTING THE SAMPLE CODE ARE SUBJECT TO THE LIMITATION OF LIABILITY STIPULATED IN SECTION 8 BELOW. +b) If You are located in the US or Canada: THE SAMPLE CODE IS LICENSED TO YOU "AS IS", WITHOUT ANY WARRANTY, ESCROW, TRAINING, MAINTENANCE, OR SERVICE OBLIGATIONS WHATSOEVER ON THE PART OF SAP. SAP MAKES NO EXPRESS OR IMPLIED WARRANTIES OR CONDITIONS OF SALE OF ANY TYPE WHATSOEVER, INCLUDING BUT NOT LIMITED TO IMPLIED WARRANTIES OF MERCHANTABILITY AND OF FITNESS FOR A PARTICULAR PURPOSE. IN PARTICULAR, SAP DOES NOT WARRANT THAT THE SAMPLE CODE WILL BE AVAILABLE UNINTERRUPTED, ERROR FREE, OR PERMANENTLY AVAILABLE. YOU ASSUME ALL RISKS ASSOCIATED WITH THE USE OF THE SAMPLE CODE, INCLUDING WITHOUT LIMITATION RISKS RELATING TO QUALITY, AVAILABILITY, PERFORMANCE, DATA LOSS, AND UTILITY IN A PRODUCTION ENVIRONMENT. +c) For all locations: SAP DOES NOT MAKE ANY REPRESENTATIONS OR WARRANTIES IN RESPECT OF THIRD PARTY DEPENDENCIES, APIS, PRODUCTS AND SERVICES, INCLUDING BUT NOT LIMITED TO IMPLIED WARRANTIES OF MERCHANTABILITY AND OF FITNESS FOR A PARTICULAR PURPOSE. IN PARTICULAR, SAP DOES NOT WARRANT THAT THIRD-PARTY DEPENDENCIES, APIS, PRODUCTS AND SERVICES WILL BE AVAILABLE, ERROR FREE, INTEROPERABLE WITH THE SAMPLE CODE, SUITABLE FOR ANY PARTICULAR PURPOSE OR NON-INFRINGING. 
YOU ASSUME ALL RISKS ASSOCIATED WITH THE USE OF THIRD PARTY DEPENDENCIES, APIS, PRODUCTS AND SERVICES, INCLUDING WITHOUT LIMITATION RISKS RELATING TO QUALITY, AVAILABILITY, PERFORMANCE, DATA LOSS, UTILITY IN A PRODUCTION ENVIRONMENT, AND NON-INFRINGEMENT. IN NO EVENT WILL SAP BE LIABLE DIRECTLY OR INDIRECTLY IN RESPECT OF ANY USE OF THIRD PARTY DEPENDENCIES, APIS, PRODUCTS AND SERVICES BY YOU. + +8. LIMITATION OF LIABILITY: +a) If You are located outside the US or Canada: IRRESPECTIVE OF THE LEGAL REASONS, SAP SHALL ONLY BE LIABLE FOR DAMAGES UNDER THIS AGREEMENT IF SUCH DAMAGE (I) CAN BE CLAIMED UNDER THE GERMAN PRODUCT LIABILITY ACT OR (II) IS CAUSED BY INTENTIONAL MISCONDUCT OF SAP OR (III) CONSISTS OF PERSONAL INJURY. IN ALL OTHER CASES, NEITHER SAP NOR ITS EMPLOYEES, AGENTS AND SUBCONTRACTORS SHALL BE LIABLE FOR ANY KIND OF DAMAGE OR CLAIMS HEREUNDER. +b) If You are located in the US or Canada: IN NO EVENT SHALL SAP BE LIABLE TO YOU, YOUR COMPANY OR TO ANY THIRD PARTY FOR ANY DAMAGES IN AN AMOUNT IN EXCESS OF $100 ARISING IN CONNECTION WITH YOUR USE OF OR INABILITY TO USE THE SAMPLE CODE OR IN CONNECTION WITH SAP'S PROVISION OF OR FAILURE TO PROVIDE SERVICES PERTAINING TO THE SAMPLE CODE, OR AS A RESULT OF ANY DEFECT IN THE SAMPLE CODE. THIS DISCLAIMER OF LIABILITY SHALL APPLY REGARDLESS OF THE FORM OF ACTION THAT MAY BE BROUGHT AGAINST SAP, WHETHER IN CONTRACT OR TORT, INCLUDING WITHOUT LIMITATION ANY ACTION FOR NEGLIGENCE. YOUR SOLE REMEDY IN THE EVENT OF BREACH OF THIS AGREEMENT BY SAP OR FOR ANY OTHER CLAIM RELATED TO THE SAMPLE CODE SHALL BE TERMINATION OF THIS AGREEMENT. NOTWITHSTANDING ANYTHING TO THE CONTRARY HEREIN, UNDER NO CIRCUMSTANCES SHALL SAP OR ITS LICENSORS BE LIABLE TO YOU OR ANY OTHER PERSON OR ENTITY FOR ANY SPECIAL, INCIDENTAL, CONSEQUENTIAL, OR INDIRECT DAMAGES, LOSS OF GOOD WILL OR BUSINESS PROFITS, WORK STOPPAGE, DATA LOSS, COMPUTER FAILURE OR MALFUNCTION, ANY AND ALL OTHER COMMERCIAL DAMAGES OR LOSS, OR EXEMPLARY OR PUNITIVE DAMAGES. 
+ +9. INDEMNITY: You will fully indemnify, hold harmless and defend SAP against law suits based on any claim: (a) that any of Your products, services or related technology that are based on or incorporate the Sample Code infringes or misappropriates any patent, copyright, trademark, trade secrets, or other proprietary rights of a third party, or (b) related to Your alleged violation of the terms of this Agreement. + +10. EXPORT: The Sample Code is subject to German, EU and US export control regulations. You confirm that: a) You will not use the Sample Code for, and will not allow the Sample Code to be used for, any purposes prohibited by German, EU and US law, including, without limitation, for the development, design, manufacture or production of nuclear, chemical or biological weapons of mass destruction; b) You are not located in Cuba, Iran, Sudan, Iraq, North Korea, Syria, nor any other country to which the United States has prohibited export or that has been designated by the U.S. Government as a "terrorist supporting" country (any, an "US Embargoed Country"); c) You are not a citizen, national or resident of, and are not under the control of, a US Embargoed Country; d) You will not download or otherwise export or re-export the Sample Code, directly or indirectly, to a US Embargoed Country nor to citizens, nationals or residents of a US Embargoed Country; e) You are not listed on the United States Department of Treasury lists of Specially Designated Nationals, Specially Designated Terrorists, and Specially Designated Narcotic Traffickers, nor listed on the United States Department of Commerce Table of Denial Orders or any other U.S. government list of prohibited or restricted parties and f) You will not download or otherwise export or re-export the Sample Code, directly or indirectly, to persons on the above-mentioned lists. + +11. SUPPORT: SAP does not offer support for the Sample Code. + +12. 
TERM AND TERMINATION: You may terminate this Agreement by destroying all copies of the Sample Code in Your possession or control. SAP may terminate Your license to use the Sample Code immediately if You fail to comply with any of the terms of this Agreement, or, for SAP's convenience by providing you with ten (10) days written notice of termination. In case of termination or expiration of this Agreement, You must immediately destroy all copies of the Sample Code in your possession or control. In the event Your Company is acquired (by merger, purchase of stock, assets or intellectual property or exclusive license), or You become employed, by a direct competitor of SAP, then this Agreement and all licenses granted to You in this Agreement shall immediately terminate upon the date of such acquisition or change of employment. + +13. LAW/VENUE: +a) If You are located outside the US or Canada: This Agreement is governed by and construed in accordance with the laws of Germany without reference to its conflicts of law principles. You and SAP agree to submit to the exclusive jurisdiction of, and venue in, the courts located in Karlsruhe, Germany in any dispute arising out of or relating to this Agreement or the Sample Code. The United Nations Convention on Contracts for the International Sale of Goods shall not apply to this Agreement. +b) If You are located in the US or Canada: This Agreement shall be governed by and construed in accordance with the laws of the State of New York, USA without reference to its conflicts of law principles. You and SAP agree to submit to the exclusive jurisdiction of, and venue in, the courts located in New York, New York, USA in any dispute arising out of or relating to this Agreement or the Sample Code. The United Nations Convention on Contracts for the International Sale of Goods shall not apply to this Agreement. + +14. MISCELLANEOUS: This Agreement is the complete agreement between the parties respecting the Sample Code. 
This Agreement supersedes all prior or contemporaneous agreements or representations with regards to the Sample Code. If any term of this Agreement is found to be invalid or unenforceable, the surviving provisions shall remain effective. SAP's failure to enforce any right or provisions stipulated in this Agreement will not constitute a waiver of such provision, or any other provision of this Agreement. + + +v1.0-071618 diff --git a/edge-ml-welding-sound/NOTICE b/edge-ml-welding-sound/NOTICE new file mode 100644 index 0000000..2c848d0 --- /dev/null +++ b/edge-ml-welding-sound/NOTICE @@ -0,0 +1 @@ +Copyright (c) 2019 SAP SE or an SAP affiliate company. All rights reserved. diff --git a/edge-ml-welding-sound/README.md b/edge-ml-welding-sound/README.md new file mode 100644 index 0000000..b073f50 --- /dev/null +++ b/edge-ml-welding-sound/README.md @@ -0,0 +1,337 @@ +- [Machine learning on SAP Edge Services Platform](#machine-learning-on-sap-edge-services-platform) + - [Overview](#overview) + - [Product Documentation](#product-documentation) + - [Use case](#use-case) + - [Training a model: the working](#training-a-model-the-working) + - [System topology](#system-topology) + - [Pre-requisites](#pre-requisites) + - [SAP Data Intelligence](#sap-data-intelligence) + - [Deploying custom Machine learning docker image](#deploying-custom-machine-learning-docker-image) + - [Deploying Machine learning training pipeline](#deploying-machine-learning-training-pipeline) + - [Training a machine learning model](#training-a-machine-learning-model) + - [SAP Edge Services](#sap-edge-services) + - [Overview](#overview-1) + - [Requirements](#requirements) + - [OSGi Bundle Instructions](#osgi-bundle-instructions) + - [Building the Project](#building-the-project) + - [Installation](#installation) + - [After Deployment](#after-deployment) + - [Directory Structure](#directory-structure) + - [Edge ML Daemon (Python Script)](#edge-ml-daemon-python-script) + - [Limitations and possible improvements 
to the sample](#limitations-and-possible-improvements-to-the-sample) + - [Possible next version of the sample](#possible-next-version-of-the-sample) + - [A few words on effective training of the model](#a-few-words-on-effective-training-of-the-model) + - [How to obtain support](#how-to-obtain-support) + - [Copyright and License](#copyright-and-license) + +# Machine learning on SAP Edge Services Platform + +## Overview +This sample demonstrates one way a quality machine learning solution can be deployed on the SAP Edge Services platform, which typically sits in sites without reliable connectivity to the internet. This brings a set of challenges, for example: +* Executing machine learning inferencing in real time when a suitable endpoint in the cloud is not reachable +* Leveraging powerful machine learning capabilities in a remote location with full training-deployment life cycle support +* Streaming all of the sound data into the cloud would be expensive even if connectivity were available + +Machine learning training is often a data-intensive operation requiring scalable compute and access to data, e.g. from SAP systems, telemetry and external sources. These tasks are difficult to manage economically in remote sites. Furthermore, the system must keep running over extended periods of time in an economical manner. + +We still want the best of both worlds, i.e. real-time inferencing in remote sites and a cloud-trained machine learning model built at scale using best-of-breed technologies. This sample should be seen in that context. + +## Product Documentation + +Product documentation for SAP Edge Services is available as follows: + +[SAP Edge Services, cloud edition](https://help.sap.com/viewer/p/EDGE_SERVICES) + +[SAP Edge Services, on-premise edition](https://help.sap.com/viewer/p/SAP_EDGE_SERVICES_OP) + +## Use case +The sample covers a use case of detecting defective welding jobs at a customer site in real time.
+ +Studies have shown that welding sounds can indicate defects in welding. Therefore sound samples will be used to train a machine learning model, leading to a non-intrusive detection system. +Also, as part of post-processing, a service ticket will be created for the technicians in response to defective welding events. + + +![Overview](doc/overview.png) + +From an algorithmic point of view, the problem statement in brief is: given a sound sample in a computer-readable format, such as a .wav file, determine whether or not it represents defective welding. + + +## Training a model: the working +It is assumed audio samples of welding are available, normal and defective, organized in labelled folders like so: + +![](doc/training_data.png) + +It turns out the normal and faulty welds differ in the way they sound: the defective welding sound is unstable with numerous spikes. + +First, a few words on digitized sound. Sound files in .wav format are digitized by sampling at a certain rate, say 44 kHz. Each sound sample represents the amplitude of the sound wave at a point in time, and the range of sound levels captured depends on the number of bits used. For example, a 16-bit sound sample can capture 65536 levels of detail at each instant. + +That is an amplitude perspective, which is not particularly helpful for locating inherent patterns. A better way to represent sound for that purpose is by describing the frequencies contained in the sound over time. Spectrograms are such representations, and MFCC spectrograms in particular make patterns more distinct by applying a suitable vertical scaling. You can see that in the picture below. +Therefore MFCC spectrograms will be used in the sample for training a machine learning model.
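The amplitude-vs-spectrogram idea above can be illustrated in a few lines. The following sketch is not part of the sample's shipped code: it builds a synthetic 16-bit, 44.1 kHz tone (65536 amplitude levels, as described above) and computes a plain magnitude spectrogram with NumPy only. The actual sample uses librosa's MFCC features, which add a mel-frequency filter bank and cosine transform on top of such a spectrogram; all names and parameters here are illustrative.

```python
# Illustration only: a digitized sound is a series of amplitudes;
# a spectrogram describes its frequency content over time.
import numpy as np

SR = 44100  # sampling rate, 44 kHz as in the text

# One second of a synthetic 1 kHz tone, quantized to 16 bits:
# 2**16 = 65536 amplitude levels per instant.
t = np.arange(SR) / SR
signal = np.int16(np.sin(2 * np.pi * 1000 * t) * 32767)
assert signal.min() >= -32768 and signal.max() <= 32767

def spectrogram(x, frame=1024):
    """Magnitude spectrum per frame: a (time, frequency) matrix."""
    frames = x[: len(x) // frame * frame].reshape(-1, frame)
    return np.abs(np.fft.rfft(frames, axis=1))

spec = spectrogram(signal)
# The dominant frequency bin in each frame sits near 1 kHz,
# since bin k corresponds to roughly k * SR / frame Hz.
peak_bin = spec.mean(axis=0).argmax()
```

A CNN then treats such a (time, frequency) matrix like an image, which is why the spectrogram representation is the natural input for the model described below.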
+ +![](doc/sound_visualization.png) + + +Essentially, our training pipeline will +* Read sound files from the training archive +* Break them into one-second splits +* Extract an MFCC spectrogram for each split +* Train a convolutional neural network (CNN) model using the TensorFlow framework +* Export the trained model + +The choice of a convolutional neural network is based on the fact that CNNs are good at detecting patterns in spatial data (e.g. two-dimensional data: time vs. frequency vectors). + +The sample has been tested against real welding data for accuracy. Nonetheless, after customizing this sample to your own implementation, make sure to test thoroughly before deploying to production environments. + + +The sample consists of two parts: +* Training artifacts for SAP Data Intelligence in the cloud +* A custom machine learning service for SAP Edge Services, which uses a pre-trained machine learning model in edge sites to detect defective welding + + + +## System topology +The sample relies on the pipelining capabilities of SAP Data Intelligence, which also +* allows providing a custom runtime for machine learning as Docker images +* provides connectivity to various systems: hyperscalers, SAP systems, etc. + +For the purpose of this demo we will use Amazon S3 buckets as a staging area for exchanging training data and machine learning models between SAP Data Intelligence and SAP Edge Services. + +After a model is trained by SAP Data Intelligence, the freshly trained model is downloaded automatically and periodically by SAP Edge Services. + +So let's dive into the details of the artifacts and code needed to deploy the sample in your SAP Data Intelligence and SAP Edge Services environments. + +The sections below describe how to build, deploy and run this sample.
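The "break them into one-second splits" step of the pipeline can be sketched as follows. The actual pipeline operator uses pydub's `make_chunks`; this dependency-free version using only the standard-library `wave` module shows the same idea, with file names chosen for illustration.

```python
# Sketch of the pipeline's one-second splitting step (standard library only).
import wave

def split_wav(path, out_prefix, seconds=1):
    """Write consecutive one-second clips of `path`; return their file names."""
    names = []
    with wave.open(path, "rb") as src:
        frames_per_clip = src.getframerate() * seconds
        params = src.getparams()  # channels, sample width, rate, etc.
        index = 0
        while True:
            frames = src.readframes(frames_per_clip)
            if not frames:
                break
            name = f"{out_prefix}_{index:04d}.wav"
            with wave.open(name, "wb") as dst:
                dst.setparams(params)   # keep the source audio format
                dst.writeframes(frames)  # header frame count fixed on close
            names.append(name)
            index += 1
    return names
```

Each resulting clip becomes one training (or inference) example, so a single recording of a weld yields many labelled samples for the CNN.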
+ +## Pre-requisites +An Amazon S3 bucket named 'edgepoc' and a user with read/write credentials (note the S3 access key/secret, you will need those), with two folders 'data' and 'train' for storing the training data and trained model respectively. + + +![](doc/s3_bucket.png) + + +## SAP Data Intelligence +There are primarily two artifacts to be deployed: +* A custom machine learning docker image containing the audio and ML packages in which our model will be trained +* The machine learning pipeline + +In order to deploy the above pipeline artifacts, open the SAP Data Intelligence pipeline modeler. + +![](doc/di_pipeline.png) + + +### Deploying custom Machine learning docker image +This is a docker image with +* Python 3 +* Keras/TensorFlow +* Audio processing libraries (librosa) + +Although you will need the above components, this is only an example image; you can build your own. The important part is using the correct set of tags when creating the docker image in SAP Data Intelligence so that it is recognized by vFlow operators. Here are the steps: +* Choose the 'Repository' tab in the SAP Data Intelligence pipeline modeler and open the 'dockerfiles' folder +* Click the down arrow next to the '+' button on the top right of the panel as shown below and choose 'Create Docker File' + +![](doc/create_docker1.png) + + +* Copy and paste the content of [this](src/di/vflow/dockerfiles/edge2/Dockerfile) Docker file into the middle panel and configure it after opening the properties panel on the right: add tags as shown below (use the '+' button under tags), save and give it a name, e.g. 'edge2' + +![](doc/create_docker2.png) + +* Choose the right arrow button ![](doc/create_docker3_build.png) to the left of the Save button in the toolbar located in the top right corner to build the docker image.
This should take a few minutes, after which your Docker image is ready. + + +### Deploying Machine learning training pipeline +This is a pipeline consisting of 4 operators: Training data importer, Splitter, Trainer, Model Exporter. Here are the steps to deploy it: + +* Copy the contents of [graph.json](src/di/vflow/graphs/welding_audio_trainer/graph.json) to the clipboard +* Create a graph in the pipeline modeler by selecting the 'Graphs' panel on the left and choosing the '+' button on the same panel. Now go to the JSON tab in the content panel and paste the clipboard contents +* Enter the S3 credentials in the Data Importer and Model Exporter source code: search/replace ''/'' with your AWS access key and secret respectively. You can do this directly in the JSON +* Switch back to the 'Diagram' perspective (see the tab label near the top right corner of the content panel) +* Click the pipeline properties icon to open the properties panel, click the blue area of the pipeline in the modeler, and select the name of the custom ML docker image that you want to use for this pipeline, e.g. the 'edgeml2' image you just created + +![](doc/create_pipeline4_docker.png) + +* Hit save and give it a name + +Your pipeline is now ready. + +### Training a machine learning model +* Compress the training folder, give it a name (e.g. train.zip) and upload it to the 'data' folder of the S3 bucket you just created + ![](doc/upload_training_data1_zip.png) +* Open the pipeline modeler and in the DataImport operator property 'dataset', enter the name of the compressed archive you just uploaded, e.g. data/train.zip.
Also enter the name of the model to be exported in the ModelExport operator as 'audio_detectx' ('model' property) +* Start the pipeline and monitor the various terminals to observe the activities in the running operators as they occur + +The training pipeline will automatically upload the trained model to the S3 bucket you set up after the training is finished. + +## SAP Edge Services + +### Overview + +The current implementation of this sample is built as a \*.JAR file that can be run at the edge inside of an OSGi runtime, such as the SAP IoT Service Edge Platform. This bundle can be managed and deployed to a running instance of the SAP IoT Edge Platform on the edge machine by SAP Edge Services' Policy Service. + +### Requirements +Doc: https://help.sap.com/viewer/a1c5f93025864b6f9a867a12caf6dd06/1911/en-US/13c486e5502f46d8a482aeeb450ea932.html +Prerequisites: +- [JDK](https://www.oracle.com/technetwork/java/jdk8-downloads-2133151.html) 8 64-bit +- [Python 3.*](https://www.python.org/downloads/) + - <**! TODO:** *Insert list of required Python packages*> +- [Maven 3.6](https://maven.apache.org/download.cgi) +- [Git](https://git-scm.com/downloads) + +### OSGi Bundle Instructions + +The current implementation of the Edge Machine Learning proof of concept (PoC) is built as a \*.JAR file that can be run at the edge inside of an OSGi runtime. This bundle can be managed and deployed by SAP Edge Services' Policy Service. + +### Building the Project + +1. Clone the project. +2.
Install the Python modules and sound processing packages needed for inferencing +``` shell + pip install boto3 + pip install librosa + pip install matplotlib + pip install tensorflow==1.9 numpy==1.16.4 + pip install keras==2.2.0 + pip install pydub + pip install libmagic + apt-get update && apt-get install -y libsndfile1 + apt-get update && apt-get install -y ffmpeg python3-magic + git clone https://github.com/tyiannak/pyAudioAnalysis.git + pip install -e pyAudioAnalysis/ +``` + +3. Tweak the inferencing script located at [src/edge/src/main/resources/bin/](src/edge/src/main/resources/bin/edge_ml_daemon.py) as necessary + + * Please replace ''/'' with your AWS access key and secret respectively + * Review the function "send_data_to_gateway" and adjust the endpoints and other constants as necessary: +``` +device_id = "MLDevice01" +sensor_id = "ML_Test_Sensor" +sensor_type_alternate_id = "67" +capability_alternate_id = "inf01" +``` + + +4. Navigate to the [`/src/edge/`](src/edge) directory. Build using Maven from the command line: + ``` + $ mvn clean package + ``` + +5. You should see a message like this and you should see an `EdgeML-1.0.0.jar` file in the target subdirectory: + ``` + [INFO] ------------------------------------------------------------------------ + [INFO] BUILD SUCCESS + [INFO] ------------------------------------------------------------------------ + [INFO] Total time: 5.828 s + [INFO] Finished at: 2019-10-10T16:19:05-04:00 + [INFO] ------------------------------------------------------------------------ + ``` + +### Installation +Prerequisites: +- Set up an SAP IoT Service Edge Platform on the target machine. For instructions, please refer to the following JAM page: https://jam4.sapjam.com/wiki/show/XaRvWNbDMggAVtAdhpQ1qp +- A built `EdgeML-.jar` file +- The Edge Services Streaming Service is deployed to and running on the Edge Platform + + +1. Log into SAP Edge Services Policy Service. + +2.
Click on the **Settings** tile. In the **Settings** tile: + - Set the SAP IoT Service credentials to match the IoT Service instance that your edge platform is registered with (please refer to the above JAM link if you have not done so already!). + - Ensure that the "*Allow upload of Custom Services (OSGi bundles) to my IoT Edge Platforms*" option is checked. + - Save your settings. + +3. Return to the launchpad. + +4. Click on the **Edge Services Management** tile. In the sandwich menu (left-hand side navigation), click on **Services**. In the **Services** menu, click on the '+' button to upload a new Custom Service. + +5. Enter the following: + + | Field | Value | + |---------------------|----------------------------------------| + | Name | EdgeML | + | Configuration Topic | MLPOC | + | File Name | *(Upload your EdgeML-1.0.0.jar file)* | + | Service Type | Custom | + + Click **OK**. + +6. In the sandwich menu (left-hand side navigation), click on **Groups and Gateways**. Find your edge platform in the list. Select it. Click on the **Services** tab of the gateway. Click '+' to deploy a service to your gateway. Select the **EdgeML** service. This will deploy the OSGi bundle to your gateway. + +### After Deployment + +This section describes the behavior of the EdgeML service at the edge after deployment. + +#### Directory Structure + +The EdgeML custom service will set up its own directory structure under the `../edgeservices/` directory, at the installation location of the SAP IoT Edge Platform. + +``` ++ edgeservices/ + + ml-poc/ + + bin/ + + image/ + + input/ + + processed/ + + logs/ + + output/ + + model/ + + split/ + + processed/ +``` + +* The `bin/` directory contains the Python script for the Edge ML Daemon. +* The `input/` directory is the location in which the input sound files should be placed in order to be picked up by the Edge ML Daemon, once it is running (see next section).
+* The `split/` directory will contain the sound files after pre-processing, on which the Edge ML Daemon will perform inference. +* The `output/` directory will contain the output of the inference. The inference results will also be output to the Edge Platform console. +* The `model/` directory contains the latest model downloaded from the S3 bucket. + +#### Edge ML Daemon (Python Script) + +The Edge ML Daemon Python script handles the necessary pre-processing steps for the current version of the sample. It can be found and modified in this repository here: [/src/edge/src/main/resources/bin/](src/edge/src/main/resources/bin/). + +This script is packaged with the EdgeML bundle and performs the necessary pre-processing steps of the sound files, polling the `input/` directory for input. It also updates the model at a set interval, performs the splitting of sound files into one-second clips, and outputs the inferences to standard output and the `output/` directory. + + +The Edge ML Daemon's life-cycle is managed by the EdgeML OSGi bundle. The daemon is started by entering the following command on the gateway: +``` + g! startEdgeDaemon +``` + +The Edge ML Daemon can be stopped (when running) with the following command on the gateway: +``` + g! stopEdgeDaemon +``` + +## Limitations and possible improvements to the sample +There are some limitations in the sample: +* A number of internal variables in the pipeline and the inferencing scripts could be turned into configurations for productive use. In particular, credentials must be secured. +* The inferencing script can be improved to dynamically map welding sounds to specific welding machines and assets (serial number, model, etc.) +* Others + +## Possible next version of the sample +* Demonstrate scalability aspects of SAP Data Intelligence to handle large training workloads, also introduce parallel processing for inferencing +* Provide additional configuration and tuning possibilities in the training pipeline and operators, e.g.
credentials, audio split window size +* Ability to export the inferencing script from SAP Data Intelligence +* Although the sample scripts include InceptionV3 model inferencing, that topic requires a detailed treatment at another time + + +## A few words on effective training of the model +The model was tested and performed well on the test data samples. +Although in principle any welding malfunction can be detected by exhaustive training with a large number of samples, a few simple tricks can improve the effectiveness of the system in a practical manner without an expensive undertaking: +* Make sure all known defective conditions are covered in the training. +* Pay particular attention to any mispredictions on an ongoing basis and retrain the model based on those samples with corrected labels. This should lead to continuous improvement of the accuracy over time. +* A "moving window" inferencing approach could be used to improve the accuracy of the model and eliminate "noise" in predictions: for example, make a final inference based on an aggregate of the individual inferences for the last N audio segments. + +With SAP Data Intelligence it is possible to automate a lot of such activities, but that is a topic for another time. + + +## How to obtain support +These samples are provided on an "as-is" basis, with detailed documentation on how to use them. + +## Copyright and License +Copyright (c) 2019 SAP SE or an SAP affiliate company. All rights reserved.
+ +License provided by [SAP SAMPLE CODE LICENSE AGREEMENT](./LICENSE) diff --git a/edge-ml-welding-sound/doc/create_docker1.png b/edge-ml-welding-sound/doc/create_docker1.png new file mode 100644 index 0000000..c701f53 Binary files /dev/null and b/edge-ml-welding-sound/doc/create_docker1.png differ diff --git a/edge-ml-welding-sound/doc/create_docker2.png b/edge-ml-welding-sound/doc/create_docker2.png new file mode 100644 index 0000000..f182a7f Binary files /dev/null and b/edge-ml-welding-sound/doc/create_docker2.png differ diff --git a/edge-ml-welding-sound/doc/create_docker3_build.png b/edge-ml-welding-sound/doc/create_docker3_build.png new file mode 100644 index 0000000..b903d75 Binary files /dev/null and b/edge-ml-welding-sound/doc/create_docker3_build.png differ diff --git a/edge-ml-welding-sound/doc/create_pipeline1.png b/edge-ml-welding-sound/doc/create_pipeline1.png new file mode 100644 index 0000000..7b6aa80 Binary files /dev/null and b/edge-ml-welding-sound/doc/create_pipeline1.png differ diff --git a/edge-ml-welding-sound/doc/create_pipeline2.png b/edge-ml-welding-sound/doc/create_pipeline2.png new file mode 100644 index 0000000..7131450 Binary files /dev/null and b/edge-ml-welding-sound/doc/create_pipeline2.png differ diff --git a/edge-ml-welding-sound/doc/create_pipeline3_configure.png b/edge-ml-welding-sound/doc/create_pipeline3_configure.png new file mode 100644 index 0000000..72c3774 Binary files /dev/null and b/edge-ml-welding-sound/doc/create_pipeline3_configure.png differ diff --git a/edge-ml-welding-sound/doc/create_pipeline3_docker.png b/edge-ml-welding-sound/doc/create_pipeline3_docker.png new file mode 100644 index 0000000..293c06c Binary files /dev/null and b/edge-ml-welding-sound/doc/create_pipeline3_docker.png differ diff --git a/edge-ml-welding-sound/doc/create_pipeline4_docker.png b/edge-ml-welding-sound/doc/create_pipeline4_docker.png new file mode 100644 index 0000000..41d10e2 Binary files /dev/null and 
b/edge-ml-welding-sound/doc/create_pipeline4_docker.png differ diff --git a/edge-ml-welding-sound/doc/di_dashboard.png b/edge-ml-welding-sound/doc/di_dashboard.png new file mode 100644 index 0000000..3e211d6 Binary files /dev/null and b/edge-ml-welding-sound/doc/di_dashboard.png differ diff --git a/edge-ml-welding-sound/doc/di_pipeline.png b/edge-ml-welding-sound/doc/di_pipeline.png new file mode 100644 index 0000000..6d27bac Binary files /dev/null and b/edge-ml-welding-sound/doc/di_pipeline.png differ diff --git a/edge-ml-welding-sound/doc/overview.png b/edge-ml-welding-sound/doc/overview.png new file mode 100644 index 0000000..8d4fee7 Binary files /dev/null and b/edge-ml-welding-sound/doc/overview.png differ diff --git a/edge-ml-welding-sound/doc/s3_bucket.png b/edge-ml-welding-sound/doc/s3_bucket.png new file mode 100644 index 0000000..54104fa Binary files /dev/null and b/edge-ml-welding-sound/doc/s3_bucket.png differ diff --git a/edge-ml-welding-sound/doc/sound_visualization.png b/edge-ml-welding-sound/doc/sound_visualization.png new file mode 100644 index 0000000..2c05c5f Binary files /dev/null and b/edge-ml-welding-sound/doc/sound_visualization.png differ diff --git a/edge-ml-welding-sound/doc/training_data.png b/edge-ml-welding-sound/doc/training_data.png new file mode 100644 index 0000000..07e758b Binary files /dev/null and b/edge-ml-welding-sound/doc/training_data.png differ diff --git a/edge-ml-welding-sound/doc/upload_training_data1_zip.png b/edge-ml-welding-sound/doc/upload_training_data1_zip.png new file mode 100644 index 0000000..400ccc9 Binary files /dev/null and b/edge-ml-welding-sound/doc/upload_training_data1_zip.png differ diff --git a/edge-ml-welding-sound/src/di/vflow/dockerfiles/edge2/Dockerfile b/edge-ml-welding-sound/src/di/vflow/dockerfiles/edge2/Dockerfile new file mode 100644 index 0000000..ecd2328 --- /dev/null +++ b/edge-ml-welding-sound/src/di/vflow/dockerfiles/edge2/Dockerfile @@ -0,0 +1,160 @@ +FROM buildpack-deps:buster + +# ensure 
local python is preferred over distribution python +ENV PATH /usr/local/bin:$PATH + +# http://bugs.python.org/issue19846 +# At the moment, setting "LANG=C" on a Linux system *fundamentally breaks Python 3*. +ENV LANG C.UTF-8 + +RUN apt-get update && apt-get install -y --no-install-recommends \ + tk-dev \ + && rm -rf /var/lib/apt/lists/* + +ENV GPG_KEY 0D96DF4D4110E5C43FBFB17F2D347EA6AA65421D +ENV PYTHON_VERSION 3.6.9 + +RUN set -ex \ + \ + && wget -O python.tar.xz "https://www.python.org/ftp/python/${PYTHON_VERSION%%[a-z]*}/Python-$PYTHON_VERSION.tar.xz" \ + && wget -O python.tar.xz.asc "https://www.python.org/ftp/python/${PYTHON_VERSION%%[a-z]*}/Python-$PYTHON_VERSION.tar.xz.asc" \ + && export GNUPGHOME="$(mktemp -d)" \ + && gpg --batch --keyserver ha.pool.sks-keyservers.net --recv-keys "$GPG_KEY" \ + && gpg --batch --verify python.tar.xz.asc python.tar.xz \ + && { command -v gpgconf > /dev/null && gpgconf --kill all || :; } \ + && rm -rf "$GNUPGHOME" python.tar.xz.asc \ + && mkdir -p /usr/src/python \ + && tar -xJC /usr/src/python --strip-components=1 -f python.tar.xz \ + && rm python.tar.xz \ + \ + && cd /usr/src/python \ + && gnuArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)" \ + && ./configure \ + --build="$gnuArch" \ + --enable-loadable-sqlite-extensions \ + --enable-optimizations \ + --enable-shared \ + --with-system-expat \ + --with-system-ffi \ + --without-ensurepip \ + && make -j "$(nproc)" \ +# setting PROFILE_TASK makes "--enable-optimizations" reasonable: https://bugs.python.org/issue36044 / https://github.com/docker-library/python/issues/160#issuecomment-509426916 + PROFILE_TASK='-m test.regrtest --pgo \ + test_array \ + test_base64 \ + test_binascii \ + test_binhex \ + test_binop \ + test_bytes \ + test_c_locale_coercion \ + test_class \ + test_cmath \ + test_codecs \ + test_compile \ + test_complex \ + test_csv \ + test_decimal \ + test_dict \ + test_float \ + test_fstring \ + test_hashlib \ + test_io \ + test_iter \ + test_json \ + test_long 
\ + test_math \ + test_memoryview \ + test_pickle \ + test_re \ + test_set \ + test_slice \ + test_struct \ + test_threading \ + test_time \ + test_traceback \ + test_unicode \ + ' \ + && make install \ + && ldconfig \ + \ + && find /usr/local -depth \ + \( \ + \( -type d -a \( -name test -o -name tests \) \) \ + -o \ + \( -type f -a \( -name '*.pyc' -o -name '*.pyo' \) \) \ + \) -exec rm -rf '{}' + \ + && rm -rf /usr/src/python \ + \ + && python3 --version + +# make some useful symlinks that are expected to exist +RUN cd /usr/local/bin \ + && ln -s idle3 idle \ + && ln -s pydoc3 pydoc \ + && ln -s python3 python \ + && ln -s python3-config python-config + +# if this is called "PIP_VERSION", pip explodes with "ValueError: invalid truth value ''" +ENV PYTHON_PIP_VERSION 19.2.3 +# https://github.com/pypa/get-pip +ENV PYTHON_GET_PIP_URL https://github.com/pypa/get-pip/raw/309a56c5fd94bd1134053a541cb4657a4e47e09d/get-pip.py +ENV PYTHON_GET_PIP_SHA256 57e3643ff19f018f8a00dfaa6b7e4620e3c1a7a2171fd218425366ec006b3bfe + +RUN set -ex; \ + \ + wget -O get-pip.py "$PYTHON_GET_PIP_URL"; \ + echo "$PYTHON_GET_PIP_SHA256 *get-pip.py" | sha256sum --check --strict -; \ + \ + python get-pip.py \ + --disable-pip-version-check \ + --no-cache-dir \ + "pip==$PYTHON_PIP_VERSION" \ + ; \ + pip --version; \ + \ + find /usr/local -depth \ + \( \ + \( -type d -a \( -name test -o -name tests \) \) \ + -o \ + \( -type f -a \( -name '*.pyc' -o -name '*.pyo' \) \) \ + \) -exec rm -rf '{}' +; \ + rm -f get-pip.py + + +################# +RUN python3.6 -m ensurepip + +RUN python3.6 -m pip install tornado==5.0.2 +RUN python3.6 -m pip install boto3 +RUN python3.6 -m pip install requests + +RUN python3.6 -m pip install kafka-python +RUN python3.6 -m pip install confluent-kafka + +RUN python3.6 -m pip install kafka-utils + +RUN python3.6 -m pip install pprint + +RUN python3.6 -m pip install networkx +RUN python3.6 -m pip install gunicorn + +RUN python3.6 -m pip install librosa +RUN python3.6 -m pip 
install pandas==0.24.2 +RUN python3.6 -m pip install pyhdb==0.3.4 + +RUN python3.6 -m pip install avro-python3 +RUN python3.6 -m pip install matplotlib +RUN python3.6 -m pip install pillow + +RUN python3.6 -m pip install tensorflow==1.9 numpy==1.16.4 + +RUN python3.6 -m pip install keras==2.2.0 +RUN python3.6 -m pip install pydub +RUN python3.6 -m pip install libmagic + +RUN apt-get update && apt-get install -y libsndfile1 +RUN apt-get update && apt-get install -y ffmpeg python3-magic + +RUN pip install scipy sklearn hmmlearn simplejson eyed3 +RUN git clone https://github.com/tyiannak/pyAudioAnalysis.git +RUN pip install -e pyAudioAnalysis/ diff --git a/edge-ml-welding-sound/src/di/vflow/dockerfiles/edge2/Tags.json b/edge-ml-welding-sound/src/di/vflow/dockerfiles/edge2/Tags.json new file mode 100644 index 0000000..1ec163a --- /dev/null +++ b/edge-ml-welding-sound/src/di/vflow/dockerfiles/edge2/Tags.json @@ -0,0 +1,6 @@ +{ + "edgeml2": "", + "opensuse": "", + "python36": "", + "tornado": "5.0.2" +} \ No newline at end of file diff --git a/edge-ml-welding-sound/src/di/vflow/graphs/welding_audio_trainer/graph.json b/edge-ml-welding-sound/src/di/vflow/graphs/welding_audio_trainer/graph.json new file mode 100644 index 0000000..8608765 --- /dev/null +++ b/edge-ml-welding-sound/src/di/vflow/graphs/welding_audio_trainer/graph.json @@ -0,0 +1,307 @@ +{ + "properties": {}, + "description": "Welding audio detection trainer", + "processes": { + "python3operator2": { + "component": "com.sap.system.python3Operator", + "metadata": { + "label": "DatasetImport", + "x": 24, + "y": 200, + "height": 80, + "width": 120, + "extensible": true, + "config": { + "script": "\nimport boto3\n\nimport fnmatch\nimport os\nimport os.path\nimport re\nimport time\nimport zipfile\nimport shutil\n \nfrom pydub import AudioSegment\nfrom pydub.utils import make_chunks\n\nretrain = True\nmodel_dir_name = \"/tmp/\"\nhome_dir = '/tmp/' \n\ndef out_debug(message):\n api.send(\"debug\", message)\n \n \ndef 
trigger_download():\n    dataset = api.config.dataset\n    out_debug( \"Volume contents {}\".format ( str(get_files(model_dir_name, ['*.*'])) ) )\n    out_debug( \"Downloading training dataset {}\".format(dataset) )\n    \n    out_debug(\"Establishing connection to AWS S3 ...\")\n    conn = get_aws_connection()\n    out_debug(\"AWS connection: Done, starting download\")\n    \n    try:\n        download_data(conn, 'edgepoc', dataset, home_dir)\n        \n        out_debug( \"Volume contents {}\".format ( str(get_files(model_dir_name, ['*.*'])) ) )\n        api.send(\"debug\", \"Downloading complete, exploding data\") \n    except Exception as e:\n        out_debug(\"Error downloading training data {}\".format(str(e)))\n    \n    zip_filename = dataset.split('/')[-1]\n    \n    try:\n        with zipfile.ZipFile(home_dir+zip_filename, 'r') as zip_ref:\n            zip_ref.extractall(home_dir)\n        \n        out_debug( \"Training data exploded\")\n        out_debug( \"Volume contents {}\".format ( str(get_files(model_dir_name, ['*.*'])) ) )\n        \n        api.send(\"output\", \"training_data_ready\")\n        \n        \n    except Exception as e:\n        api.send(\"debug\", str(e))\n        \n        out_debug(\"No training data found\")\n    \n    \ndef split_aut_files():\n\n    for fname in get_files(home_dir+'/input/good/'):\n        split_file(fname, home_dir+'split/good/')\n        api.send(\"debug\", \"Splitting /input/good/{}\".format(fname))\n    \n    for fname in get_files(home_dir+'/input/bad/'):\n        split_file(fname, home_dir+'split/bad/')\n        api.send(\"debug\", \"Splitting /input/bad/{}\".format(fname))\n    \n    for fname in get_files(home_dir+'/input/ignore/'):\n        split_file(fname, home_dir+'split/ignore/')\n        api.send(\"debug\", \"Splitting /input/ignore/{}\".format(fname))\n    \n    \n\ndef status():\n    for fname in get_files(home_dir+'/input/good/'):\n        api.send(\"debug\", str(fname))\n    \n    \n\napi.add_timer(\"36000s\", trigger_download)\n\n\n##### reusable modules\ndef get_files (d, includes = ['*.json', '*.h5', '*.mp3', '*.wav'] ):\n    excludes = ['/home/ram/doc'] # for dirs and files\n    # transform glob patterns to
regular expressions\n    includes = r'|'.join([fnmatch.translate(x) for x in includes])\n    excludes = r'|'.join([fnmatch.translate(x) for x in excludes]) or r'$.'\n    files = []\n    for root, dirs, fnames in os.walk(d):\n        api.send (\"debug\", \"dir {} subdirs {} files {}\".format(d, dirs, fnames))\n        # exclude dirs in place (keep names, not joined paths, so os.walk can still recurse)\n        dirs[:] = [dn for dn in dirs if not re.match(excludes, os.path.join(root, dn))]\n        \n        # exclude/include files, accumulating across all visited directories\n        fnames = [os.path.join(root, f) for f in fnames]\n        fnames = [f for f in fnames if not re.match(excludes, f)]\n        files += [f for f in fnames if re.match(includes, f)]\n\n    return files\n    \n\n\ndef get_aws_connection():\n    boto3.setup_default_session(region_name='eu-central-1')\n    s3 = boto3.client(\n        's3',\n        # Hard coded strings as credentials, not recommended.\n        aws_access_key_id='',\n        aws_secret_access_key='',\n        region_name = 'eu-central-1'\n    )    \n    return s3\n    \n\n\ndef download_data(s3_client, bucket, obj_name, output_dir):\n    s3_client.download_file(bucket, obj_name, output_dir+obj_name.split('/')[-1])\n\n    return True\n\n\n    \n###\n    ", + "dataset": "data/train_demo2.zip" + }, + "additionaloutports": [ + { + "name": "output", + "type": "string" + }, + { + "name": "debug", + "type": "string" + } + ] + } + }, + "python3operator4": { + "component": "com.sap.system.python3Operator", + "metadata": { + "label": "Audio_Splitter", + "x": 208.99999904632568, + "y": 185.50000023841858, + "height": 80, + "width": 120, + "extensible": true, + "config": { + "script": "from pydub import AudioSegment\nfrom pydub.utils import make_chunks\nimport shutil, sys\nimport os, fnmatch, re\nimport time\n\nmodel_dir_name = \"/tmp/\"\nhome_dir = '/tmp/' \n\ndef on_input(data):\n    out_debug(str(get_group_index()))\n    out_debug (\"Input ready, splitting audio \")\n\n    split_files('/tmp/train/', '/tmp/split/')\n    \n    api.send(\"output\", str(data))\n\napi.set_port_callback(\"input\", on_input)\n\n\ndef split_file (sound_file, output_dir):\n    if not os.path.exists(output_dir):\n        #
creating directory for split files if necessary\n        os.makedirs(output_dir)\n\n    if sound_file.endswith('.wav'):    \n        myaudio = AudioSegment.from_wav(sound_file)\n    else:\n        myaudio = AudioSegment.from_mp3(sound_file)\n    \n    chunks = make_chunks(myaudio, 1000) \n    fname = sound_file.split('/')[-1]\n    for i, chunk in enumerate(chunks):\n        # pydub defaults to mp3 encoding, so the wav format must be requested explicitly\n        chunk.export(\"{}{}{}.wav\".format(output_dir,fname, i), format=\"wav\")\n        out_debug (\"{}{}{}.wav\".format(output_dir,fname, i))\n\n    \ndef split_files(src, dst):\n    start = time.time()\n    count_f, count_s = 0, 0\n    try:\n        dirs = list(os.walk(src))[0][1]\n        \n        for d in dirs:\n            out_debug ('Reading directory {}'.format(d))\n            out_debug ('Recreating output directory {}'.format((dst+d)) )\n            rmdir(dst+d)\n            mkdir(dst+d)\n            for sound_file in get_files(src+d):\n                count_f += 1\n                out_debug ('Splitting dir {}, file {}'.format(dst+d+'/', sound_file))\n                split_file(sound_file, dst+d+'/')\n    except Exception as e:\n        out_debug(\"No files to process: {}\".format(str(e)))\n    end = time.time()\n    out_debug (\"{} files processed in {} seconds\".format(count_f, end-start))\n    \ndef get_files (d, includes = ['*.wav', '*.mp3'] ):\n    excludes = ['/home/ram/doc'] # for dirs and files\n    \n    # transform glob patterns to regular expressions\n    includes = r'|'.join([fnmatch.translate(x) for x in includes])\n    excludes = r'|'.join([fnmatch.translate(x) for x in excludes]) or r'$.'\n    \n    files = []\n    for root, dirs, fnames in os.walk(d):\n        # exclude dirs in place (keep names, not joined paths, so os.walk can still recurse)\n        dirs[:] = [dn for dn in dirs if not re.match(excludes, os.path.join(root, dn))]\n        \n        # exclude/include files, accumulating across all visited directories\n        fnames = [os.path.join(root, f) for f in fnames]\n        fnames = [f for f in fnames if not re.match(excludes, f)]\n        files += [f for f in fnames if re.match(includes, f)]\n    \n    return files\n    \ndef rmdir(dir_path):\n    try:\n        shutil.rmtree(dir_path)\n    except OSError:\n        print('Error deleting directory {}: {}'.format(dir_path, sys.exc_info()[0]))\n\n\ndef mkdir(dir_path):\n    try:\n        print(\"create directory {}\".format(dir_path))\n        os.makedirs(dir_path)\n    except
FileExistsError:\n print('Error creating directory {}: {}'.format(dir_path,sys.exc_info()[0]))\n\ndef get_group_index():\n ret = 0\n try:\n ret = int(api.group_id.split('-')[1])\n out_debug(str(api))\n except Exception as e:\n out_debug(\"get_group_index: {}. Defaulting to 0\".format(str(e)))\n return ret\n \n\n \ndef out_debug(message):\n api.send(\"debug\", message)\n \n " + }, + "additionalinports": [ + { + "name": "input", + "type": "string" + } + ], + "additionaloutports": [ + { + "name": "output", + "type": "string" + }, + { + "name": "debug", + "type": "string" + } + ] + } + }, + "python3operator1": { + "component": "com.sap.system.python3Operator", + "metadata": { + "label": "Train", + "x": 393.99999809265137, + "y": 177.00000023841858, + "height": 80, + "width": 120, + "extensible": true, + "config": { + "script": "from __future__ import print_function\nimport tensorflow as tf\nimport numpy as np\nimport keras\n\nfrom keras.preprocessing.image import ImageDataGenerator, array_to_img, img_to_array, load_img\nfrom keras.models import Sequential\nfrom keras.layers import Activation, Dropout, Flatten, Dense, LSTM, Activation, Dense, Dropout, Input, Embedding, TimeDistributed, Conv2D, MaxPooling2D\nfrom keras import backend as K\nfrom keras.preprocessing import image\n\nfrom keras.models import Sequential\nfrom keras.layers import Dense, Dropout, LSTM\nimport matplotlib.pyplot as plt\n\nfrom keras.utils.np_utils import to_categorical # convert to one-hot-encoding\nfrom sklearn.utils import shuffle\nfrom keras import regularizers\nfrom keras.datasets import cifar10\nfrom keras.preprocessing.image import ImageDataGenerator\nfrom keras.models import Sequential\nfrom keras.optimizers import SGD\nfrom keras.callbacks import ModelCheckpoint\nfrom keras.layers.convolutional import Conv2D\nfrom keras.layers.convolutional import MaxPooling2D\nfrom keras.utils import np_utils\nfrom keras.preprocessing.image import ImageDataGenerator\nfrom keras.layers import Dense, 
Activation, Flatten, Dropout, BatchNormalization\nfrom keras import regularizers\nfrom keras.callbacks import LearningRateScheduler\nfrom keras import backend as K\n\nimport matplotlib.pyplot as plt\n\nfrom sklearn.metrics import roc_auc_score\n\nimport boto3\n\nimport fnmatch, os, os.path, re, time\n\nfrom pydub import AudioSegment\nfrom pydub.utils import make_chunks\n\nimport librosa\nimport librosa.display\n\nretrain = False\n\ndata_dir = '/tmp/'\nmodel_dir_name = data_dir + 'model/'\nhome_dir = data_dir \n\ndef on_input(data):\n    global retrain\n    ## train data available\n    retrain = True\n    api.send(\"output\", str(data))\n\napi.set_port_callback(\"input\", on_input)\n \n \ndef run_training():\n    global retrain\n    if retrain:\n        out_debug( \"Triggering fresh training\")\n        model = train2()\n        api.send(\"model\", model)\n        retrain = False\n    else:\n        api.send(\"output\", \"Model already trained\")\n\napi.add_timer(\"2s\", run_training) \n\n\n\n# legacy trainer (unused); expects x_train/y_train/x_test/y_test and EPOCHES to be defined\ndef train():\n    model = RNN()\n    \n    ## run training\n    history = model.fit(x_train,\n          y_train,\n          epochs=EPOCHES,\n          validation_data=(x_test, y_test))\n    \n    # save model\n    model_digit_json = model.to_json()\n    with open(model_dir_name+\"rnn_image.json\", \"w\") as json_file:\n        json_file.write(model_digit_json)\n    # serialize weights to HDF5\n    model.save_weights(model_dir_name+'rnn_image.h5')\n    \n    print(\"Saved model to disk\")\n    \n\ndef get_files (d):\n    includes = ['*.json', '*.h5', '*.mp3', '*.wav'] # for files only\n    excludes = ['/home/ram/doc'] # for dirs and files\n    \n    # transform glob patterns to regular expressions\n    includes = r'|'.join([fnmatch.translate(x) for x in includes])\n    excludes = r'|'.join([fnmatch.translate(x) for x in excludes]) or r'$.'\n    \n    files = []\n    for root, dirs, fnames in os.walk(d):\n        out_debug (\"dir {} subdirs {} files {}\".format(d, dirs, fnames))\n        # exclude dirs in place (keep names, not joined paths, so os.walk can still recurse)\n        dirs[:] = [dn for dn in dirs if not re.match(excludes, os.path.join(root, dn))]\n        \n        # exclude/include files, accumulating across all visited directories\n        fnames = [os.path.join(root, f) for f in fnames]\n        fnames = [f for f in fnames if not re.match(excludes, f)]\n        files += [f for f in fnames if re.match(includes, f)]\n    \n    return files\n    \n\n## model\n\ndef RNN():    \n    model = Sequential()\n    model.add(LSTM(128, input_shape=(x_train.shape[1:]), activation='relu', return_sequences=True))\n    model.add(Dropout(0.2))\n\n    model.add(LSTM(128, activation='relu'))\n    model.add(Dropout(0.1))\n\n    model.add(Dense(32, activation='relu'))\n    model.add(Dropout(0.2))\n\n    model.add(Dense(10, activation='softmax'))\n\n    model.compile(\n        loss='sparse_categorical_crossentropy',\n        optimizer='adam',\n        metrics=['accuracy'],\n    )\n    return model\n\n    \n##### reusable modules\ndef get_aws_connection():\n    boto3.setup_default_session(region_name='eu-central-1')\n    s3 = boto3.client(\n        's3',\n        # Hard coded strings as credentials, not recommended.\n        aws_access_key_id='',\n        aws_secret_access_key='',\n        region_name = 'eu-central-1'\n    )    \n    return s3\n    \n\n### extract features\ndef extract_features(input_data_dir):\n    x = []\n    y = []\n    \n    classes = [c for c in os.listdir(input_data_dir) if c!='.DS_Store']\n    out_debug( \"Audio classes detected: {}\".format(str(classes)) )\n    \n    for c in classes:\n\n        tmpx = []\n        tmpy = []\n        out_debug('Processing class {}'.format (c))\n\n        for file in [f for f in os.listdir(input_data_dir + c) if f!='.DS_Store']:\n            f = input_data_dir + c +'/' + file\n            out_debug(\"Generating features for file {}\".format(f))\n            try:\n                wave,sr = librosa.load(f, mono=True)\n                mfcc = librosa.feature.mfcc(y=wave, sr=sr, n_mfcc=20)\n                mfcc = mfcc[:20, :44] # clip first so the padded assignment below cannot overflow\n                mfcc_pad = np.zeros((20, 44))\n                mfcc_pad[:mfcc.shape[0], :mfcc.shape[1]] = mfcc\n                \n                if mfcc_pad.shape == (20, 44):\n                    x.append(mfcc_pad)\n                    tmpx.append(mfcc_pad)\n                    y.append(classes.index(c))\n                    tmpy.append(classes.index(c))\n            except Exception as e:\n                out_debug(\"Error processing audio file {}\".format(str(e)) )\n\n    print('complete')\n    return np.array(x), np.array(y)\n    \n\nclass
Monitor_callback(keras.callbacks.Callback):\n def on_train_begin(self, logs={}):\n return\n \n def on_train_end(self, logs={}):\n return\n \n def on_epoch_begin(self, epoch, logs={}):\n return\n \n def on_epoch_end(self, epoch, logs={}):\n #out_debug(\"Epoc {}, Loss: {}\".format( epoch, logs.get('loss') ) )\n #y_pred = self.model.predict(self.model.validation_data[0])\n #out_debug(\"Prediction {} for {}\".format(y_pred, self.model.validation_data[0]) )\n #out_debug(\"Roc score {}\".format( roc_auc_score(self.model.validation_data[1], y_pred ) ) )\n out_debug(\"Epoch: {}, {} \".format(epoch, str(logs)))\n \n def on_batch_begin(self, batch, logs={}):\n return\n \n def on_batch_end(self, batch, logs={}):\n #self.losses.append(logs.get('loss'))\n return \n \ndef train2():\n out_debug(\"Extracting audio MFCC vectors ..\")\n x, y = extract_features(data_dir+'split/')\n out_debug(\"Features extracted, ...shuffling ..\")\n x, y = shuffle(x, y)\n x_train = x.reshape(-1, 20, 44, 1)\n y_train = y.reshape(-1)\n \n #Converting categorical classes to wide format classes [\n y_train = to_categorical(y, num_classes = 2)\n \n\n weight_decay = 1e-4\n model = Sequential()\n model.add(Conv2D(32, (3,3), padding='same', kernel_regularizer=regularizers.l2(weight_decay), input_shape=x_train.shape[1:]))\n model.add(Activation('relu'))\n model.add(BatchNormalization())\n \n model.add(MaxPooling2D(pool_size=(2,2)))\n model.add(Dropout(0.2))\n \n model.add(Conv2D(32, (3,3), padding='same', kernel_regularizer=regularizers.l2(weight_decay), input_shape=x_train.shape[1:]))\n model.add(Activation('relu'))\n model.add(BatchNormalization())\n \n model.add(MaxPooling2D(pool_size=(2,2)))\n model.add(Dropout(0.2))\n \n model.add(Conv2D(32, (3,3), padding='same', kernel_regularizer=regularizers.l2(weight_decay), input_shape=x_train.shape[1:]))\n model.add(Activation('relu'))\n model.add(BatchNormalization())\n \n model.add(MaxPooling2D(pool_size=(2,2)))\n model.add(Dropout(0.2))\n \n 
model.add(Conv2D(32, (3,3), padding='same', kernel_regularizer=regularizers.l2(weight_decay), input_shape=x_train.shape[1:]))\n model.add(Activation('relu'))\n model.add(BatchNormalization())\n \n model.add(MaxPooling2D(pool_size=(2,2)))\n model.add(Dropout(0.2))\n \n \n model.add(Flatten())\n #model.add(Dense(5, activation='softmax'))\n model.add(Dense(2, activation='softmax')) # num classes\n opt_rms = keras.optimizers.rmsprop(lr=0.001,decay=1e-6)\n model.compile(loss='categorical_crossentropy', optimizer=opt_rms, metrics=['accuracy'])\n \n out_debug(\"Fitting model ..\")\n model.fit(x=x_train,\n y=y_train, epochs=100, validation_split=0.1, callbacks=[Monitor_callback()])\n \n out_debug(\"Fitting model done, model ready for upload\")\n #upload_models(model, 'audio_detect')\n \n return model\n\n\ndef out_debug(message):\n api.send(\"debug\", message)\n \n \n" + }, + "additionalinports": [ + { + "name": "input", + "type": "string" + } + ], + "additionaloutports": [ + { + "name": "output", + "type": "string" + }, + { + "name": "debug", + "type": "string" + }, + { + "name": "model", + "type": "python36" + } + ] + } + }, + "python3operator3": { + "component": "com.sap.system.python3Operator", + "metadata": { + "label": "ModelExport", + "x": 594.999997138977, + "y": 191.5, + "height": 80, + "width": 120, + "extensible": true, + "config": { + "target": "aws", + "script": "\nimport boto3\n\nimport fnmatch, os, os.path, re, time\n\ndata_dir = '/tmp/'\nmodel_dir_name = data_dir + 'model/'\n\n\ndef on_input(data):\n api.send(\"output\", str(data))\n\napi.set_port_callback(\"input\", on_input)\n\n\ndef on_model_in(model):\n out_debug(\"Preparing model for uploading\")\n upload_models(model, api.config.model)\n out_debug(\"Uploading model done\")\n \n\napi.set_port_callback(\"model\", on_model_in)\n\ndef upload_models(model, name): \n out_debug(\"Received model {} for uploading\".format(str(model)))\n if not os.path.exists(model_dir_name):\n out_debug ('creating directory 
{}'.format(model_dir_name))\n        os.makedirs(model_dir_name)\n    else:\n        out_debug('Using directory {} for staging model artifacts'.format(model_dir_name))\n\n    model_digit_json = model.to_json()\n\n    with open(model_dir_name+\"{}.json\".format(name), \"w\") as json_file:\n        json_file.write(model_digit_json)\n    # serialize weights to HDF5\n    model.save_weights(model_dir_name+'{}.h5'.format(name))\n    out_debug(\"writing weights to \"+model_dir_name+'{}.h5'.format(name))\n    \n    local_dir = model_dir_name\n    \n    out_debug(\"Model serialized and saved, starting upload\")\n    ### upload to s3\n    boto3.setup_default_session(region_name='eu-central-1')\n    s3 = boto3.client(\n        's3',\n        # Hard coded strings as credentials, not recommended.\n        aws_access_key_id='',\n        aws_secret_access_key='',\n        region_name = 'eu-central-1'\n    )    \n    \n    bucket_name = 'edgepoc'\n    files = get_files(local_dir, ['*.json', '*.h5'])\n    out_debug(str(files))\n    \n    for f in files:\n        with open(f, 'rb') as data:\n            s3.upload_fileobj(data, bucket_name, 'model/'+f.split('/')[-1])\n\n\ndef get_files (d, includes = ['*.wav', '*.mp3'] ):\n    excludes = ['/home/ram/doc'] # for dirs and files\n    \n    # transform glob patterns to regular expressions\n    includes = r'|'.join([fnmatch.translate(x) for x in includes])\n    excludes = r'|'.join([fnmatch.translate(x) for x in excludes]) or r'$.'\n    \n    files = []\n    for root, dirs, fnames in os.walk(d):\n        # exclude dirs in place (keep names, not joined paths, so os.walk can still recurse)\n        dirs[:] = [dn for dn in dirs if not re.match(excludes, os.path.join(root, dn))]\n        \n        # exclude/include files, accumulating across all visited directories\n        fnames = [os.path.join(root, f) for f in fnames]\n        fnames = [f for f in fnames if not re.match(excludes, f)]\n        files += [f for f in fnames if re.match(includes, f)]\n    \n    return files\n    \n    \ndef out_debug(message):\n    api.send(\"debug\", message)\n    \n    \n", + "model": "audio_detectx_demo10" + }, + "additionalinports": [ + { + "name": "input", + "type": "string" + }, + { + "name": "model", + "type": "python36" + } + ], + "additionaloutports": [ + { + "name": "output", + "type":
"string" + }, + { + "name": "debug", + "type": "string" + } + ] + } + }, + "terminal2": { + "component": "com.sap.util.terminal", + "metadata": { + "label": "Terminal", + "x": 887.9999952316284, + "y": 12, + "height": 80, + "width": 120, + "ui": "dynpath", + "config": {} + } + }, + "terminal3": { + "component": "com.sap.util.terminal", + "metadata": { + "label": "Terminal", + "x": 887.9999952316284, + "y": 372, + "height": 80, + "width": 120, + "ui": "dynpath", + "config": {} + } + }, + "terminal1": { + "component": "com.sap.util.terminal", + "metadata": { + "label": "Terminal", + "x": 887.9999952316284, + "y": 132, + "height": 80, + "width": 120, + "ui": "dynpath", + "config": {} + } + }, + "terminal4": { + "component": "com.sap.util.terminal", + "metadata": { + "label": "Terminal", + "x": 887.9999952316284, + "y": 252, + "height": 80, + "width": 120, + "ui": "dynpath", + "config": {} + } + } + }, + "groups": [ + { + "name": "group1", + "nodes": [ + "python3operator2", + "python3operator4", + "python3operator1", + "python3operator3" + ], + "metadata": { + "description": "Group" + }, + "tags": { + "edgeml2": "", + "python36": "", + "tornado": "5.0.2" + }, + "multiplicity": 1 + } + ], + "connections": [ + { + "metadata": { + "points": "517.9999980926514,217.00000023841858 545.9999976158142,217.00000023841858 545.9999976158142,162.50000047683716 746.9999966621399,162.50000047683716 746.9999966621399,223.5 838.9999957084656,223.5 838.9999957084656,52 882.9999952316284,52" + }, + "src": { + "port": "debug", + "process": "python3operator1" + }, + "tgt": { + "port": "in1", + "process": "terminal2" + } + }, + { + "metadata": { + "points": "148,249 175.99999952316284,249 175.99999952316284,293.4999997615814 360.9999985694885,293.4999997615814 360.9999985694885,301.9999997615814 545.9999976158142,301.9999997615814 545.9999976158142,316.49999952316284 762.9999966621399,316.49999952316284 762.9999966621399,256.5 838.9999957084656,256.5 838.9999957084656,412 
882.9999952316284,412" + }, + "src": { + "port": "debug", + "process": "python3operator2" + }, + "tgt": { + "port": "in1", + "process": "terminal3" + } + }, + { + "metadata": { + "points": "517.9999980926514,199.00000023841858 561.9999976158142,199.00000023841858 561.9999976158142,222.5 589.999997138977,222.5" + }, + "src": { + "port": "output", + "process": "python3operator1" + }, + "tgt": { + "port": "input", + "process": "python3operator3" + } + }, + { + "metadata": { + "points": "332.9999990463257,216.50000023841858 360.9999985694885,216.50000023841858 360.9999985694885,217.00000023841858 388.99999809265137,217.00000023841858" + }, + "src": { + "port": "output", + "process": "python3operator4" + }, + "tgt": { + "port": "input", + "process": "python3operator1" + } + }, + { + "metadata": { + "points": "332.9999990463257,234.50000023841858 360.9999985694885,234.50000023841858 360.9999985694885,284.9999997615814 561.9999976158142,284.9999997615814 561.9999976158142,299.49999952316284 746.9999966621399,299.49999952316284 746.9999966621399,245.5 854.9999957084656,245.5 854.9999957084656,292 882.9999952316284,292" + }, + "src": { + "port": "debug", + "process": "python3operator4" + }, + "tgt": { + "port": "in1", + "process": "terminal4" + } + }, + { + "metadata": { + "points": "517.9999980926514,235.00000023841858 545.9999976158142,235.00000023841858 545.9999976158142,240.5 589.999997138977,240.5" + }, + "src": { + "port": "model", + "process": "python3operator1" + }, + "tgt": { + "port": "model", + "process": "python3operator3" + } + }, + { + "metadata": { + "points": "718.999997138977,240.5 746.9999966621399,240.5 746.9999966621399,234.5 854.9999957084656,234.5 854.9999957084656,172 882.9999952316284,172" + }, + "src": { + "port": "debug", + "process": "python3operator3" + }, + "tgt": { + "port": "in1", + "process": "terminal1" + } + }, + { + "metadata": { + "points": "148,231 175.99999952316284,231 175.99999952316284,225.50000023841858 
203.99999904632568,225.50000023841858" + }, + "src": { + "port": "output", + "process": "python3operator2" + }, + "tgt": { + "port": "input", + "process": "python3operator4" + } + } + ], + "inports": {}, + "outports": {} +} \ No newline at end of file diff --git a/edge-ml-welding-sound/src/edge/pom.xml b/edge-ml-welding-sound/src/edge/pom.xml new file mode 100644 index 0000000..67a91ea --- /dev/null +++ b/edge-ml-welding-sound/src/edge/pom.xml @@ -0,0 +1,116 @@ + + 4.0.0 + + com.sap.iot.edgeservices + EdgeML + 1.0.0 + + + ${project.artifactId} + ${project.version} + + + bundle + + + + org.apache.logging.log4j + log4j-api + 2.11.2 + provided + + + + org.apache.logging.log4j + log4j-core + 2.11.2 + provided + + + + commons-io + commons-io + 2.6 + + + + org.osgi + org.osgi.core + 6.0.0 + provided + + + + org.osgi + org.osgi.compendium + 5.0.0 + provided + + + + com.sap.iot.edgeservices + ConfigService + 3.1909.0 + provided + + + + org.tensorflow + tensorflow + 1.14.0 + + + + + + + org.apache.maven.plugins + maven-compiler-plugin + 3.3 + + 1.8 + 1.8 + + + + + org.apache.felix + maven-bundle-plugin + 3.2.0 + true + + + package + + bundle + + + + com.sap.iot.edgeservices.edgeml.Activator + ${project.artifactId} + ${bundle.version} + SAP + . 
+ + *;scope=compile|runtime + + + com.sap.iot.edgeservices.configservice.service, + org.osgi.framework, + org.osgi.service.event, + org.apache.logging.log4j;version="[2.9.0,4.0.0]", + org.apache.logging.log4j.core;version="[2.9.0,4.0.0]", + org.apache.logging.log4j.core.lookup;version="[2.9.0,4.0.0]", + org.apache.logging.log4j.core.config;version="[2.9.0,4.0.0]", + org.apache.logging.log4j.core.layout;version="[2.9.0,4.0.0]", + org.apache.logging.log4j.core.appender;version="[2.9.0,4.0.0]", + org.apache.logging.log4j.core.appender.rolling;version="[2.9.0,4.0.0]" + + + + + + + + + \ No newline at end of file diff --git a/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/Activator.java b/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/Activator.java new file mode 100644 index 0000000..c6fdd54 --- /dev/null +++ b/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/Activator.java @@ -0,0 +1,254 @@ +package com.sap.iot.edgeservices.edgeml; + +import java.io.File; +import java.io.IOException; +import java.io.InputStream; +import java.net.URL; +import java.nio.file.Files; +import java.nio.file.InvalidPathException; +import java.nio.file.Path; +import java.nio.file.Paths; +import java.nio.file.StandardCopyOption; +import java.util.Arrays; +import java.util.Dictionary; +import java.util.Enumeration; +import java.util.Hashtable; +import java.util.List; +import java.util.concurrent.TimeUnit; + +import org.apache.commons.io.FilenameUtils; +import org.apache.logging.log4j.LogManager; +import org.apache.logging.log4j.Logger; +import org.osgi.framework.Bundle; +import org.osgi.framework.BundleActivator; +import org.osgi.framework.BundleContext; +import org.osgi.framework.BundleException; +import org.osgi.framework.Version; +import org.osgi.service.event.EventConstants; +import org.osgi.service.event.EventHandler; + +import com.sap.iot.edgeservices.edgeml.util.SetupUtils; + + +public class Activator 
implements BundleActivator { + + private static final Logger LOGGER = LogManager.getLogger(Activator.class); + + // The parent "edgeservices/" directory + private static final String EDGE_SERVICES_DIRECTORY = Paths.get(Paths.get(System.getProperty("user.dir")).getParent().toString(), "edgeservices").toString(); + private static final String ML_POC_DIRECTORY = Paths.get(EDGE_SERVICES_DIRECTORY, "ml_poc").toString(); + + private static final String ML_POC_BIN_DIRECTORY = Paths.get(ML_POC_DIRECTORY, "bin").toString(); + private static final String ML_POC_MODEL_DIRECTORY = Paths.get(ML_POC_DIRECTORY, "model").toString(); + private static final String ML_POC_IMAGE_DIRECTORY = Paths.get(ML_POC_DIRECTORY, "image").toString(); + private static final String ML_POC_LOG_DIRECTORY = Paths.get(ML_POC_DIRECTORY, "logs").toString(); + + private static final String ML_POC_SPLIT_DIRECTORY = Paths.get(ML_POC_DIRECTORY, "split").toString(); + private static final String ML_POC_SPLIT_PROCESSED_DIRECTORY = Paths.get(ML_POC_SPLIT_DIRECTORY, "processed").toString(); + private static final String ML_POC_INPUT_DIRECTORY = Paths.get(ML_POC_DIRECTORY, "input").toString(); + private static final String ML_POC_INPUT_PROCESSED_DIRECTORY = Paths.get(ML_POC_INPUT_DIRECTORY, "processed").toString(); + private static final String ML_POC_OUTPUT_DIRECTORY = Paths.get(ML_POC_DIRECTORY, "output").toString(); + + private Process edgeDaemonProcess; + private Process tfServerProcess; + + /* + * BUNDLE ACTIVATOR METHODS: + * These methods are used for customizing the starting and stopping of this bundle. + */ + @Override + public void start(BundleContext context) throws Exception { + // Handle bundle upgrade. If another version of this bundle exists that has a lower version number, + // uninstall and delete it. 
+ Bundle bundle = context.getBundle(); + String bundleName = bundle.getSymbolicName(); + Version bundleVersion = bundle.getVersion(); + + try { + + for (Bundle installedBundle : context.getBundles()) { + if (bundleName.equals(installedBundle.getSymbolicName()) && installedBundle.getVersion().compareTo(bundleVersion) < 0) { + try { + installedBundle.uninstall(); + String installedBundlePath = installedBundle.getLocation(); + installedBundlePath = installedBundlePath.replaceFirst("initial@reference:", ""); + installedBundlePath = installedBundlePath.replaceFirst("file:", ""); + installedBundlePath = FilenameUtils.separatorsToSystem(installedBundlePath); + try { + Files.deleteIfExists(Paths.get(installedBundlePath)); + } catch (IOException | InvalidPathException e) { + LOGGER.warn("Cannot delete older version of bundle. Bundle will be deleted on shutdown: {}", e.getMessage()); + new File(installedBundlePath).deleteOnExit(); + } + } catch (BundleException | IllegalStateException e) { + LOGGER.error("Cannot uninstall older version of bundle: {}", e.getMessage()); + } + } + } + } catch (IllegalStateException e) { + LOGGER.error("Cannot get bundle: {}", e.getMessage()); + } + + // Register this class to listen over Event Admin for activation requests with the topic "CUSTOM" + Dictionary properties = new Hashtable<>(); + properties.put(EventConstants.EVENT_TOPIC, ConfigServiceEventHandler.EVENT_TOPIC); + context.registerService(EventHandler.class, new ConfigServiceEventHandler(), properties); + + // Register the OSGi bundle commands to the namespace "EdgeML" (see class "BundleCommands" defined below) + properties.put("osgi.command.scope", "EdgeML"); + properties.put("osgi.command.function", new String[] { + "startEdgeDaemon", "stopEdgeDaemon", "startTensorFlowServer", "stopTensorFlowServer" + }); + context.registerService(BundleCommands.class.getName(), new BundleCommands(this), properties); + + // Set up directory structure and extract bundle resources (script, model, 
etc.) + LOGGER.info("Setting up directories ..."); + SetupUtils.createDirectories(ML_POC_DIRECTORY, ML_POC_BIN_DIRECTORY, ML_POC_LOG_DIRECTORY, + ML_POC_IMAGE_DIRECTORY, ML_POC_SPLIT_DIRECTORY, ML_POC_SPLIT_PROCESSED_DIRECTORY, + ML_POC_INPUT_DIRECTORY, ML_POC_INPUT_PROCESSED_DIRECTORY, ML_POC_OUTPUT_DIRECTORY); + + + LOGGER.info("Extracting bundle resources ..."); + try { + copyBundleResource(bundle, "bin"); + copyBundleResource(bundle, "model"); + } catch (Exception e) { + LOGGER.error("Could not extract resource: {}", e.getMessage()); + } + + LOGGER.info("Edge ML PoC started"); + } + + @Override + public void stop(BundleContext context) throws Exception { + stopProcess(edgeDaemonProcess, "Edge ML Daemon script"); + stopProcess(tfServerProcess, "TensorFlow Server"); + + LOGGER.info("Edge ML PoC stopped"); + } + + private void copyBundleResource(Bundle bundle, String bundleResource) { + LOGGER.info("Copying {} bundle resource", bundleResource); + try { + Enumeration<URL> e = bundle.findEntries(bundleResource, "*", true); + while (e.hasMoreElements()) { + URL resourceURL = e.nextElement(); + if (resourceURL.getFile().endsWith("/")) { + Path resourceDirectory = Paths.get(ML_POC_DIRECTORY, resourceURL.getPath()); + Files.createDirectories(resourceDirectory); + } else { + Path resourceFile = Paths.get(ML_POC_DIRECTORY, resourceURL.getPath()); + if (Files.notExists(resourceFile)) { + Files.createDirectories(resourceFile.getParent()); + try (InputStream is = resourceURL.openStream()) { + Files.copy(is, resourceFile, StandardCopyOption.REPLACE_EXISTING); + } + } + } + } + } catch (IOException e) { + LOGGER.error("Unable to copy bundle resource {}. 
{}", bundleResource, e); + } + } + + private void stopProcess(Process p) { + stopProcess(p, "background process"); + } + + private void stopProcess(Process p, String name) { + if (p != null && p.isAlive()) { + LOGGER.info("Stopping {} ...", name); + p.destroy(); + try { + p.waitFor(10, TimeUnit.SECONDS); + } catch (InterruptedException e) { + Thread.currentThread().interrupt(); + } + + if (p.isAlive()) { + LOGGER.warn("{} is still running after termination was requested. Stopping forcibly ...", name); + p.destroyForcibly(); + } + } else { + LOGGER.info("{} is not running.", name); + } + } + + /* + * OSGI BUNDLE COMMANDS: + * This class defines a set of OSGi commands that can be used from the command line. + */ + public class BundleCommands { + private final Activator activator; + public BundleCommands(Activator activator) { + this.activator = activator; + } + + public void startEdgeDaemon() { + List<String> command = Arrays.asList( + "python3", + Paths.get(ML_POC_DIRECTORY, "bin", "edge_ml_daemon.py").toString() + ); + + Thread t = new Thread(() -> { + LOGGER.info("Starting Edge ML Daemon script ..."); + try { + ProcessBuilder pb = new ProcessBuilder(); + pb.command(command); + pb.directory(new File(ML_POC_BIN_DIRECTORY)); + pb.inheritIO(); + + edgeDaemonProcess = pb.start(); + } catch (IOException ioe) { + LOGGER.error("Error occurred while running Edge ML Daemon script. 
{}", ioe); + } + }); + t.start(); + } + + public void stopEdgeDaemon() { + stopProcess(edgeDaemonProcess, "Edge ML Daemon script"); + } + + public void startTensorFlowServer() { + ProcessBuilder tfpb = new ProcessBuilder(); + + // TODO: Move these variables to a configurable location + String port = "8500"; + String restAPIPort = "8501"; + String modelName = "Volvo01"; + + List<String> command = Arrays.asList( + "tensorflow_model_server", + "--port=" + port, + "--rest_api_port=" + restAPIPort, + "--model_name=" + modelName, + "--model_base_path=" + Paths.get(ML_POC_MODEL_DIRECTORY, modelName).toString() + ); + + Thread t = new Thread(() -> { + LOGGER.info("Starting TensorFlow Serving with model name [{}] on port [{}]. REST API Port [{}]", + modelName, port, restAPIPort); + try { + File f = new File(ML_POC_LOG_DIRECTORY, "tensorflow_server.log"); + f.createNewFile(); // ensure the log file exists before redirecting I/O to it + + tfpb.command(command); + tfpb.directory(new File(ML_POC_MODEL_DIRECTORY)); + tfpb.redirectInput(f); + tfpb.redirectOutput(f); + tfpb.redirectErrorStream(true); + + tfServerProcess = tfpb.start(); + } catch (IOException ioe) { + LOGGER.error("Error occurred while running TensorFlow Server. 
{}", ioe); + } + }); + t.start(); + } + + public void stopTensorFlowServer() { + stopProcess(tfServerProcess, "TensorFlow Server"); + } + } +} diff --git a/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/ConfigServiceEventHandler.java b/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/ConfigServiceEventHandler.java new file mode 100644 index 0000000..feba638 --- /dev/null +++ b/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/ConfigServiceEventHandler.java @@ -0,0 +1,164 @@ +package com.sap.iot.edgeservices.edgeml; + +import java.io.File; +import java.io.FileOutputStream; +import java.io.IOException; +import java.io.OutputStream; +import java.nio.charset.StandardCharsets; +import java.nio.file.Files; +import java.nio.file.Paths; +import java.util.List; +import java.util.Optional; +import java.util.Properties; + +import org.apache.logging.log4j.LogManager; +import org.apache.logging.log4j.Logger; +import org.osgi.service.component.annotations.Component; +import org.osgi.service.component.annotations.Reference; +import org.osgi.service.component.annotations.ReferenceCardinality; +import org.osgi.service.component.annotations.ReferencePolicy; +import org.osgi.service.component.annotations.ReferencePolicyOption; +import org.osgi.service.event.Event; +import org.osgi.service.event.EventHandler; + +import com.fasterxml.jackson.databind.ObjectMapper; +import com.sap.iot.edgeservices.configservice.service.IConfigStatusService; +import com.sap.iot.edgeservices.edgeml.config.EdgeMLConfiguration; +import com.sap.iot.edgeservices.edgeml.config.IoTServiceInformation; +import com.sap.iot.edgeservices.edgeml.config.LoggingInformation; +import com.sap.iot.edgeservices.edgeml.config.TensorFlowServerInformation; + +// Instantiate this component immediately so it can start listening for config activation request events right away +@Component(immediate = true) +public final class 
ConfigServiceEventHandler implements EventHandler { + + private static final Logger LOGGER = LogManager.getLogger(ConfigServiceEventHandler.class); + + private static final String EDGE_SERVICES_DIRECTORY = Paths.get(Paths.get(System.getProperty("user.dir")).getParent().toString(), "edgeservices").toString(); + private static final String ML_POC_DIRECTORY = Paths.get(EDGE_SERVICES_DIRECTORY, "ml_poc").toString(); + + private static final String CONFIG_FILE_PROPERTY_NAME = "configFile"; + private static final String CONFIG_FINGERPRINT_PROPERTY_NAME = "configFingerprint"; + + // Keeps track of the last successful fingerprint for the config file that was activated + private static String lastSuccessfulFingerprint = ""; + + private static IConfigStatusService configStatusService; + + // The Event Admin topic to subscribe to for config activation requests + public static final String EVENT_TOPIC = "MLPOC"; + + + /* + * EVENT HANDLER METHOD: + * This method is used to handle activation requests sent by the ConfigService bundle through Event Admin. + */ + @Override + public synchronized void handleEvent(Event event) { + // Check to see if the event received conforms to a config activation event + // i.e. 
the event contains the config file to be activated and its associated fingerprint + if (event.getProperty(CONFIG_FILE_PROPERTY_NAME) instanceof File + && event.getProperty(CONFIG_FINGERPRINT_PROPERTY_NAME) instanceof String) { + File configFile = (File) event.getProperty(CONFIG_FILE_PROPERTY_NAME); + String fingerprint = (String) event.getProperty(CONFIG_FINGERPRINT_PROPERTY_NAME); + + // Return if the sent config file has already been activated + if (lastSuccessfulFingerprint.equals(fingerprint)) { + return; + } + + getConfigStatusService().ifPresent(configStatusService -> { + try { + String configFileContents = new String(Files.readAllBytes(configFile.toPath()), StandardCharsets.UTF_8); + LOGGER.debug("Config File Contents:\n{}", configFileContents); + + // Set the lastSuccessfulFingerprint to this config file's fingerprint if the config file was successfully activated + // Call the activationStatus Declarative Service with the activation result (true or false), fingerprint, and a status message + if ( processConfiguration(configFile) ) { + lastSuccessfulFingerprint = fingerprint; + configStatusService.activationStatus(true, fingerprint, "Activation Succeeded"); + } else { + configStatusService.activationStatus(false, fingerprint, "Activation Failed"); + } + } catch (IOException e) { + configStatusService.activationStatus(false, fingerprint, "Cannot read config file: " + e.getMessage()); + } + }); + } + } + + private Boolean processConfiguration(File configFile) { + final ObjectMapper om = new ObjectMapper(); + + try (OutputStream out = new FileOutputStream(Paths.get(ML_POC_DIRECTORY, "config.properties").toFile())) { + EdgeMLConfiguration config = om.readValue(configFile, EdgeMLConfiguration.class); + + Properties props = new Properties(); + + // Set the IoT Service information + IoTServiceInformation iotInfo = config.getOutputIoTService(); + if (iotInfo != null) { + LOGGER.trace("Found IoT Service information"); + + props.setProperty("iotservice.deviceId", 
iotInfo.getDeviceId()); + props.setProperty("iotservice.sensorId", iotInfo.getSensorId()); + props.setProperty("iotservice.sensorTypeAlternateId", iotInfo.getSensorTypeAlternateId()); + props.setProperty("iotservice.capabilityAlternateId", iotInfo.getCapabilityAlternateId()); + props.setProperty("iotservice.send", Boolean.toString(iotInfo.getSendToIoTService())); + } + + // Set the TensorFlow Server information + TensorFlowServerInformation tfInfo = config.getTfServerInformation(); + if (tfInfo != null) { + LOGGER.trace("Found TensorFlow Server information"); + + props.setProperty("tf.port", tfInfo.getPort()); + props.setProperty("tf.restAPIPort", tfInfo.getRestAPIPort()); + props.setProperty("tf.modelName", tfInfo.getModelName()); + } + + LoggingInformation logInfo = config.getLogging(); + if (logInfo != null) { + LOGGER.trace("Found Logging information"); + + props.setProperty("log.bundle", logInfo.getBundleLogLevel()); + props.setProperty("log.daemon", logInfo.getDaemonLogLevel()); + } + + if (config.getClassifications() != null) { + props.setProperty("classifications", config.getClassifications().toString()); + } + + props.setProperty("modelDownloadInterval", Float.toString(config.getModelDownloadInterval())); + + props.store(out, null); + + } catch (IOException ioe) { + LOGGER.error("Failed to process configuration: {}", ioe.getMessage()); + return false; + } + + LOGGER.info("Configuration file processed successfully."); + return true; + } + + /* + * DECLARATIVE SERVICES METHODS: + * These methods are used by Declarative Services to set, unset, and get an IConfigStatusService instance. 
+ */ + @Reference(cardinality = ReferenceCardinality.OPTIONAL, policy = ReferencePolicy.DYNAMIC, policyOption = ReferencePolicyOption.GREEDY) + void setConfigStatusService(IConfigStatusService arg) { + configStatusService = arg; + } + + void unsetConfigStatusService(IConfigStatusService arg) { + if (configStatusService == arg) { + configStatusService = null; + } + } + + private Optional getConfigStatusService() { + return Optional.ofNullable(configStatusService); + } + +} diff --git a/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/config/EdgeMLConfiguration.java b/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/config/EdgeMLConfiguration.java new file mode 100644 index 0000000..86d6354 --- /dev/null +++ b/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/config/EdgeMLConfiguration.java @@ -0,0 +1,81 @@ +package com.sap.iot.edgeservices.edgeml.config; + +import java.io.Serializable; +import java.util.List; + +import com.fasterxml.jackson.annotation.JsonProperty; + +public class EdgeMLConfiguration implements Serializable { + + /** + * Generated serial version UID. 
+ */ + private static final long serialVersionUID = -7046615135308372173L; + + @JsonProperty(value = "outputIoTService") + private IoTServiceInformation outputIoTService; + + @JsonProperty(value = "tensorflowServer") + private TensorFlowServerInformation tfServerInformation; + + @JsonProperty(value = "logging") + private LoggingInformation logging; + + @JsonProperty(value = "classifications") + private List classifications; + + @JsonProperty(value = "modelDownloadInterval") + private float modelDownloadInterval = 2880.0f; // default download interval of 2 days + + @JsonProperty(value = "configVersion") + private String configVersion; + + + public IoTServiceInformation getOutputIoTService() { + return outputIoTService; + } + + public void setOutputIoTService(IoTServiceInformation outputIoTService) { + this.outputIoTService = outputIoTService; + } + + public TensorFlowServerInformation getTfServerInformation() { + return tfServerInformation; + } + + public void setTfServerInformation(TensorFlowServerInformation tfServerInformation) { + this.tfServerInformation = tfServerInformation; + } + + public LoggingInformation getLogging() { + return logging; + } + + public void setLogging(LoggingInformation logging) { + this.logging = logging; + } + + public List getClassifications() { + return classifications; + } + + public void setClassifications(List classifications) { + this.classifications = classifications; + } + + public float getModelDownloadInterval() { + return modelDownloadInterval; + } + + public void setModelDownloadInterval(float modelDownloadInterval) { + this.modelDownloadInterval = modelDownloadInterval; + } + + public String getConfigVersion() { + return configVersion; + } + + public void setConfigVersion(String configVersion) { + this.configVersion = configVersion; + } +} diff --git a/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/config/IoTServiceInformation.java 
b/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/config/IoTServiceInformation.java new file mode 100644 index 0000000..ea9a8be --- /dev/null +++ b/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/config/IoTServiceInformation.java @@ -0,0 +1,60 @@ +package com.sap.iot.edgeservices.edgeml.config; + +import java.io.Serializable; + +import com.fasterxml.jackson.annotation.JsonProperty; + +public class IoTServiceInformation implements Serializable { + + /** + * Generated serial version UID. + */ + private static final long serialVersionUID = 7925181553233084834L; + + @JsonProperty(value = "deviceId") + private String deviceId; + + @JsonProperty(value = "sensorId") + private String sensorId; + + @JsonProperty(value = "sensorTypeAlternateId") + private String sensorTypeAlternateId; + + @JsonProperty(value = "capabilityAlternateId") + private String capabilityAlternateId; + + @JsonProperty(value = "send") + private Boolean sendToIoTService = true; + + + public String getDeviceId() { + return deviceId; + } + public void setDeviceId(String deviceId) { + this.deviceId = deviceId; + } + public String getSensorId() { + return sensorId; + } + public void setSensorId(String sensorId) { + this.sensorId = sensorId; + } + public String getSensorTypeAlternateId() { + return sensorTypeAlternateId; + } + public void setSensorTypeAlternateId(String sensorTypeAlternateId) { + this.sensorTypeAlternateId = sensorTypeAlternateId; + } + public String getCapabilityAlternateId() { + return capabilityAlternateId; + } + public void setCapabilityAlternateId(String capabilityAlternateId) { + this.capabilityAlternateId = capabilityAlternateId; + } + public Boolean getSendToIoTService() { + return sendToIoTService; + } + public void setSendToIoTService(Boolean sendToIoTService) { + this.sendToIoTService = sendToIoTService; + } +} diff --git 
a/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/config/LoggingInformation.java b/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/config/LoggingInformation.java new file mode 100644 index 0000000..c783541 --- /dev/null +++ b/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/config/LoggingInformation.java @@ -0,0 +1,37 @@ +package com.sap.iot.edgeservices.edgeml.config; + +import java.io.Serializable; + +import com.fasterxml.jackson.annotation.JsonProperty; + +public class LoggingInformation implements Serializable { + + /** + * Generated serial version UID. + */ + private static final long serialVersionUID = -2920009961164636668L; + + @JsonProperty(value = "bundleLogLevel") + private String bundleLogLevel = "INFO"; + + @JsonProperty(value = "daemonLogLevel") + private String daemonLogLevel = "INFO"; + + + public String getBundleLogLevel() { + return bundleLogLevel; + } + + public void setBundleLogLevel(String bundleLogLevel) { + this.bundleLogLevel = bundleLogLevel; + } + + public String getDaemonLogLevel() { + return daemonLogLevel; + } + + public void setDaemonLogLevel(String daemonLogLevel) { + this.daemonLogLevel = daemonLogLevel; + } + +} diff --git a/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/config/TensorFlowServerInformation.java b/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/config/TensorFlowServerInformation.java new file mode 100644 index 0000000..d545a0f --- /dev/null +++ b/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/config/TensorFlowServerInformation.java @@ -0,0 +1,43 @@ +package com.sap.iot.edgeservices.edgeml.config; + +import java.io.Serializable; + +import com.fasterxml.jackson.annotation.JsonProperty; + +public class TensorFlowServerInformation implements Serializable { + + /** + * Generated serial version UID. 
+ */ + private static final long serialVersionUID = -700133741709917263L; + + @JsonProperty(value = "port") + private String port; + + @JsonProperty(value = "restAPIPort") + private String restAPIPort; + + @JsonProperty(value = "modelName") + private String modelName; + + + public String getPort() { + return port; + } + public void setPort(String port) { + this.port = port; + } + public String getRestAPIPort() { + return restAPIPort; + } + public void setRestAPIPort(String restAPIPort) { + this.restAPIPort = restAPIPort; + } + public String getModelName() { + return modelName; + } + public void setModelName(String modelName) { + this.modelName = modelName; + } + +} diff --git a/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/util/DirectoryWatcher.java b/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/util/DirectoryWatcher.java new file mode 100644 index 0000000..4d99fc8 --- /dev/null +++ b/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/util/DirectoryWatcher.java @@ -0,0 +1,253 @@ +package com.sap.iot.edgeservices.edgeml.util; + +import static java.nio.file.StandardWatchEventKinds.ENTRY_CREATE; +import static java.nio.file.StandardWatchEventKinds.ENTRY_DELETE; +import static java.nio.file.StandardWatchEventKinds.ENTRY_MODIFY; +import static java.nio.file.StandardWatchEventKinds.OVERFLOW; + +import java.io.Closeable; +import java.io.FileNotFoundException; +import java.io.IOException; +import java.nio.file.ClosedWatchServiceException; +import java.nio.file.FileSystems; +import java.nio.file.FileVisitResult; +import java.nio.file.Files; +import java.nio.file.LinkOption; +import java.nio.file.Path; +import java.nio.file.Paths; +import java.nio.file.SimpleFileVisitor; +import java.nio.file.WatchEvent; +import java.nio.file.WatchEvent.Kind; +import java.nio.file.WatchKey; +import java.nio.file.WatchService; +import java.nio.file.attribute.BasicFileAttributes; +import java.util.HashMap; 
+import java.util.HashSet; +import java.util.Map; +import java.util.Set; + +/** + * Watches a directory for changes and notifies of those changes. + */ +public class DirectoryWatcher implements Closeable { + public interface DirectoryListener { + default void fileCreated(Path path) { + } + + default void fileUpdated(Path path) { + } + + default void fileDeleted(Path path) { + } + + default void directoryCreated(Path path) { + } + + default void directoryDeleted(Path path) { + } + } + + private final WatchService watcher; + private final Map<WatchKey, Path> watchKeys; + private final Path directory; + private final boolean recursive; + private final Kind<?>[] watchEventKinds; + private final DirectoryListener listener; + private Thread thread; + + // Need to track which paths are directories so that when an ENTRY_DELETE event comes we can tell if it's a directory + // or file. Calling Files.isDirectory() won't work because the file is already deleted. + private final Set<Path> directories; + + public DirectoryWatcher(Path directory, boolean recursive, DirectoryListener listener, Kind<?>... 
events) + throws IOException { + + if (!Files.isDirectory(directory, LinkOption.NOFOLLOW_LINKS)) { + throw new FileNotFoundException("Directory " + directory + " not found"); + } + + watcher = FileSystems.getDefault().newWatchService(); + watchKeys = new HashMap<>(); + directories = new HashSet<>(); + + this.directory = directory; + this.recursive = recursive; + this.listener = listener; + this.watchEventKinds = events; + + if (recursive) { + walkTree(directory, true); + } else { + walkTree(directory, false); + register(directory); + } + + thread = new Thread(this::processEvents); + thread.setDaemon(true); + thread.setName(DirectoryWatcher.class.getSimpleName() + "(" + directory + ")"); + thread.start(); + } + + public Path getDirectory() { + return directory; + } + + @Override + public void close() throws IOException { + // Closing the watcher triggers the thread to exit + watcher.close(); + + // Wait for the thread to exit + try { + thread.join(); + } catch (InterruptedException e) { + Thread.currentThread().interrupt(); + } + + directories.clear(); + } + + private void register(Path path) throws IOException { + WatchKey key = path.register(watcher, watchEventKinds); + watchKeys.put(key, path); + } + + private void walkTree(Path path, boolean register) throws IOException { + Files.walkFileTree(path, new SimpleFileVisitor<Path>() { + @Override + public FileVisitResult preVisitDirectory(Path dir, BasicFileAttributes attrs) throws IOException { + if (register) { + register(dir); + } + + directories.add(dir); + + return FileVisitResult.CONTINUE; + } + }); + } + + private void processEvents() { + while (true) { + // Wait for WatchKey to be signaled + WatchKey watchKey; + try { + watchKey = watcher.take(); + } catch (InterruptedException | ClosedWatchServiceException e) { + break; + } + + // Prevent duplicate file ENTRY_MODIFY events (the contents are modified, then the timestamp) + try { + Thread.sleep(50); + } catch (InterruptedException e) { + Thread.currentThread().interrupt(); + } + + // Gets the Path for the WatchKey + Path parentPath = 
watchKeys.get(watchKey); + if (parentPath == null) { + continue; + } + + // Iterate over all WatchEvents on the WatchKey + for (WatchEvent<?> event : watchKey.pollEvents()) { + Kind<?> kind = event.kind(); + + // An overflow event indicates events were lost + if (kind == OVERFLOW) { + continue; + } + + // Context for event is the relative path to the file. Combine with parent path to get full path to + // file. + @SuppressWarnings("unchecked") + Path childPath = parentPath.resolve(((WatchEvent<Path>) event).context()); + + boolean isDirectory; + if (kind == ENTRY_CREATE) { + isDirectory = Files.isDirectory(childPath, LinkOption.NOFOLLOW_LINKS); + } else { + isDirectory = directories.contains(childPath); + } + + if (isDirectory) { + if (kind == ENTRY_CREATE) { + directories.add(childPath); + + // If a directory was created and the recursive flag is set then also register the directory + if (recursive) { + try { + walkTree(childPath, true); + } catch (IOException e) { + // TODO: Handle this + } + } + + listener.directoryCreated(childPath); + } else if (kind == ENTRY_MODIFY) { + // Ignore directory modify events which are fired when files or directories within it change + } else if (kind == ENTRY_DELETE) { + directories.remove(childPath); + + listener.directoryDeleted(childPath); + } + } else { + if (kind == ENTRY_CREATE) { + listener.fileCreated(childPath); + } else if (kind == ENTRY_MODIFY) { + listener.fileUpdated(childPath); + } else if (kind == ENTRY_DELETE) { + listener.fileDeleted(childPath); + } + } + } + + // Reset the WatchKey and remove from set if directory no longer accessible + boolean valid = watchKey.reset(); + if (!valid) { + watchKeys.remove(watchKey); + + // If all directories are inaccessible then exit + if (watchKeys.isEmpty()) { + break; + } + } + } + } + + public static void main(String[] args) { + try { + Path directory = Paths.get("C:\\Temp"); + + DirectoryListener listener = new DirectoryListener() { + public void fileCreated(Path path) { +
System.out.println("File created: " + path); + } + + public void fileUpdated(Path path) { + System.out.println("File updated: " + path); + } + + public void fileDeleted(Path path) { + System.out.println("File deleted: " + path); + } + + public void directoryCreated(Path path) { + System.out.println("Directory created: " + path); + } + + public void directoryDeleted(Path path) { + System.out.println("Directory deleted: " + path); + } + }; + + DirectoryWatcher watcher = new DirectoryWatcher(directory, true, listener, ENTRY_CREATE, ENTRY_MODIFY, + ENTRY_DELETE); + + System.out.println("Watching directory " + directory + " ..."); + Thread.sleep(60000); + + watcher.close(); + } catch (Exception e) { + e.printStackTrace(); + } + } +} \ No newline at end of file diff --git a/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/util/SetupUtils.java b/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/util/SetupUtils.java new file mode 100644 index 0000000..b8f6cd8 --- /dev/null +++ b/edge-ml-welding-sound/src/edge/src/main/java/com/sap/iot/edgeservices/edgeml/util/SetupUtils.java @@ -0,0 +1,34 @@ +package com.sap.iot.edgeservices.edgeml.util; + +import java.io.IOException; +import java.nio.file.Files; +import java.nio.file.Path; +import java.nio.file.Paths; + +import org.apache.logging.log4j.LogManager; +import org.apache.logging.log4j.Logger; + + +public final class SetupUtils { + private static final Logger LOGGER = LogManager.getLogger(SetupUtils.class); + + private SetupUtils() {} + + public static void createDirectory(String directory) { + createDirectories(directory); + } + + public static void createDirectories(String... 
directories) { + for (String directory : directories) { + Path directoryPath = Paths.get(directory); + if (!Files.exists(directoryPath)) { + try { + LOGGER.debug("Creating directory {}", directoryPath); + Files.createDirectories(directoryPath); + } catch (IOException ioe) { + LOGGER.error("Cannot create directory: {}", ioe.getMessage()); + } + } + } + } +} diff --git a/edge-ml-welding-sound/src/edge/src/main/resources/bin/edge_ml_daemon.py b/edge-ml-welding-sound/src/edge/src/main/resources/bin/edge_ml_daemon.py new file mode 100644 index 0000000..81a04ef --- /dev/null +++ b/edge-ml-welding-sound/src/edge/src/main/resources/bin/edge_ml_daemon.py @@ -0,0 +1,488 @@ +from __future__ import print_function +import tensorflow as tf +import numpy as np +import keras + +from keras.preprocessing.image import ImageDataGenerator, array_to_img, img_to_array, load_img +from keras.preprocessing import image +from keras.models import Sequential +from keras.layers import Activation, Dropout, Flatten, Dense, LSTM, Input, Embedding, TimeDistributed, Conv2D, MaxPooling2D, BatchNormalization +from keras import backend as K +from keras import regularizers + +import matplotlib.pyplot as plt + +from keras.utils.np_utils import to_categorical # convert to one-hot-encoding +from keras.utils import np_utils +from sklearn.utils import shuffle +from keras.datasets import cifar10 +from keras.optimizers import SGD +from keras.callbacks import ModelCheckpoint +from keras.callbacks import LearningRateScheduler
+#import keras +from sklearn.metrics import roc_auc_score + +import boto3 + +import fnmatch, os, os.path, re, time, logging, urllib, json +import shutil, sys + +from pydub import AudioSegment +from pydub.utils import make_chunks + +import librosa +import librosa.display + +from keras.models import model_from_json + +## inception +import grpc +from tensorflow_serving.apis import predict_pb2 +from tensorflow_serving.apis import prediction_service_pb2_grpc +from tensorflow.python.keras.applications.inception_v3 import * + +model = None +### inception +_use_inception_v3 = False +retrain = False + +bin_dir = os.getcwd() # "../ml_poc/bin/", where the Python script will be running from +home_dir = os.path.dirname(bin_dir) # "../ml_poc/", the home directory +raw_input_dir = os.path.join(home_dir, 'input') # "../ml_poc/input/" +processed_input_dir = os.path.join(raw_input_dir, 'processed') # "../ml_poc/input/processed" +logging_dir = os.path.join(home_dir, 'logs') +split_dir = os.path.join(home_dir, 'split') # "../ml_poc/split" +image_dir = os.path.join(home_dir, 'image') # "../ml_poc/image" +model_dir_name = os.path.join(home_dir, 'model') +output_dir = os.path.join(home_dir, 'output') # "../ml_poc/output" + +# IoT Service Instance: https://23274b22-833f-4c23-abd9-267686b80f75.canary.cp.iot.sap/ +device_id = "MLDevice01" +sensor_id = "ML_Test_Sensor" +sensor_type_alternate_id = "67" +capability_alternate_id = "inf01" + +LOG_FILENAME = 'edge_ml_daemon.log' +logging.basicConfig(filename=os.path.join(logging_dir, LOG_FILENAME), level=logging.INFO) +logging.getLogger().addHandler(logging.StreamHandler(sys.stdout)) + +def get_files(d): + includes = ['*.json', '*.h5', '*.mp3', '*.wav', '*.png'] # for files only + excludes = ['.DS_Store','/home/ram/doc'] # for dirs and files + + # transform glob patterns to regular 
expressions + includes = r'|'.join([fnmatch.translate(x) for x in includes]) + excludes = r'|'.join([fnmatch.translate(x) for x in excludes]) or r'$.' + + # Get all files in the specified directories, excluding subdirectories + files = [f for f in os.listdir(d) if os.path.isfile(os.path.join(d, f))] + + files = [f for f in files if not re.match(excludes, f)] + files = [f for f in files if re.match(includes, f)] + + # Join the directory to each file + files = [os.path.join(d, f) for f in files] + + logging.debug('List of valid files for {}: {}'.format(d, str(files))) + return files + + +def download_data(s3_client, bucket, obj_name, output_dir): + try: + s3_client.download_file(bucket, obj_name, os.path.join(output_dir, obj_name)) + + except Exception as e: + logging.error(str(e)) + return False + return True + + +### Tool Audio +def create_data(input_data_dir): + x = [] + y = [] + + # iterate over each classification label (the module-level "classes" list) + for c in classes: + tmpx = [] + tmpy = [] + logging.debug('Processing class {}'.format(c)) + + for file in [f for f in os.listdir(input_data_dir + c) if f != '.DS_Store']: + f = os.path.join(input_data_dir + c, file) + logging.debug("Processing file {}".format(f)) + try: + wave, sr = librosa.load(f, mono=True) + mfcc = librosa.feature.mfcc(y=wave, sr=sr, n_mfcc=20) + mfcc_pad = np.zeros((20, 44)) + mfcc_pad[:mfcc.shape[0], :mfcc.shape[1]] = mfcc[:20, :44] + + if mfcc_pad.shape == (20, 44): + x.append(mfcc_pad) + tmpx.append(mfcc_pad) + y.append(classes.index(c)) + tmpy.append(classes.index(c)) + except Exception as e: + logging.error("Error processing audio file {}".format(str(e))) + + return np.array(x), np.array(y) + + +def split_file(sound_file, output_dir): + splits = 0 + if not os.path.exists(output_dir): + logging.debug('creating directory {}'.format(output_dir)) + os.makedirs(output_dir) + + try: + if sound_file.endswith('.wav'): + myaudio = AudioSegment.from_wav(sound_file) + else: + myaudio = AudioSegment.from_mp3(sound_file) + + chunks = make_chunks(myaudio, 1000) + fname = sound_file.split(os.path.sep)[-1] + for i, chunk in 
enumerate(chunks): + # name format: OUTPUT_DIR/FILE_NAME<i>.wav; fname.split removes the file extension + chunk.export(os.path.join(output_dir, fname.split('.')[0] + str(i)) + '.wav') + return len(chunks) + except Exception as e: + logging.error('Error splitting sound file {}: {}'.format(sound_file, str(e))) + return 0 + +def sound_to_image(sound_file, output_dir): + if not os.path.exists(output_dir): + logging.debug('creating directory {}'.format(output_dir)) + os.makedirs(output_dir) + + try: + fname = sound_file.split(os.path.sep)[-1] + y, sr = librosa.load(sound_file) + S = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=128) + log_S = librosa.amplitude_to_db(S) + + spectrogram_file = os.path.join(output_dir, fname + '.png') + librosa.display.specshow(log_S, sr=sr, x_axis='time', y_axis='mel') + plt.savefig(spectrogram_file) + print("saving image to {}".format(spectrogram_file)) + return spectrogram_file + except Exception as e: + logging.error('Error converting sound file to image {}: {}'.format(sound_file, str(e))) + return None + + +def move_file(f, target_dir): + try: + shutil.move(f, os.path.join(target_dir, f.split(os.sep)[-1])) + except Exception as e: + logging.error('Error moving sound file {}: {}'.format(f, str(e))) + + +def split_files(src, dst, move=False): + start, splits = time.time(), 0 + processed_dir = os.path.join(src, 'processed') + if move: + if not os.path.exists(processed_dir): + logging.debug('creating directory {}'.format(processed_dir)) + os.makedirs(processed_dir) + files = get_files(src) + + if len(files): + print("Splitting file(s) ..") + + for sound_file in files: + logging.debug('Splitting dir {}, file {}'.format(dst, sound_file)) + c = split_file(sound_file, dst) + splits += c + if move: + move_file(sound_file, processed_dir) + end = time.time() + duration, count = end - start, len(files) + + if count and splits: + logging.info("\n\nSplit {} files in {} seconds into {} splits. Latency per split: {}".format(count, duration, splits, duration/splits)) + + + +def convert_sound_to_image_files(src, dst, move=False): + start = time.time() + processed_dir = os.path.join(src, 'processed') + if move: + if not os.path.exists(processed_dir): + logging.debug('creating directory {}'.format(processed_dir)) + os.makedirs(processed_dir) + files = get_files(src) + if files: + print("Converting sound to image file(s) ..") + else: + print("No files to convert") + for sound_file in files: + logging.debug('Converting to image dir {}, file {}'.format(dst, sound_file)) + sound_to_image(sound_file, dst) + if move: + move_file(sound_file, processed_dir) + end = time.time() + duration, count = end - start, len(files) + + if count: + logging.info("\n\nConverted {} files in {} seconds. 
Latency per sound file: {}".format(count, duration, duration/count)) + + + +def run_inferences(src, dst, move=False, inception=False): + try: + output_dir = dst + processed_dir = os.path.join(src, 'processed') + + if not os.path.exists(output_dir): + os.makedirs(output_dir) + + if move: + if not os.path.exists(processed_dir): + os.makedirs(processed_dir) + + files = get_files(src) + if not files: + print("No new files to run inference on") + for sound_file in files: + logging.debug('Writing inference dir {}, file {}'.format(dst, sound_file)) + run_inference(sound_file, dst, inception) + + if move: + move_file(sound_file, processed_dir) + except Exception as e: + print('Error executing inferencing {}'.format(str(e))) + + +def run_inference(test_sound_file, dst, inception=False): + start = time.time() + logging.debug("Running_inference for {}".format(test_sound_file)) + + if not inception: + classes = ['Normal', 'Defect'] + wave, sr = librosa.load(test_sound_file, mono=True) + mfcc = librosa.feature.mfcc(y=wave, sr=sr, n_mfcc=20) + logging.debug('MFCC completed in {} seconds'.format(time.time() - start)) + mfcc_pad = np.zeros((20, 44)) + mfcc_pad[:mfcc.shape[0], :mfcc.shape[1]] = mfcc[:20, :44] + x_pred = mfcc_pad.reshape(-1, 20, 44, 1) + result_index = np.argmax(model.predict(x_pred)) + else: + print("Running inception model") + ret = do_inference_inceptionv3(test_sound_file) + print("Submitted to inception V3") + return ### temporary; this needs to be refactored + end = time.time() + + result = classes[result_index] + resultStr = 'Sound is {}, inferred in {} seconds\n'.format(result, end - start) + print(resultStr) + logging.info(resultStr) + print('---------------------------') + with open(os.path.join(dst, test_sound_file.split(os.sep)[-1] + '_out.txt' 
), 'a+') as f: + f.write("{},{}\n".format(time.time(), result)) + + send_data_to_gateway(result_index, device_id, sensor_id, sensor_type_alternate_id, capability_alternate_id) + + + + +def send_data_to_gateway(result, deviceId, sensorAlternateId, sensorTypeAlternateId, capabilityAlternateId): + url = 'http://127.0.0.1:8699/measures/' + deviceId + + measures = [[ int(result) ]] # must be formatted as a list of lists + + body = {} + body['sensorAlternateId'] = sensorAlternateId + body['sensorTypeAlternateId'] = sensorTypeAlternateId + body['capabilityAlternateId'] = capabilityAlternateId + body['measures'] = measures + + jsonbody = json.dumps(body) + + request = urllib.request.Request(url, jsonbody.encode('utf-8'), { "Content-Type": "application/json" }) + response = urllib.request.urlopen(request).read().decode() + print(response) + + + +def do_forever(): + global model + logging.info('Running Edge Machine Learning PoC from {}'.format(home_dir)) + while True: + model = load_model(model_dir_name) + + split_files(raw_input_dir, split_dir, move=True) + time.sleep(2) + + if _use_inception_v3: + convert_sound_to_image_files(split_dir, image_dir, move=True) + + # Inception reads the spectrogram images; the Keras CNN reads the raw splits. + inference_src = image_dir if _use_inception_v3 else split_dir + run_inferences(inference_src, output_dir, move=True, inception=_use_inception_v3) + +def get_aws_connection(): + boto3.setup_default_session(region_name='eu-central-1') + s3 = boto3.client( + 's3', + # Hard coded strings as credentials, not recommended. 
+ aws_access_key_id='', + aws_secret_access_key='', + region_name='eu-central-1' + ) + return s3 + +def download_model(s3_client, bucket, obj_name, output_dir): + try: + print("obj name --> " + obj_name) + print("output_dir -> " + output_dir) + # S3 object keys use '/' as the separator regardless of platform + s3_client.download_file(bucket, obj_name, output_dir + os.sep + obj_name.split('/')[-1]) + except Exception as e: + print(str(e)) + return False + return True + +## load model +last_model_download_ts = int(time.time()) +def load_model(model_dir_name, model_name='audio_detectx', model_expiry_seconds=300): + global model, last_model_download_ts + print('Fresh model download will be attempted in {} seconds'.format(last_model_download_ts + model_expiry_seconds - int(time.time()))) + if model is None or (int(time.time()) > last_model_download_ts + model_expiry_seconds): + ### attempt to download model from S3 + try: + conn = get_aws_connection() + print("Downloading model/{}.json".format(model_name)) + download_model(conn, 'edgepoc', "model/{}.json".format(model_name), model_dir_name) + print("Downloading model/{}.h5".format(model_name)) + download_model(conn, 'edgepoc', "model/{}.h5".format(model_name), model_dir_name) + print("Downloading model complete") + last_model_download_ts = int(time.time()) + except Exception as e: + print(str(e)) + + print("Loading model {}".format(model_name)) + # Model reconstruction from JSON file + with open(os.path.join(model_dir_name, "{}.json".format(model_name)), 'r') as f: + model = model_from_json(f.read()) + + # Load weights into the new model + model.load_weights(os.path.join(model_dir_name, "{}.h5".format(model_name))) + else: + print("Using cached model") + return model + +#### INCEPTION V3 stuff +tf.app.flags.DEFINE_integer('concurrency', 1, + 'maximum number of concurrent inference requests') + +tf.app.flags.DEFINE_string('server', '', 'PredictionService host:port') +FLAGS = tf.app.flags.FLAGS +preds = {0: 'Normal', 1: 'Defect'} +GRPC_HOST_PORT = 'localhost:8500' + 
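The measure payload assembled in `send_data_to_gateway` above can be sketched in isolation. This is a minimal illustration: `build_measure_body` is a hypothetical helper, but the field names and the list-of-lists `measures` shape come directly from that function.

```python
import json

# Hypothetical helper mirroring the body built in send_data_to_gateway:
# the local IoT gateway REST endpoint expects 'measures' as a list of lists,
# one inner list per measurement.
def build_measure_body(result, sensor_alternate_id, sensor_type_alternate_id,
                       capability_alternate_id):
    body = {
        "sensorAlternateId": sensor_alternate_id,
        "sensorTypeAlternateId": sensor_type_alternate_id,
        "capabilityAlternateId": capability_alternate_id,
        "measures": [[int(result)]],  # a single measure with one value
    }
    return json.dumps(body)

print(build_measure_body(1, "ML_Test_Sensor", "67", "inf01"))
```

The resulting JSON string is what the script POSTs to `http://127.0.0.1:8699/measures/<deviceId>` with a `Content-Type: application/json` header.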
+def _callback(result_future): + """Callback invoked when the TF Serving RPC completes. + Logs the prediction derived from the returned scores. + Args: + result_future: Result future of the RPC. + """ + exception = result_future.exception() + if exception: + print(exception) + else: + # result() would re-raise on a failed RPC, so only read it after the exception check + print("\nInference scores {}".format(result_future.result().outputs['scores'])) + response = np.array( + result_future.result().outputs['scores'].float_val) + print("Prediction {}".format(preds[np.argmax(response)])) + +request, stub = None, None +def init(hostport): + global request, stub + if not request: + print("Starting tf client") + start = time.time() + channel = grpc.insecure_channel(hostport) + stub = prediction_service_pb2_grpc.PredictionServiceStub(channel) + request = predict_pb2.PredictRequest() + request.model_spec.name = 'Volvo01' + request.model_spec.signature_name = 'predict_images' + print("Initialized client in {} seconds".format(time.time() - start)) + else: + print("TF client already started") + return stub, request + +def preprocess_input(x): + x /= 255. + x -= 0.5 + x *= 2. 
+ return x + +def do_inference_inceptionv3(file_path, dst=None, hostport=GRPC_HOST_PORT): + stub, request = init(hostport) + + # Load and preprocess the spectrogram image + image_size = 299 + img = image.load_img(file_path, target_size=(image_size, image_size), color_mode='rgb') + + x = image.img_to_array(img) + x = np.expand_dims(x, axis=0) + x = preprocess_input(x) + + request.inputs['images'].CopyFrom( + tf.contrib.util.make_tensor_proto(x, shape=[1, image_size, image_size, 3])) + + start = time.time() + result_future = stub.Predict.future(request, 10.25) # 10.25-second timeout + result_future.add_done_callback(_callback) + print("Time to send is", time.time() - start) + +#### INCEPTION V3 stuff + + +do_forever() + +time.sleep(10) \ No newline at end of file diff --git a/edge-ml-welding-sound/src/edge/src/main/resources/model/audio_class.h5 b/edge-ml-welding-sound/src/edge/src/main/resources/model/audio_class.h5 new file mode 100644 index 0000000..b0fc057 Binary files /dev/null and b/edge-ml-welding-sound/src/edge/src/main/resources/model/audio_class.h5 differ diff --git a/edge-ml-welding-sound/src/edge/src/main/resources/model/audio_class.json b/edge-ml-welding-sound/src/edge/src/main/resources/model/audio_class.json new file mode 100644 index 0000000..1afa50e --- /dev/null +++ b/edge-ml-welding-sound/src/edge/src/main/resources/model/audio_class.json @@ -0,0 +1 @@ +{"class_name": "Sequential", "config": {"name": "sequential_5", "layers": [{"class_name": "Conv2D", "config": {"name": "conv2d_13", "trainable": true, "batch_input_shape": [null, 20, 44, 1], "dtype": "float32", "filters": 32, "kernel_size": [3, 3], "strides": [1, 1], "padding": "same", "data_format": "channels_last", "dilation_rate": [1, 1], "activation": "linear", "use_bias": true, "kernel_initializer": {"class_name": "VarianceScaling", "config": {"scale": 1.0, "mode": "fan_avg", "distribution": "uniform", "seed": null}}, "bias_initializer": {"class_name": "Zeros", "config": {}}, "kernel_regularizer": 
{"class_name": "L1L2", "config": {"l1": 0.0, "l2": 9.999999747378752e-05}}, "bias_regularizer": null, "activity_regularizer": null, "kernel_constraint": null, "bias_constraint": null}}, {"class_name": "Activation", "config": {"name": "activation_13", "trainable": true, "dtype": "float32", "activation": "relu"}}, {"class_name": "BatchNormalization", "config": {"name": "batch_normalization_13", "trainable": true, "dtype": "float32", "axis": -1, "momentum": 0.99, "epsilon": 0.001, "center": true, "scale": true, "beta_initializer": {"class_name": "Zeros", "config": {}}, "gamma_initializer": {"class_name": "Ones", "config": {}}, "moving_mean_initializer": {"class_name": "Zeros", "config": {}}, "moving_variance_initializer": {"class_name": "Ones", "config": {}}, "beta_regularizer": null, "gamma_regularizer": null, "beta_constraint": null, "gamma_constraint": null}}, {"class_name": "MaxPooling2D", "config": {"name": "max_pooling2d_13", "trainable": true, "dtype": "float32", "pool_size": [2, 2], "padding": "valid", "strides": [2, 2], "data_format": "channels_last"}}, {"class_name": "Dropout", "config": {"name": "dropout_13", "trainable": true, "dtype": "float32", "rate": 0.2, "noise_shape": null, "seed": null}}, {"class_name": "Conv2D", "config": {"name": "conv2d_14", "trainable": true, "batch_input_shape": [null, 20, 44, 1], "dtype": "float32", "filters": 32, "kernel_size": [3, 3], "strides": [1, 1], "padding": "same", "data_format": "channels_last", "dilation_rate": [1, 1], "activation": "linear", "use_bias": true, "kernel_initializer": {"class_name": "VarianceScaling", "config": {"scale": 1.0, "mode": "fan_avg", "distribution": "uniform", "seed": null}}, "bias_initializer": {"class_name": "Zeros", "config": {}}, "kernel_regularizer": {"class_name": "L1L2", "config": {"l1": 0.0, "l2": 9.999999747378752e-05}}, "bias_regularizer": null, "activity_regularizer": null, "kernel_constraint": null, "bias_constraint": null}}, {"class_name": "Activation", "config": {"name": 
"activation_14", "trainable": true, "dtype": "float32", "activation": "relu"}}, {"class_name": "BatchNormalization", "config": {"name": "batch_normalization_14", "trainable": true, "dtype": "float32", "axis": -1, "momentum": 0.99, "epsilon": 0.001, "center": true, "scale": true, "beta_initializer": {"class_name": "Zeros", "config": {}}, "gamma_initializer": {"class_name": "Ones", "config": {}}, "moving_mean_initializer": {"class_name": "Zeros", "config": {}}, "moving_variance_initializer": {"class_name": "Ones", "config": {}}, "beta_regularizer": null, "gamma_regularizer": null, "beta_constraint": null, "gamma_constraint": null}}, {"class_name": "MaxPooling2D", "config": {"name": "max_pooling2d_14", "trainable": true, "dtype": "float32", "pool_size": [2, 2], "padding": "valid", "strides": [2, 2], "data_format": "channels_last"}}, {"class_name": "Dropout", "config": {"name": "dropout_14", "trainable": true, "dtype": "float32", "rate": 0.2, "noise_shape": null, "seed": null}}, {"class_name": "Conv2D", "config": {"name": "conv2d_15", "trainable": true, "batch_input_shape": [null, 20, 44, 1], "dtype": "float32", "filters": 32, "kernel_size": [3, 3], "strides": [1, 1], "padding": "same", "data_format": "channels_last", "dilation_rate": [1, 1], "activation": "linear", "use_bias": true, "kernel_initializer": {"class_name": "VarianceScaling", "config": {"scale": 1.0, "mode": "fan_avg", "distribution": "uniform", "seed": null}}, "bias_initializer": {"class_name": "Zeros", "config": {}}, "kernel_regularizer": {"class_name": "L1L2", "config": {"l1": 0.0, "l2": 9.999999747378752e-05}}, "bias_regularizer": null, "activity_regularizer": null, "kernel_constraint": null, "bias_constraint": null}}, {"class_name": "Activation", "config": {"name": "activation_15", "trainable": true, "dtype": "float32", "activation": "relu"}}, {"class_name": "BatchNormalization", "config": {"name": "batch_normalization_15", "trainable": true, "dtype": "float32", "axis": -1, "momentum": 0.99, 
"epsilon": 0.001, "center": true, "scale": true, "beta_initializer": {"class_name": "Zeros", "config": {}}, "gamma_initializer": {"class_name": "Ones", "config": {}}, "moving_mean_initializer": {"class_name": "Zeros", "config": {}}, "moving_variance_initializer": {"class_name": "Ones", "config": {}}, "beta_regularizer": null, "gamma_regularizer": null, "beta_constraint": null, "gamma_constraint": null}}, {"class_name": "MaxPooling2D", "config": {"name": "max_pooling2d_15", "trainable": true, "dtype": "float32", "pool_size": [2, 2], "padding": "valid", "strides": [2, 2], "data_format": "channels_last"}}, {"class_name": "Dropout", "config": {"name": "dropout_15", "trainable": true, "dtype": "float32", "rate": 0.2, "noise_shape": null, "seed": null}}, {"class_name": "Conv2D", "config": {"name": "conv2d_16", "trainable": true, "batch_input_shape": [null, 20, 44, 1], "dtype": "float32", "filters": 32, "kernel_size": [3, 3], "strides": [1, 1], "padding": "same", "data_format": "channels_last", "dilation_rate": [1, 1], "activation": "linear", "use_bias": true, "kernel_initializer": {"class_name": "VarianceScaling", "config": {"scale": 1.0, "mode": "fan_avg", "distribution": "uniform", "seed": null}}, "bias_initializer": {"class_name": "Zeros", "config": {}}, "kernel_regularizer": {"class_name": "L1L2", "config": {"l1": 0.0, "l2": 9.999999747378752e-05}}, "bias_regularizer": null, "activity_regularizer": null, "kernel_constraint": null, "bias_constraint": null}}, {"class_name": "Activation", "config": {"name": "activation_16", "trainable": true, "dtype": "float32", "activation": "relu"}}, {"class_name": "BatchNormalization", "config": {"name": "batch_normalization_16", "trainable": true, "dtype": "float32", "axis": -1, "momentum": 0.99, "epsilon": 0.001, "center": true, "scale": true, "beta_initializer": {"class_name": "Zeros", "config": {}}, "gamma_initializer": {"class_name": "Ones", "config": {}}, "moving_mean_initializer": {"class_name": "Zeros", "config": {}}, 
"moving_variance_initializer": {"class_name": "Ones", "config": {}}, "beta_regularizer": null, "gamma_regularizer": null, "beta_constraint": null, "gamma_constraint": null}}, {"class_name": "MaxPooling2D", "config": {"name": "max_pooling2d_16", "trainable": true, "dtype": "float32", "pool_size": [2, 2], "padding": "valid", "strides": [2, 2], "data_format": "channels_last"}}, {"class_name": "Dropout", "config": {"name": "dropout_16", "trainable": true, "dtype": "float32", "rate": 0.2, "noise_shape": null, "seed": null}}, {"class_name": "Flatten", "config": {"name": "flatten_4", "trainable": true, "dtype": "float32", "data_format": "channels_last"}}, {"class_name": "Dense", "config": {"name": "dense_4", "trainable": true, "dtype": "float32", "units": 7, "activation": "softmax", "use_bias": true, "kernel_initializer": {"class_name": "VarianceScaling", "config": {"scale": 1.0, "mode": "fan_avg", "distribution": "uniform", "seed": null}}, "bias_initializer": {"class_name": "Zeros", "config": {}}, "kernel_regularizer": null, "bias_regularizer": null, "activity_regularizer": null, "kernel_constraint": null, "bias_constraint": null}}]}, "keras_version": "2.3.0", "backend": "tensorflow"} \ No newline at end of file diff --git a/predictive-pmml/LICENSE b/predictive-pmml/LICENSE new file mode 100644 index 0000000..8973bf7 --- /dev/null +++ b/predictive-pmml/LICENSE @@ -0,0 +1,41 @@ +SAP SAMPLE CODE LICENSE AGREEMENT + +Please scroll down and read the following SAP Sample Code License Agreement carefully ("Agreement"). 
By downloading, installing, or otherwise using the SAP sample code or any materials that accompany the sample code documentation (collectively, the "Sample Code"), You agree that this Agreement forms a legally binding agreement between You ("You" or "Your") and SAP SE, for and on behalf of itself and its subsidiaries and affiliates (as defined in Section 15 of the German Stock Corporation Act), and You agree to be bound by all of the terms and conditions stated in this Agreement. If You are trying to access or download the Sample Code on behalf of Your employer or as a consultant or agent of a third party (either "Your Company"), You represent and warrant that You have the authority to act on behalf of and bind Your Company to the terms of this Agreement and everywhere in this Agreement that refers to 'You' or 'Your' shall also include Your Company. If You do not agree to these terms, do not attempt to access or use the Sample Code. + +1. LICENSE: Subject to the terms of this Agreement, SAP grants You a non-exclusive, non-transferable, non-sublicensable, revocable, royalty-free, limited license to use, copy, and modify the Sample Code solely for Your internal business purposes. + +2. RESTRICTIONS: You must not use the Sample Code to: (a) impair, degrade or reduce the performance or security of any SAP products, services or related technology (collectively, "SAP Products"); (b) enable the bypassing or circumventing of SAP's license restrictions and/or provide users with access to the SAP Products to which such users are not licensed; or (c) permit mass data extraction from an SAP Product to a non-SAP Product, including use, modification, saving or other processing of such data in the non-SAP Product. 
Further, You must not: (i) provide or make the Sample Code available to any third party other than your authorized employees, contractors and agents (collectively, “Representatives”) and solely to be used by Your Representatives for Your own internal business purposes; ii) remove or modify any marks or proprietary notices from the Sample Code; iii) assign this Agreement, or any interest therein, to any third party; (iv) use any SAP name, trademark or logo without the prior written authorization of SAP; or (v) use the Sample Code to modify an SAP Product or decompile, disassemble or reverse engineer an SAP Product (except to the extent permitted by applicable law). You are responsible for any breach of the terms of this Agreement by You or Your Representatives. + +3. INTELLECTUAL PROPERTY: SAP or its licensors retain all ownership and intellectual property rights in and to the Sample Code and SAP Products. In exchange for the right to use, copy and modify the Sample Code provided under this Agreement, You covenant not to assert any intellectual property rights in or to any of Your products, services, or related technology that are based on or incorporate the Sample Code against any individual or entity in respect of any current or future SAP Products. + +4. SAP AND THIRD PARTY APIS: The Sample Code may include API (application programming interface) calls to SAP and third-party products or services. The access or use of the third-party products and services to which the API calls are directed may be subject to additional terms and conditions between you and SAP or such third parties. You (and not SAP) are solely responsible for understanding and complying with any additional terms and conditions that apply to the access or use of those APIs and/or third-party products and services. SAP does not grant You any rights in or to these APIs, products or services under this Agreement. + +5. 
FREE AND OPEN SOURCE COMPONENTS: The Sample Code may include third party free or open source components ("FOSS Components"). You may have additional rights in such FOSS Components that are provided by the third party licensors of those components. + +6. THIRD PARTY DEPENDENCIES: The Sample Code may require third party software dependencies ("Dependencies") for the use or operation of the Sample Code. These Dependencies may be identified by SAP in Maven POM files, documentation or by other means. SAP does not grant You any rights in or to such Dependencies under this Agreement. You are solely responsible for the acquisition, installation and use of such Dependencies. + +7. WARRANTY: +a) If You are located outside the US or Canada: AS THE SAMPLE CODE IS PROVIDED TO YOU FREE OF CHARGE, SAP DOES NOT GUARANTEE OR WARRANT ANY FEATURES OR QUALITIES OF THE SAMPLE CODE OR GIVE ANY UNDERTAKING WITH REGARD TO ANY OTHER QUALITY. NO SUCH WARRANTY OR UNDERTAKING SHALL BE IMPLIED BY YOU FROM ANY DESCRIPTION IN THE SAMPLE CODE OR ANY OTHER MATERIALS, COMMUNICATION OR ADVERTISEMENT. IN PARTICULAR, SAP DOES NOT WARRANT THAT THE SAMPLE CODE WILL BE AVAILABLE UNINTERRUPTED, ERROR FREE, OR PERMANENTLY AVAILABLE. ALL WARRANTY CLAIMS RESPECTING THE SAMPLE CODE ARE SUBJECT TO THE LIMITATION OF LIABILITY STIPULATED IN SECTION 8 BELOW. +b) If You are located in the US or Canada: THE SAMPLE CODE IS LICENSED TO YOU "AS IS", WITHOUT ANY WARRANTY, ESCROW, TRAINING, MAINTENANCE, OR SERVICE OBLIGATIONS WHATSOEVER ON THE PART OF SAP. SAP MAKES NO EXPRESS OR IMPLIED WARRANTIES OR CONDITIONS OF SALE OF ANY TYPE WHATSOEVER, INCLUDING BUT NOT LIMITED TO IMPLIED WARRANTIES OF MERCHANTABILITY AND OF FITNESS FOR A PARTICULAR PURPOSE. IN PARTICULAR, SAP DOES NOT WARRANT THAT THE SAMPLE CODE WILL BE AVAILABLE UNINTERRUPTED, ERROR FREE, OR PERMANENTLY AVAILABLE. 
YOU ASSUME ALL RISKS ASSOCIATED WITH THE USE OF THE SAMPLE CODE, INCLUDING WITHOUT LIMITATION RISKS RELATING TO QUALITY, AVAILABILITY, PERFORMANCE, DATA LOSS, AND UTILITY IN A PRODUCTION ENVIRONMENT. +c) For all locations: SAP DOES NOT MAKE ANY REPRESENTATIONS OR WARRANTIES IN RESPECT OF THIRD PARTY DEPENDENCIES, APIS, PRODUCTS AND SERVICES, INCLUDING BUT NOT LIMITED TO IMPLIED WARRANTIES OF MERCHANTABILITY AND OF FITNESS FOR A PARTICULAR PURPOSE. IN PARTICULAR, SAP DOES NOT WARRANT THAT THIRD-PARTY DEPENDENCIES, APIS, PRODUCTS AND SERVICES WILL BE AVAILABLE, ERROR FREE, INTEROPERABLE WITH THE SAMPLE CODE, SUITABLE FOR ANY PARTICULAR PURPOSE OR NON-INFRINGING. YOU ASSUME ALL RISKS ASSOCIATED WITH THE USE OF THIRD PARTY DEPENDENCIES, APIS, PRODUCTS AND SERVICES, INCLUDING WITHOUT LIMITATION RISKS RELATING TO QUALITY, AVAILABILITY, PERFORMANCE, DATA LOSS, UTILITY IN A PRODUCTION ENVIRONMENT, AND NON-INFRINGEMENT. IN NO EVENT WILL SAP BE LIABLE DIRECTLY OR INDIRECTLY IN RESPECT OF ANY USE OF THIRD PARTY DEPENDENCIES, APIS, PRODUCTS AND SERVICES BY YOU. + +8. LIMITATION OF LIABILITY: +a) If You are located outside the US or Canada: IRRESPECTIVE OF THE LEGAL REASONS, SAP SHALL ONLY BE LIABLE FOR DAMAGES UNDER THIS AGREEMENT IF SUCH DAMAGE (I) CAN BE CLAIMED UNDER THE GERMAN PRODUCT LIABILITY ACT OR (II) IS CAUSED BY INTENTIONAL MISCONDUCT OF SAP OR (III) CONSISTS OF PERSONAL INJURY. IN ALL OTHER CASES, NEITHER SAP NOR ITS EMPLOYEES, AGENTS AND SUBCONTRACTORS SHALL BE LIABLE FOR ANY KIND OF DAMAGE OR CLAIMS HEREUNDER. +b) If You are located in the US or Canada: IN NO EVENT SHALL SAP BE LIABLE TO YOU, YOUR COMPANY OR TO ANY THIRD PARTY FOR ANY DAMAGES IN AN AMOUNT IN EXCESS OF $100 ARISING IN CONNECTION WITH YOUR USE OF OR INABILITY TO USE THE SAMPLE CODE OR IN CONNECTION WITH SAP'S PROVISION OF OR FAILURE TO PROVIDE SERVICES PERTAINING TO THE SAMPLE CODE, OR AS A RESULT OF ANY DEFECT IN THE SAMPLE CODE. 
THIS DISCLAIMER OF LIABILITY SHALL APPLY REGARDLESS OF THE FORM OF ACTION THAT MAY BE BROUGHT AGAINST SAP, WHETHER IN CONTRACT OR TORT, INCLUDING WITHOUT LIMITATION ANY ACTION FOR NEGLIGENCE. YOUR SOLE REMEDY IN THE EVENT OF BREACH OF THIS AGREEMENT BY SAP OR FOR ANY OTHER CLAIM RELATED TO THE SAMPLE CODE SHALL BE TERMINATION OF THIS AGREEMENT. NOTWITHSTANDING ANYTHING TO THE CONTRARY HEREIN, UNDER NO CIRCUMSTANCES SHALL SAP OR ITS LICENSORS BE LIABLE TO YOU OR ANY OTHER PERSON OR ENTITY FOR ANY SPECIAL, INCIDENTAL, CONSEQUENTIAL, OR INDIRECT DAMAGES, LOSS OF GOOD WILL OR BUSINESS PROFITS, WORK STOPPAGE, DATA LOSS, COMPUTER FAILURE OR MALFUNCTION, ANY AND ALL OTHER COMMERCIAL DAMAGES OR LOSS, OR EXEMPLARY OR PUNITIVE DAMAGES. + +9. INDEMNITY: You will fully indemnify, hold harmless and defend SAP against law suits based on any claim: (a) that any of Your products, services or related technology that are based on or incorporate the Sample Code infringes or misappropriates any patent, copyright, trademark, trade secrets, or other proprietary rights of a third party, or (b) related to Your alleged violation of the terms of this Agreement. + +10. EXPORT: The Sample Code is subject to German, EU and US export control regulations. You confirm that: a) You will not use the Sample Code for, and will not allow the Sample Code to be used for, any purposes prohibited by German, EU and US law, including, without limitation, for the development, design, manufacture or production of nuclear, chemical or biological weapons of mass destruction; b) You are not located in Cuba, Iran, Sudan, Iraq, North Korea, Syria, nor any other country to which the United States has prohibited export or that has been designated by the U.S. 
Government as a "terrorist supporting" country (any, an "US Embargoed Country"); c) You are not a citizen, national or resident of, and are not under the control of, a US Embargoed Country; d) You will not download or otherwise export or re-export the Sample Code, directly or indirectly, to a US Embargoed Country nor to citizens, nationals or residents of a US Embargoed Country; e) You are not listed on the United States Department of Treasury lists of Specially Designated Nationals, Specially Designated Terrorists, and Specially Designated Narcotic Traffickers, nor listed on the United States Department of Commerce Table of Denial Orders or any other U.S. government list of prohibited or restricted parties and f) You will not download or otherwise export or re-export the Sample Code, directly or indirectly, to persons on the above-mentioned lists. + +11. SUPPORT: SAP does not offer support for the Sample Code. + +12. TERM AND TERMINATION: You may terminate this Agreement by destroying all copies of the Sample Code in Your possession or control. SAP may terminate Your license to use the Sample Code immediately if You fail to comply with any of the terms of this Agreement, or, for SAP's convenience by providing you with ten (10) days written notice of termination. In case of termination or expiration of this Agreement, You must immediately destroy all copies of the Sample Code in your possession or control. In the event Your Company is acquired (by merger, purchase of stock, assets or intellectual property or exclusive license), or You become employed, by a direct competitor of SAP, then this Agreement and all licenses granted to You in this Agreement shall immediately terminate upon the date of such acquisition or change of employment. + +13. LAW/VENUE: +a) If You are located outside the US or Canada: This Agreement is governed by and construed in accordance with the laws of Germany without reference to its conflicts of law principles. 
You and SAP agree to submit to the exclusive jurisdiction of, and venue in, the courts located in Karlsruhe, Germany in any dispute arising out of or relating to this Agreement or the Sample Code. The United Nations Convention on Contracts for the International Sale of Goods shall not apply to this Agreement. +b) If You are located in the US or Canada: This Agreement shall be governed by and construed in accordance with the laws of the State of New York, USA without reference to its conflicts of law principles. You and SAP agree to submit to the exclusive jurisdiction of, and venue in, the courts located in New York, New York, USA in any dispute arising out of or relating to this Agreement or the Sample Code. The United Nations Convention on Contracts for the International Sale of Goods shall not apply to this Agreement. + +14. MISCELLANEOUS: This Agreement is the complete agreement between the parties respecting the Sample Code. This Agreement supersedes all prior or contemporaneous agreements or representations with regards to the Sample Code. If any term of this Agreement is found to be invalid or unenforceable, the surviving provisions shall remain effective. SAP's failure to enforce any right or provisions stipulated in this Agreement will not constitute a waiver of such provision, or any other provision of this Agreement. + + +v1.0-071618 diff --git a/predictive-pmml/NOTICE b/predictive-pmml/NOTICE new file mode 100644 index 0000000..96ceaac --- /dev/null +++ b/predictive-pmml/NOTICE @@ -0,0 +1 @@ +Copyright (c) 2020 SAP SE or an SAP affiliate company. All rights reserved. 
diff --git a/predictive-pmml/README.md b/predictive-pmml/README.md
new file mode 100644
index 0000000..0f8af9c
--- /dev/null
+++ b/predictive-pmml/README.md
@@ -0,0 +1,244 @@
+# Predictive Service (PMML) Sample
+
+## Overview
+The implemented scenario is documented [here](https://blogs.sap.com/2019/11/05/implement-predictive-analytics-at-the-edge/).
+
+## Product Documentation
+
+Product documentation for SAP Edge Services is available as follows:
+
+[SAP Edge Services, cloud edition](https://help.sap.com/viewer/p/EDGE_SERVICES)
+
+### Description
+
+At a fixed interval, this sample executes a prediction based on the k-nearest neighbors (KNN) algorithm to identify whether the measured color is an expected color.
+
+The prediction values (both the individual predictions and a global validity index) are then fed back into the IoT Services Gateway Edge via REST as a different capability. This capability is then visible in the IoT Services Cockpit.
+
+### Deploying this sample
+
+This sample is packaged as an OSGi bundle. It is deployed to SAP Edge Services, cloud edition using a Custom Service defined within the Policy Service of SAP Edge Services.
+
+## Requirements
+
+The following must be installed for this sample:
+1. Java JDK 1.8 or above (https://www.java.com/en/download/)
+2. Apache Maven (https://maven.apache.org/download.cgi)
+3. Git command line tool (https://git-scm.com/downloads)
+4. SAP Edge Services (cloud or on-premise edition)
+5. Java PMML libraries
+
+### SAP Edge Services, cloud edition
+
+For the cloud edition, a working IoT Services Gateway Edge (REST) is required, with the SAP Edge Services Persistence Service installed.
+
+The following data model needs to be set up on IoT Services so that the PMML model can correctly analyze the data and send the results back into the system. To create the entries, log in to the IoT Services cockpit with the same tenant that your gateway uses.
+
+1.
Create the capabilities
+- **capabilityAlternateId:** color
+- **properties:**
+
+| Property Name | Property Type |
+|:-------------: |:-------------: |
+| R | float |
+| G | float |
+| B | float |
+---
+- **capabilityAlternateId:** color prediction
+- **properties:**
+
+| Property Name | Property Type |
+|:-------------: |:-------------: |
+| label | string |
+| neighbor1 | float |
+| neighbor2 | float |
+| neighbor3 | float |
+---
+- **capabilityAlternateId:** validity color score
+- **properties:**
+
+| Property Name | Property Type |
+|:-------------: |:-------------: |
+| index | float |
+
+2. Create the sensor type
+- **sensorType name:** color sensor type
+- **sensorTypeAlternateId:** 255
+
+3. Add all the capabilities into the **_color sensor type_** Sensor Type
+
+## Download and Installation
+
+### Download the sample app
+```shell
+git clone https://github.com/SAP/iot-edge-services-samples.git
+cd iot-edge-services-samples
+cd predictive-pmml
+```
+
+### Download the SAP Edge Services dependency bundles and add them to Maven
+
+#### SAP Edge Services Persistence Service
+
+1. Ensure that, from the Policy Service, the Persistence Service is installed on your gateway.
+2. Access the files of the device running the IoT Services Gateway Edge.
+3. `cd /gateway_folder/custombundles`
+4. Copy the file PersistenceService-3.1912.0.jar to the project root of this sample.
+   NOTE: the version number may change, in which case the version number in the pom.xml file will need to be updated.
+5. From the root directory of this sample, execute the following command:
+```shell
+mvn install:install-file -Dfile=PersistenceService-3.1912.0.jar -DgroupId=com.sap.iot.edgeservices -DartifactId=PersistenceService -Dversion=3.1912.0 -Dpackaging=jar
+```
+   NOTE: if the version number has changed, substitute 3.1912.0 in the above command with the appropriate version number as found in the filename.
+
+#### SAP Edge Services Configuration Service
+
+1.
Ensure that, from the Policy Service, the Configuration Service is installed on your gateway.
+2. Access the files of the device running the IoT Services Gateway Edge.
+3. `cd /gateway_folder/custombundles`
+4. Copy the file ConfigService-3.1912.0.jar to the project root of this sample.
+   NOTE: the version number may change, in which case the version number in the pom.xml file will need to be updated.
+5. From the root directory of this sample, execute the following command:
+```shell
+mvn install:install-file -Dfile=ConfigService-3.1912.0.jar -DgroupId=com.sap.iot.edgeservices -DartifactId=ConfigService -Dversion=3.1912.0 -Dpackaging=jar
+```
+   NOTE: if the version number has changed, substitute 3.1912.0 in the above command with the appropriate version number as found in the filename.
+
+### Customize the source
+
+You can change the PMML model and some configuration parameters dynamically.
+Open the file
+src\main\resources\defaultConfiguration.json
+then create and deploy a new configuration for this service within the Policy Service.
+In the body of the configuration, put a JSON object that contains the parameters you would like to change. The change is not incremental.
+
+#### SAP Edge Services, cloud edition
+
+By default, the sample works directly with SAP Edge Services, cloud edition, and nothing needs to be changed.
+
+#### SAP Edge Services, on-premise edition
+
+This example, with some modifications, can work with SAP Edge Services, on-premise edition. Some of these modifications are already controlled with the flag _**CLOUD_EDGE_SERVICES**_.
+
+Edit the file
+```
+src\main\java\com\sap\iot\edgeservices\predictive\sample\custom\PredictValues.java
+```
+In the definition, set CLOUD_EDGE_SERVICES = false (line 49):
+```java
+    private static final Boolean CLOUD_EDGE_SERVICES = false; // SET TO false for ON-PREMISE
+```
+
+### Compile and Package
+
+1. Open a shell / command prompt (on Windows as Administrator) and navigate to the `predictive-pmml` directory.
+2.
Edit the provided pom.xml and ensure that the version numbers of the PersistenceService and ConfigService dependencies match the jar files you installed. If they do not match, change the version numbers in the pom.xml:
+```xml
+        <dependency>
+            <groupId>com.sap.iot.edgeservices</groupId>
+            <artifactId>PersistenceService</artifactId>
+            <version>3.1912.0</version>
+            <scope>provided</scope>
+        </dependency>
+        <dependency>
+            <groupId>com.sap.iot.edgeservices</groupId>
+            <artifactId>ConfigService</artifactId>
+            <version>3.1912.0</version>
+            <scope>provided</scope>
+        </dependency>
+```
+3. Run the following command to compile and build the package:
+```shell
+mvn clean install
+```
+4. Verify that the file PredictiveModel-1.0.0.jar was created in the /target folder.
+
+### Satisfy the dependencies
+
+The following inherited dependencies must be satisfied by installing the OSGi versions of the following jar files:
+- pmml-agent-1.4.13.jar
+- org.eclipse.persistence.core-2.7.4.jar
+- org.eclipse.persistence.moxy-2.7.4.jar
+- jaxb-osgi-2.3.2.jar
+- pmml-model-1.4.13.jar
+- pmml-model-metro-1.4.13.jar
+- pmml-model-moxy-1.4.13.jar
+- commons-math3-3.6.1.jar
+- pmml-evaluator-1.4.13.jar
+
+### Deploy
+
+#### SAP Edge Services, cloud edition
+
+1. Using the SAP Edge Services Policy Service, navigate to the Services list and create a new custom service.
+2. Use "RGBSERVICE" for the event topic field (or the value you have defined at line 56 of the file src\main\java\com\sap\iot\edgeservices\predictive\sample\PredictiveModule.java).
+3. Use the file /target/PredictiveModel-1.0.0.jar.
+4. Save it.
+5. Go to the Gateways and Groups of Gateways list and search for your gateway in the list.
+6. Deploy the created custom service.
+
+### Deploy Configurations
+
+If needed, you can create and use a custom configuration for the service within the Policy Service.
The body of the configuration is a JSON object; these are the default values:
+```json
+{
+    "predictionSensorTypeAlternateId": "255",
+    "capabilityAlternateId": "color",
+    "predictionSensorAlternateId": "color sensor",
+    "predictionCapabilityAlternateId": "color prediction",
+    "predictionIndexCapabilityAlternateId": "validity color score",
+    "edgePlatformRestEndpoint": "http://localhost:8699/measures/",
+    "plantColorOutOfRangeLimit": "100",
+    "plantScalingForOutOfRange": "1.25",
+    "analysisFrequency": 10000,
+    "pmmlFileContentAsString": ""
+}
+```
+If a new configuration is uploaded, the old configuration is discarded (it is not incremental). Unspecified values are replaced with the default values.
+
+## Run
+
+### SAP Edge Services, cloud edition
+
+1. Use a supported method to send data to the IoT Services Gateway Edge. For example, send data to the SAP IoT Services Gateway Edge using a tool like Postman:
+```
+URL: http://<gateway address>:8699/measures/colordevice
+HEADERS: Content-type: application/json
+BODY: {
+    "capabilityAlternateId": "color",
+    "sensorTypeAlternateId": "255",
+    "sensorAlternateId": "color sensor",
+    "measures": [{
+        "R": "235",
+        "G": "64",
+        "B": "52"
+    }]
+}
+```
+To see that the predicted values are created correctly, read the measurements inside the other capabilities:
+
+2. Log in to the IoT Services Cockpit.
+3. Navigate to your gateway.
+4. Select the color device.
+5. Graph the results for
+```
+sensorAlternateId: color sensor
+capabilityAlternateId: color prediction
+```
+and
+```
+sensorAlternateId: color sensor
+capabilityAlternateId: validity color score
+```
+
+## How to obtain support
+
+These samples are provided on an "as-is" basis, with detailed documentation on how to use them.
+
+## Copyright and License
+
+Copyright (c) 2020 SAP SE or an SAP affiliate company. All rights reserved.
+
+License provided by [SAP SAMPLE CODE LICENSE AGREEMENT](https://github.com/SAP-samples/iot-edge-services-samples/blob/master/predictive-pmml/LICENSE)
diff --git a/predictive-pmml/pom.xml b/predictive-pmml/pom.xml
new file mode 100644
index 0000000..2c9df77
--- /dev/null
+++ b/predictive-pmml/pom.xml
@@ -0,0 +1,141 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+    <groupId>com.sap.iot.edgeservices.predictive.sample</groupId>
+    <artifactId>PredictiveModel</artifactId>
+    <version>1.0.0</version>
+    <properties>
+        <bundle.symbolicName>${project.artifactId}</bundle.symbolicName>
+        <bundle.version>${project.version}</bundle.version>
+    </properties>
+    <dependencies>
+        <dependency>
+            <groupId>com.sap.iot.edgeservices</groupId>
+            <artifactId>PersistenceService</artifactId>
+            <version>3.1912.0</version>
+            <type>bundle</type>
+            <scope>provided</scope>
+        </dependency>
+        <dependency>
+            <groupId>com.sap.iot.edgeservices</groupId>
+            <artifactId>ConfigService</artifactId>
+            <version>3.1912.0</version>
+            <type>bundle</type>
+            <scope>provided</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.osgi</groupId>
+            <artifactId>org.osgi.core</artifactId>
+            <version>4.3.1</version>
+            <scope>provided</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.osgi</groupId>
+            <artifactId>org.osgi.service.component.annotations</artifactId>
+            <version>1.3.0</version>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.logging.log4j</groupId>
+            <artifactId>log4j-osgi</artifactId>
+            <version>2.9.0</version>
+            <scope>provided</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.slf4j</groupId>
+            <artifactId>slf4j-api</artifactId>
+            <version>1.7.29</version>
+            <scope>provided</scope>
+        </dependency>
+        <dependency>
+            <groupId>commons-lang</groupId>
+            <artifactId>commons-lang</artifactId>
+            <version>2.6</version>
+            <scope>provided</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.osgi</groupId>
+            <artifactId>org.osgi.service.event</artifactId>
+            <version>1.4.0</version>
+            <scope>provided</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.commons</groupId>
+            <artifactId>commons-math3</artifactId>
+            <version>3.6.1</version>
+            <scope>compile</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.jpmml</groupId>
+            <artifactId>pmml-evaluator</artifactId>
+            <version>1.4.13</version>
+            <type>bundle</type>
+            <scope>provided</scope>
+        </dependency>
+        <dependency>
+            <groupId>commons-io</groupId>
+            <artifactId>commons-io</artifactId>
+            <version>2.2</version>
+            <scope>provided</scope>
+        </dependency>
+        <dependency>
+            <groupId>com.fasterxml.jackson.core</groupId>
+            <artifactId>jackson-databind</artifactId>
+            <version>2.10.1</version>
+        </dependency>
+    </dependencies>
+    <build>
+        <plugins>
+            <plugin>
+                <groupId>org.apache.maven.plugins</groupId>
+                <artifactId>maven-compiler-plugin</artifactId>
+                <version>3.3</version>
+                <configuration>
+                    <source>1.8</source>
+                    <target>1.8</target>
+                </configuration>
+            </plugin>
+            <plugin>
+                <groupId>org.apache.maven.plugins</groupId>
+                <artifactId>maven-jar-plugin</artifactId>
+                <configuration>
+                    <archive>
+                        <manifestFile>META-INF/MANIFEST.MF</manifestFile>
+                    </archive>
+                </configuration>
+            </plugin>
+            <plugin>
+                <groupId>org.apache.felix</groupId>
+                <artifactId>maven-bundle-plugin</artifactId>
+                <extensions>true</extensions>
+                <executions>
+                    <execution>
+                        <id>scr-metadata</id>
+                        <goals>
+                            <goal>manifest</goal>
+                        </goals>
+                        <configuration>
+                            <supportIncrementalBuild>true</supportIncrementalBuild>
+                        </configuration>
+                    </execution>
+                </executions>
+                <configuration>
+                    <unpackBundle>true</unpackBundle>
+                    <manifestLocation>META-INF</manifestLocation>
+                    <instructions>
+                        <Bundle-SymbolicName>${bundle.symbolicName}</Bundle-SymbolicName>
+                        <_dsannotations>*</_dsannotations>
+                        <_metatypeannotations>*</_metatypeannotations>
+                    </instructions>
+                </configuration>
+            </plugin>
+        </plugins>
+    </build>
+</project>
diff --git a/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/Calculation.java b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/Calculation.java
new file mode 100644
index 0000000..3f77f7b
--- /dev/null
+++ b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/Calculation.java
@@ -0,0 +1,41 @@
+/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
* + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is". + * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample; + +import com.sap.iot.edgeservices.predictive.sample.db.PersistenceClient; + +public abstract class Calculation +implements Runnable { + + protected State state = State.NOT_INITIALIZED; // bundle state + protected final PersistenceClient persistenceClient; // persistence client used by the bundle + + //////////////////// + // constructors + //////////////////// + + public Calculation(PersistenceClient persistenceClient) { + this.persistenceClient = persistenceClient; + initialize(); + } + + //////////////////// + // public abstract functions + //////////////////// + + public abstract void stopGracefully(); + + //////////////////// + // protected abstract functions + //////////////////// + + protected abstract void initialize(); + + public enum State { + NOT_INITIALIZED, + RUNNING + } + +} diff --git a/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/Engine.java b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/Engine.java new file mode 100644 index 0000000..f9415b4 --- /dev/null +++ b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/Engine.java @@ -0,0 +1,87 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is". 
+ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample; + +import java.util.concurrent.Executors; +import java.util.concurrent.ScheduledExecutorService; +import java.util.concurrent.ScheduledFuture; +import java.util.concurrent.TimeUnit; + +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +/** + * The Engine will continuously run the calculation + */ +public class Engine +extends Thread { + + private static final Logger LOGGER = LoggerFactory.getLogger(Engine.class); // logger + + //////////////////// + // class fields + //////////////////// + + private final Calculation calculation; // the parent interface to the calculation class + private final ScheduledExecutorService threadPool = Executors.newScheduledThreadPool(1); // thread scheduler for the + // calculation + private ScheduledFuture thread; // current calculation + private long calculationFrequencyMS; // scheduler frequency + + //////////////////// + // Constructors + //////////////////// + + /** + * Ctor for the engine. 
+ * + * @param calculation + * calculation object + * @param calculationFrequencyMS + * thread frequency + */ + Engine(Calculation calculation, long calculationFrequencyMS) { + LOGGER.debug("ctor - called"); + this.calculation = calculation; + this.calculationFrequencyMS = calculationFrequencyMS; + initialize(); + } + + //////////////////// + // Public methods + //////////////////// + + @Override + public void run() { + LOGGER.info("run - called"); + try { + thread = threadPool.scheduleAtFixedRate(calculation, 0, calculationFrequencyMS, TimeUnit.MILLISECONDS); + } catch (Exception e) { + LOGGER.error("Problem executing the calculation: {}", e.getMessage(), e); + } + + } + + /** + * stop the service + */ + void stopGracefully() { + LOGGER.info("stopGracefully - called"); + calculation.stopGracefully(); + thread.cancel(true); + } + + //////////////////// + // private methods + //////////////////// + + private void initialize() { + LOGGER.debug("initialize - called"); + + // the calculation class will create any tables it needs + // here is for anything the engine needs (profiling etc) + } + +} diff --git a/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/PersistenceException.java b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/PersistenceException.java new file mode 100644 index 0000000..6096ae7 --- /dev/null +++ b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/PersistenceException.java @@ -0,0 +1,63 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is". 
+ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample; + +public class PersistenceException +extends Exception { + public static final String CODE_AUTHENTICATION_INVALID_CREDENTIALS = "CODE_AUTHENTICATION_INVALID_CREDENTIALS"; // error + // code + // credentials + public static final String CODE_SINGLETON_NOT_FOUND = "CODE_SINGLETON_NOT_FOUND"; // error code missing singleton + private static final long serialVersionUID = -1230356990632230194L; // autogenerated id + private final String code; // current code + + // constructor + public PersistenceException(String code) { + super(); + this.code = code; + } + + /** + * @param message + * message of the exception + * @param cause + * cause of the exception + * @param code + * code of the exception + */ + public PersistenceException(String message, Throwable cause, String code) { + super(message, cause); + this.code = code; + } + + /** + * @param message + * message of the exception + * @param code + * code of the exception + */ + public PersistenceException(String message, String code) { + super(message); + this.code = code; + } + + /** + * @param cause + * cause of the exception + * @param code + * code of the exception + */ + public PersistenceException(Throwable cause, String code) { + super(cause); + this.code = code; + } + + /** + * @return the code + */ + public String getCode() { + return this.code; + } +} diff --git a/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/PredictiveModuleActivator.java b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/PredictiveModuleActivator.java new file mode 100644 index 0000000..03d5262 --- /dev/null +++ b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/PredictiveModuleActivator.java @@ -0,0 +1,286 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. 
All rights reserved. + * The sample is not intended for production use. Provided "as is". + * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample; + +import java.io.File; +import java.io.IOException; +import java.nio.charset.StandardCharsets; +import java.nio.file.Files; +import java.util.Dictionary; +import java.util.Hashtable; +import java.util.Optional; + +import org.apache.commons.lang.StringUtils; +import org.osgi.framework.BundleActivator; +import org.osgi.framework.BundleContext; +import org.osgi.framework.ServiceEvent; +import org.osgi.framework.ServiceListener; +import org.osgi.service.component.annotations.Activate; +import org.osgi.service.component.annotations.Component; +import org.osgi.service.component.annotations.Deactivate; +import org.osgi.service.component.annotations.Reference; +import org.osgi.service.component.annotations.ReferenceCardinality; +import org.osgi.service.component.annotations.ReferencePolicy; +import org.osgi.service.event.Event; +import org.osgi.service.event.EventConstants; +import org.osgi.service.event.EventHandler; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import com.sap.iot.edgeservices.configservice.service.IConfigStatusService; +import com.sap.iot.edgeservices.persistenceservice.service.IPersistenceService; +import com.sap.iot.edgeservices.predictive.sample.custom.PredictValues; +import com.sap.iot.edgeservices.predictive.sample.db.PersistenceClient; +import com.sap.iot.edgeservices.predictive.sample.proxies.ConfigurationFields; +import com.sap.iot.edgeservices.predictive.sample.proxies.CustomConfiguration; +import com.sap.iot.edgeservices.predictive.sample.utilities.ConfigurationHandler; + +/** + * This class is the entry point of the OSGi Bundle + * + * It also creates the Engine which is on a timer thread. The Engine is responsible for executing the Calculation. 
+ *
+ * The Calculation is an interface, where the implementing class has the actual logic that is executed. In this example,
+ * the PredictValues class implements Calculation, and is executed by the Engine on a configurable interval (10 seconds by default).
+ *
+ * The PersistenceClient provides access to the Persistence Service.
+ *
+ */
+@Component(immediate = true)
+public class PredictiveModuleActivator
+implements BundleActivator, ServiceListener, EventHandler {
+
+    private static final Logger LOGGER = LoggerFactory.getLogger(PredictiveModuleActivator.class); // logger
+    private static final String EVENT_TOPIC = "RGBSERVICE"; // The Event Admin topic to subscribe to for config
+
+    ////////////////////
+    // class fields
+    ////////////////////
+    private static volatile CustomConfiguration configuration; // Custom configuration object dynamically loaded
+    private static IConfigStatusService configStatusService; // Reference for the configuration service
+    private static IPersistenceService service; // handle to the Persistence Service which this sample depends on
+    private static BundleContext bundleContext; // bundle context of this bundle
+    private static Engine engine; // custom class that executes logic on a timer
+    private static PersistenceClient persistenceClient; // helper class to access the Persistence Service
+    private static volatile String lastSuccessfulFingerprint; // fingerprint of the configuration
+    // private IPersistenceService persistenceService; // persistence service ref
+
+    /**
+     * remove the persistence reference
+     */
+    private static void removePersistenceReference() {
+        PredictiveModuleActivator.service = null;
+    }
+
+    /**
+     * @param bundleContext
+     *            bundle context
+     */
+    private static void setBundleContext(BundleContext bundleContext) {
+        PredictiveModuleActivator.bundleContext = bundleContext;
+    }
+
+    public static String getLastSuccessfulFingerprint() {
+        return lastSuccessfulFingerprint;
+    }
+
+    /**
+     * @param lastSuccessfulFingerprint
+     *            last fingerprint
+     */
private static void setLastSuccessfulFingerprint(String lastSuccessfulFingerprint) {
+        PredictiveModuleActivator.lastSuccessfulFingerprint = lastSuccessfulFingerprint;
+    }
+
+    public static CustomConfiguration getConfiguration() {
+        return configuration;
+    }
+
+    /**
+     * @param conf
+     *            current active configuration
+     */
+    private static void setConfiguration(CustomConfiguration conf) {
+        PredictiveModuleActivator.configuration = conf;
+    }
+
+    /**
+     * initialize the persistence client
+     */
+    private static void initPersistenceService() {
+        LOGGER.debug("---- PersistenceSampleActivator.setPersistenceService");
+        LOGGER.debug("---- service = {}", PredictiveModuleActivator.service);
+        LOGGER.debug("---- context = {}", PredictiveModuleActivator.bundleContext);
+
+        try {
+            PredictiveModuleActivator.persistenceClient = new PersistenceClient(PredictiveModuleActivator.service,
+                PredictiveModuleActivator.bundleContext);
+        } catch (PersistenceException e) {
+            LOGGER.error("Could not get token for database.
Engine not started due to {}", e.getMessage(), e); + LOGGER.error("Persistence sample is not running."); + return; + } + PredictValues predictValues = new PredictValues(PredictiveModuleActivator.persistenceClient); + PredictiveModuleActivator.engine = new Engine(predictValues, configuration.getAnalysisFrequency()); + PredictiveModuleActivator.engine.start(); + } + + /** + * initialize the configuration object + */ + private static void initConfiguration() { + LOGGER.debug("Configuration is using topic: {}", EVENT_TOPIC); + // load configuration from file or use a default configuration + CustomConfiguration defaultConfig = ConfigurationHandler.loadDefaultConfiguration(); + setConfiguration(ConfigurationHandler.loadConfigurationFromDisk(defaultConfig, EVENT_TOPIC)); + setLastSuccessfulFingerprint(ConfigurationHandler.getLastFingerprint()); + // fallback to default + if (configuration == null) { + LOGGER.debug("Starting with default configuration"); + setConfiguration(defaultConfig); + setLastSuccessfulFingerprint(null); + } + } + + //////////////////// + // public methods + //////////////////// + /* + * this function is called by OSGi when the bundle loads and starts + */ + @Activate + public void start(BundleContext bundleContext) + throws Exception { + LOGGER.debug("---- PersistenceSampleActivator.start"); + Dictionary properties = new Hashtable<>(); // NOSONAR + // Register this class to listen over Event Admin for activation requests with the topic EVENT_TOPIC + properties.put(EventConstants.EVENT_TOPIC, EVENT_TOPIC); + bundleContext.registerService(EventHandler.class, this, properties); + PredictiveModuleActivator.setBundleContext(bundleContext); + // init configuration + initConfiguration(); + // init persistence + initPersistenceService(); + LOGGER.info("---- {} initialization success", this.getClass()); + + } + + /* + * (non-Javadoc) + * + * @see org.osgi.framework.BundleActivator#stop(org.osgi.framework.BundleContext) + */ + @Deactivate + public void 
stop(BundleContext context) + throws Exception { + LOGGER.debug("---- PersistenceSampleActivator.stop"); + PredictiveModuleActivator.engine.stopGracefully(); + PredictiveModuleActivator.removePersistenceReference(); + PredictiveModuleActivator.setBundleContext(null); + } + + /** + * If the Persistence Service changes (the underlying bundle swaps out the implementation) then we could reconnect + * without change. This is beyond the scope of this sample. + */ + @Override + public void serviceChanged(ServiceEvent arg0) { + LOGGER.debug("---- PersistenceSample:PersistenceSampleActivator.serviceChanged - no operation performed."); + } + + /** + * @param event + * handle the event to get new configurations + */ + @Override + public void handleEvent(Event event) { + // Check to see if the event received conforms to a config activation event + // i.e. the event contains the config file to be activated and its associated fingerprint + if (event.getProperty(ConfigurationFields.configFile.name()) instanceof File && + event.getProperty(ConfigurationFields.configFingerprint.name()) instanceof String) { + File configFile = (File) event.getProperty(ConfigurationFields.configFile.name()); + String fingerprint = (String) event.getProperty(ConfigurationFields.configFingerprint.name()); + + // Return if the sent config file has already been activated + if (!StringUtils.isEmpty(lastSuccessfulFingerprint) && lastSuccessfulFingerprint.equals(fingerprint)) { + return; + } + + getConfigStatusService().ifPresent(cfgStatusService -> { + try { + String configFileContents = new String(Files.readAllBytes(configFile.toPath()), + StandardCharsets.UTF_8); + LOGGER.info("Config File Contents:\n{}", configFileContents); + // Set the lastSuccessfulFingerprint to this config file's fingerprint if the config file was + // successfully activated + // Call the activationStatus Declarative Service with the activation result (true or false), + // fingerprint, and a status message + if 
(ConfigurationHandler.writeConfigurationToDisk(EVENT_TOPIC, configFileContents, + fingerprint) != null) { + setLastSuccessfulFingerprint(fingerprint); + cfgStatusService.activationStatus(true, fingerprint, "Activation Succeeded"); + } else { + cfgStatusService.activationStatus(false, fingerprint, "Activation Failed"); + } + } catch (IOException e) { + LOGGER.error("Cannot read config file: {}", e.getMessage(), e); + cfgStatusService.activationStatus(false, fingerprint, "Cannot read config file: " + e.getMessage()); + } + }); + } + } + + /* + * When the Persistence Service is running and available, OSGi framework will call this function passing in the + * handle to the Persistence Service. + * + * This is considered to be the start of the OSGi bundle since we are only waiting on this service before it can + * start functioning. + */ + @Reference(service = IPersistenceService.class, cardinality = ReferenceCardinality.MANDATORY, policy = ReferencePolicy.STATIC) + public synchronized void setPersistenceService(IPersistenceService serviceRef) { + PredictiveModuleActivator.service = serviceRef; + } + + /** + * If this Persistence Service shuts down, then this function will be called. Then engine will be stopped. 
+ * + * @param service + * Persistence service instance + */ + public synchronized void unsetPersistenceService(IPersistenceService service) { + LOGGER.debug("---- PersistenceSample:PersistenceSampleActivator.unsetPersistenceService"); + if (PredictiveModuleActivator.service == service) { + PredictiveModuleActivator.service = null; + PredictiveModuleActivator.engine.stopGracefully(); + } + } + + /** + * @param arg + * remove the reference for the configuration status object + */ + void unsetConfigStatusService(IConfigStatusService arg) { + if (configStatusService == arg) { + configStatusService = null; // NOSONAR + } + } + + /** + * @return configuration status object + */ + private Optional getConfigStatusService() { + return Optional.ofNullable(configStatusService); + } + + /** + * @param arg + * inject configuration status object + */ + @Reference(service = IConfigStatusService.class, cardinality = ReferenceCardinality.MANDATORY, policy = ReferencePolicy.STATIC) + void setConfigStatusService(IConfigStatusService arg) { + configStatusService = arg; // NOSONAR + } +} diff --git a/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/custom/PredictValues.java b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/custom/PredictValues.java new file mode 100644 index 0000000..35bbd3d --- /dev/null +++ b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/custom/PredictValues.java @@ -0,0 +1,376 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is". 
+ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample.custom; + +import java.io.ByteArrayInputStream; +import java.io.InputStream; +import java.nio.charset.Charset; +import java.util.HashMap; +import java.util.LinkedHashMap; +import java.util.List; +import java.util.Map; +import java.util.concurrent.atomic.AtomicReference; + +import org.apache.commons.lang.StringUtils; +import org.dmg.pmml.FieldName; +import org.dmg.pmml.PMML; +import org.jpmml.evaluator.Classification; +import org.jpmml.evaluator.Evaluator; +import org.jpmml.evaluator.EvaluatorUtil; +import org.jpmml.evaluator.FieldValue; +import org.jpmml.evaluator.InputField; +import org.jpmml.evaluator.OutputField; +import org.jpmml.evaluator.TargetField; +import org.jpmml.evaluator.ValueMap; +import org.jpmml.evaluator.nearest_neighbor.NearestNeighborModelEvaluator; +import org.jpmml.model.PMMLUtil; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import com.sap.iot.edgeservices.persistenceservice.model.PSStatementObject; +import com.sap.iot.edgeservices.persistenceservice.model.QueryInputList; +import com.sap.iot.edgeservices.predictive.sample.Calculation; +import com.sap.iot.edgeservices.predictive.sample.PredictiveModuleActivator; +import com.sap.iot.edgeservices.predictive.sample.db.PersistenceClient; +import com.sap.iot.edgeservices.predictive.sample.proxies.CustomConfiguration; +import com.sap.iot.edgeservices.predictive.sample.proxies.DataCollection; +import com.sap.iot.edgeservices.predictive.sample.proxies.MultidimensionalValue; +import com.sap.iot.edgeservices.predictive.sample.utilities.DataStreamer; + +public class PredictValues +extends Calculation { + + //////////////////// + // Static fields + //////////////////// + private static final Logger LOGGER = LoggerFactory.getLogger(PredictValues.class); // logger + private static final Boolean CLOUD_EDGE_SERVICES = true; // SET TO false for ON-PREMISE + + 
//////////////////// + // Class fields + //////////////////// + private String lastConfigurationFingerprint; // fingerprint of last configuration + private CustomConfiguration configuration; // parameters configuration + private String sensorTypeAlternateId; // the sensorTypeAltnerateId that the engine will calculate on + + private QueryData queryData; // helper object to make persistence queries + private String mostRecentQueryTime; // last query timestamp + private Evaluator evaluator; // pmml evaluator object + private PMML pmml; // pmml object + + //////////////////// + // constructors + //////////////////// + + public PredictValues(PersistenceClient persistenceClient) { + super(persistenceClient); + + LOGGER.debug("PredictValues:ctor - called"); + // propagate teh parameter + QueryData.setCloudEdgeServices(CLOUD_EDGE_SERVICES); + + // properties that control how the Calculation will be done + if (CLOUD_EDGE_SERVICES) { + // cloud edition + sensorTypeAlternateId = "*"; // only for this sensorType + } else { + // on-premise edition + sensorTypeAlternateId = "color"; // only for this Sensor Profile + } + + mostRecentQueryTime = queryData.resetQueryTime(); + } + + /** + * Load a PMML model from the file system. 
+ * + * @param file + * PMML model file + * @return PMML object + */ + private static PMML loadModel(final InputStream file) { + PMML pmml = null; + + try (InputStream in = file) { + pmml = PMMLUtil.unmarshal(in); + + } catch (Exception e) { + LOGGER.error(e.toString(), e); + } + return pmml; + } + + //////////////////// + // public methods + //////////////////// + + /** + * @param pmml + * PMML object + * @return the object to evaluate over the PMML model + */ + private static Evaluator evaluatorModel(PMML pmml) { + Evaluator evaluator = new NearestNeighborModelEvaluator(pmml); + // Performing the self-check + evaluator.verify(); + // Printing input (x1, x2, .., xn) fields + List<InputField> inputFields = evaluator.getInputFields(); + LOGGER.debug("Input fields: {}", inputFields); + + // Printing primary result (y) field(s) + List<TargetField> targetFields = evaluator.getTargetFields(); + LOGGER.debug("Target field(s): {}", targetFields); + + // Printing secondary result (e.g. probability(y), decision(y)) fields + List<OutputField> outputFields = evaluator.getOutputFields(); + LOGGER.debug("Output fields: {}", outputFields); + + return evaluator; + } + + /** + * @param evaluator + * the model evaluator + * @param inputRecord + * the data (input) + * @return the predicted value + */ + private static Map<String, Object> evaluateModel(Evaluator evaluator, Map<String, Float> inputRecord) { + // Get the list of input features the model needs for prediction. 
+ List<InputField> inputFields = evaluator.getInputFields(); + List<OutputField> outputFields = evaluator.getOutputFields(); + List<TargetField> target = evaluator.getTargetFields(); + + if (inputRecord == null || target.isEmpty()) { + return null; + } + + Map<FieldName, FieldValue> arguments = new LinkedHashMap<>(); + + // Mapping the record field-by-field from data source schema to PMML schema + for (InputField inputField : inputFields) { + FieldName inputName = inputField.getName(); + // Get the raw value + Object rawValue = inputRecord.get(inputName.getValue()); + + // Transforming an arbitrary user-supplied value to a known-good PMML value + FieldValue inputValue = inputField.prepare(rawValue); + + arguments.put(inputName, inputValue); + } + Map<FieldName, ?> results = null; + try { + // Evaluating the model with known-good arguments + results = evaluator.evaluate(arguments); + } catch (Exception e) { + LOGGER.error(e.getMessage(), e); + } + + if (results == null) { + LOGGER.warn("No result"); + return null; + } + // Decoupling results from the JPMML-Evaluator runtime environment + Map<String, ?> resultRecord = EvaluatorUtil.decodeAll(results); + + FieldName targetField = target.get(0).getFieldName(); + Classification<?> classified = (Classification<?>) results.get(targetField); + ValueMap<String, ?> distances = classified.getValues(); + + // put the distance instead of the indexes of the neighbors + Map<String, Object> predictionAndDistance = new HashMap<>(resultRecord); + for (OutputField outputField : outputFields) { + FieldName outputName = outputField.getName(); + // get the raw value + Object rawValue = resultRecord.get(outputName.getValue()); + String i; + try { + i = (String) rawValue; + // replace with the distance + predictionAndDistance.put(outputName.getValue(), distances.get(i)); + } catch (Exception e) { + LOGGER.error(e.getMessage(), e); + } + } + return predictionAndDistance; + } + + @Override + public void stopGracefully() { + LOGGER.debug("Invoked service STOP"); + } + + /** + * automatically invoked each thread run + */ + @Override + public void run() { + 
LOGGER.debug("Invoked PredictValues thread"); + + // ensure we have initialized and are in a good state + if (state != State.RUNNING) { + LOGGER.error("NOT RUNNING: PredictValues.state = {}", state); + return; + } + + // determine if any configuration changes have been sent + updateConfigurations(); + + // get the sample data grouped by device + Map<String, DataCollection<Float>> valuesByDevice = getSampleData(); + + // for each device that sent in values, evaluate the model and stream the prediction + valuesByDevice.forEach((device, values) -> { + LOGGER.debug("======================== Calculating prediction"); + + Float validPrediction = null; + // evaluate each measure + for (MultidimensionalValue<Float> rgbmap : values.getMeasures()) { + Map<String, Object> prediction = evaluateModel(evaluator, rgbmap.getMeasures()); + // build an overall index + validPrediction = checkPrediction(prediction, validPrediction, evaluator.getOutputFields()); + + // send the results back into the IoT Service engine as a different capability + DataStreamer.streamResults(CLOUD_EDGE_SERVICES, configuration.getEdgePlatformRestEndpoint(), device, + configuration.getPredictionSensorTypeAlternateId(), + configuration.getPredictionCapabilityAlternateId(), configuration.getPredictionSensorAlternateId(), + prediction); + + } + + // send the final result back into the IoT Service engine as a different capability + DataStreamer.streamResult(CLOUD_EDGE_SERVICES, configuration.getEdgePlatformRestEndpoint(), device, + configuration.getPredictionSensorTypeAlternateId(), + configuration.getPredictionIndexCapabilityAlternateId(), configuration.getPredictionSensorAlternateId(), + validPrediction); + }); + } + + /** + * @param prediction + * the predicted value + * @param validPrediction + * valid prediction index + * @param outputFields + * pmml model output fields + * @return the updated index + */ + private Float checkPrediction(Map<String, Object> prediction, Float validPrediction, List<OutputField> outputFields) { + AtomicReference<Float> distance = new AtomicReference<>(0f); + // each output field contributes to the 
overall index + outputFields.forEach(field -> { + // define an unacceptable value + float unacceptableThreshold = 2 * outputFields.size() * configuration.getPlantColorOutOfRangeLimit(); + float val = unacceptableThreshold; + try { + val = Float.parseFloat(String.valueOf(prediction.get(field.getName().getValue()))); + // check distance out of range + } catch (Exception e) { + LOGGER.debug("Unable to get predicted value (error: {})", e.getMessage(), e); + // put an unacceptable value + distance.updateAndGet(v -> v + unacceptableThreshold); + } + + if (val > configuration.getPlantColorOutOfRangeLimit()) { + // apply a malus for the aggregated index + val *= configuration.getPlantScalingForOutOfRange(); + } + float finalVal = val; + // increment the index + distance.updateAndGet(v -> v + finalVal); + }); + if (validPrediction == null) { + // return the average + return distance.get() / outputFields.size(); + } + // was not in range + if ((distance.get() / outputFields.size()) > configuration.getPlantColorOutOfRangeLimit() || + validPrediction > configuration.getPlantColorOutOfRangeLimit()) { + // return the worst value + return (distance.get() / outputFields.size()) > validPrediction ? 
(distance.get() / outputFields.size()) + : validPrediction; + } + // average of the average + return ((distance.get() / outputFields.size()) + validPrediction) / 2; + } + + //////////////////// + // private methods + //////////////////// + + // this is called by the super class, Calculation + protected void initialize() { + // init auxiliary classes + queryData = new QueryData(persistenceClient); + // first update of the configuration + updateConfigurations(); + // Load the model + InputStream defaultInputStream = getClass().getClassLoader().getResourceAsStream("knn-color-model.pmml"); + PMML pmmlUpdate = null; + // read pmml from configuration, if possible + String pmmlString = configuration.getPmmlFileContentAsString(); + if (!StringUtils.isEmpty(pmmlString)) { + LOGGER.info("Converting custom PMML configuration"); + try { + byte[] stringPmml = pmmlString.getBytes(Charset.defaultCharset()); + pmmlUpdate = loadModel(new ByteArrayInputStream(stringPmml)); + } catch (Exception e) { + LOGGER.error(e.getMessage(), e); + } + } + if (pmmlUpdate == null) { + LOGGER.info("Falling back to the default PMML model"); + try { + this.pmml = loadModel(defaultInputStream); + } catch (Exception e) { + LOGGER.error(e.getMessage(), e); + } + } else { + this.pmml = pmmlUpdate; + } + // load evaluator + this.evaluator = evaluatorModel(pmml); + LOGGER.debug("PredictValues:initialize - called"); + // if you want to store the results in a custom table, create that table here. 
+ this.state = State.RUNNING; + } + + /** + * selects the sample data from the persistence database, which is constantly being updated + * + * @return data collection per device + */ + private Map<String, DataCollection<Float>> getSampleData() { + PSStatementObject stmt; + // build sql expression to get data + String sql = queryData.getSqlForMeasureValues(); + QueryInputList args = queryData.getSqlArgsForMeasureValues(sensorTypeAlternateId, + configuration.getCapabilityAlternateId(), mostRecentQueryTime); + // update the timestamp for the next run + mostRecentQueryTime = queryData.resetQueryTime(); + + Map<String, DataCollection<Float>> valuesByDevice = new HashMap<>(); + try { + // convert measure to a structured object + stmt = persistenceClient.executeQuery(sql, args); + valuesByDevice = queryData.getValuesAsFloatMapsByDevice(stmt, sensorTypeAlternateId, + configuration.getCapabilityAlternateId()); + } catch (Exception e) { + LOGGER.error(e.getMessage(), e); + } + return valuesByDevice; + } + + // fetch configuration updates, if any + private void updateConfigurations() { + String fingerprint = PredictiveModuleActivator.getLastSuccessfulFingerprint(); + if (configuration == null || (lastConfigurationFingerprint != null && fingerprint != null && + !fingerprint.contentEquals(lastConfigurationFingerprint))) { + configuration = PredictiveModuleActivator.getConfiguration(); + lastConfigurationFingerprint = fingerprint; + } + } + +} diff --git a/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/custom/QueryData.java b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/custom/QueryData.java new file mode 100644 index 0000000..b374eb5 --- /dev/null +++ b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/custom/QueryData.java @@ -0,0 +1,254 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. 
Provided "as is". + * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample.custom; + +import java.util.ArrayList; +import java.util.HashMap; +import java.util.List; +import java.util.Map; + +import org.apache.commons.lang.StringUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import com.sap.iot.edgeservices.persistenceservice.enums.QueryInputType; +import com.sap.iot.edgeservices.persistenceservice.model.PSStatementObject; +import com.sap.iot.edgeservices.persistenceservice.model.QueryInputItem; +import com.sap.iot.edgeservices.persistenceservice.model.QueryInputList; +import com.sap.iot.edgeservices.predictive.sample.PersistenceException; +import com.sap.iot.edgeservices.predictive.sample.db.PersistenceClient; +import com.sap.iot.edgeservices.predictive.sample.proxies.DataCollection; +import com.sap.iot.edgeservices.predictive.sample.proxies.MultidimensionalValue; + +public class QueryData { + private static final Logger LOGGER = LoggerFactory.getLogger(QueryData.class); // logger + private static Boolean cloudEdgeServices; // flag on premise edition + private final PersistenceClient persistenceClient; // current persistence reference + + // constructor + QueryData(PersistenceClient persistenceClient) { + this.persistenceClient = persistenceClient; + } + + // setter + static void setCloudEdgeServices(Boolean cloudEdgeServices) { + QueryData.cloudEdgeServices = cloudEdgeServices; + } + + // reset the query time for the next query + String resetQueryTime() { + String mostRecentQueryTime = null; + try { + mostRecentQueryTime = persistenceClient + .getFirstRowFirstColumn(persistenceClient.executeQuery("SELECT NOW()")); + LOGGER.debug("new date is: {}", mostRecentQueryTime); + } catch (PersistenceException e1) { + LOGGER.error("Unable to update the time: {}", e1.getMessage(), e1); + } + return mostRecentQueryTime; + } + + /** + * @return the sql expression + */ + private String 
getSqlForMetadata() { + // NOTE: only top 1000 records are returned. If more data is expected, then + // query should be changed to use database aggregation instead of Java + String sql = "SELECT top 1000 m.PROP_ID, m.PROP_SEQ, m.TYPE_ID FROM EFPS.MEASURE_TYPE_PROPERTY m " + + " WHERE m.OBJECT_ID = ?"; + // Add PROFILE_ID only in case of OP Edition + if (!cloudEdgeServices) { + sql += " AND m.PROFILE_ID = ?"; + } + sql += " ORDER BY m.PROP_SEQ ASC"; + + LOGGER.debug("getSqlForMetadata: ============"); + LOGGER.debug(sql); + + return sql; + } + + /** + * @param profileId + * use a particular profile for the on-premise + * @param objectId + * capabilityAlternateId + * @return the parameters for the query + */ + private QueryInputList getSqlArgsForMetadata(String profileId, String objectId) { + // create list of parameters + List<QueryInputItem> items = new ArrayList<>(); + items.add(new QueryInputItem(objectId, QueryInputType.String)); + if (!cloudEdgeServices) { + items.add(new QueryInputItem(profileId, QueryInputType.String)); + } + return new QueryInputList(items); + } + + /** + * @return the sql expression + */ + String getSqlForMeasureValues() { + // NOTE: only top 1000 records are returned. If more data is expected, then + // query should be changed to use database aggregation instead of Java + String sql = "SELECT top 1000 m.DEVICE_ADDRESS, CAST(m.MEASURE_VALUE AS VARCHAR(32)) MEASURE_VALUE, " + + " m.DATE_RECEIVED FROM EFPS.MEASURE m WHERE m.OBJECT_ID = ? 
AND m.DATE_RECEIVED > ?"; + // Add PROFILE_ID only in case of OP Edition + if (!cloudEdgeServices) { + sql += " AND m.PROFILE_ID = ?"; + } + sql += " ORDER BY m.DATE_RECEIVED DESC"; + + LOGGER.debug("getSqlForMeasureValues: ============"); + LOGGER.debug(sql); + + return sql; + } + + /** + * @param profileId + * use a particular profile for the on-premise + * @param objectId + * capabilityAlternateId + * @param sinceDate + * date parameter for the query + * @return the parameters for the query + */ + QueryInputList getSqlArgsForMeasureValues(String profileId, String objectId, String sinceDate) { + // create list of parameters + List<QueryInputItem> items = new ArrayList<>(); + items.add(new QueryInputItem(objectId, QueryInputType.String)); + items.add(new QueryInputItem(sinceDate, QueryInputType.String)); + if (!cloudEdgeServices) { + items.add(new QueryInputItem(profileId, QueryInputType.String)); + } + return new QueryInputList(items); + } + + /** + * @param statementObject + * the resultset + * @param sensorTypeAlternateId + * the sensor type alternate id + * @param capabilityAlternateId + * the capability type alternate id + * @return a collection with all the measurements across all the devices + */ + Map<String, DataCollection<Float>> getValuesAsFloatMapsByDevice(PSStatementObject statementObject, + String sensorTypeAlternateId, String capabilityAlternateId) { + List<String> properties = this.getMetadata(sensorTypeAlternateId, capabilityAlternateId); + Map<String, DataCollection<Float>> valuesByDevice = new HashMap<>(); + + LOGGER.debug("getValuesAsFloatMapsByDevice start-------------"); + if (!statementObject.hasResultList()) { + // no values + LOGGER.debug("ResultSet is empty"); + return valuesByDevice; + } + // for each result convert to data + statementObject.getResultList().forEach(row -> { + // result zero is the device + String device = row.get(0).getValue().toString(); + LOGGER.debug("device = {}", device); + DataCollection<Float> valueMap = valuesByDevice.get(device); + // create an entry in the map for each device + if (valueMap == null) { 
+ valueMap = new DataCollection<>(); + valuesByDevice.put(device, valueMap); + } + // result one is the value + String values = row.get(1).getValue().toString(); + LOGGER.debug("value = {}", values); + LOGGER.debug("{}:{}", device, values); + + // result two is the timestamp + String date = row.get(2).getValue().toString(); + LOGGER.debug("date = {}", date); + + MultidimensionalValue<Float> mapValues = extractFloatProperties(values, properties); + // add all the properties in the measurement collection object + valueMap.add(mapValues); + LOGGER.debug("value added"); + }); + + LOGGER.debug("getValuesAsFloatMapsByDevice end---------------"); + return valuesByDevice; + } + + /** + * @param values + * value to be parsed + * @param properties + * properties names + * @return the collection of properties / values + */ + private MultidimensionalValue<Float> extractFloatProperties(String values, List<String> properties) { + MultidimensionalValue<Float> mapValues = new MultidimensionalValue<>(); + // Split into properties + if (StringUtils.isEmpty(values)) { + LOGGER.debug("Empty values"); + return mapValues; + } + String[] valuesArray = values.split(" "); + // guard against more values than known properties + for (int i = 0; i < valuesArray.length && i < properties.size(); i++) { + String prop = properties.get(i); + // extract as float + try { + Float f = Float.valueOf(valuesArray[i]); + mapValues.put(prop, f); + } catch (NumberFormatException nfe) { + LOGGER.debug("Unable to parse the value: {} due to {}", values, nfe.getMessage(), nfe); + } + } + return mapValues; + } + + /** + * @param statementObject + * the resultset + * @return a list of strings that are the metadata + */ + private List<String> getMetadataFromResultset(PSStatementObject statementObject) { + List<String> metadata = new ArrayList<>(); + LOGGER.debug("getMetadataFromResultset start-------------"); + if (statementObject.hasResultList()) { + statementObject.getResultList().forEach(row -> { + // value zero contains metadata + String type = row.get(0).getValue().toString(); + metadata.add(type); + LOGGER.debug("type = {}", type); + 
}); + } else { + LOGGER.debug("ResultSet is empty"); + } + + LOGGER.debug("getMetadataFromResultset end---------------"); + return metadata; + } + + /** + * @param sensorTypeAlternateId + * the sensor type alternate id + * @param capabilityAlternateId + * the capability type alternate id + * @return a list of strings that are the metadata + */ + private List<String> getMetadata(String sensorTypeAlternateId, String capabilityAlternateId) { + PSStatementObject stmt; + // build sql to get metadata + String sql = getSqlForMetadata(); + QueryInputList args = getSqlArgsForMetadata(sensorTypeAlternateId, capabilityAlternateId); + + List<String> types = new ArrayList<>(); + try { + stmt = persistenceClient.executeQuery(sql, args); + // convert raw data + types = getMetadataFromResultset(stmt); + } catch (Exception e) { + LOGGER.error("Unable to get metadata due to: {}", e.getMessage(), e); + } + return types; + } +} diff --git a/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/db/PersistenceClient.java b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/db/PersistenceClient.java new file mode 100644 index 0000000..8bd0513 --- /dev/null +++ b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/db/PersistenceClient.java @@ -0,0 +1,168 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is". 
+ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample.db; + +import java.util.List; + +import org.osgi.framework.BundleContext; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import com.sap.iot.edgeservices.persistenceservice.model.PSDataObject; +import com.sap.iot.edgeservices.persistenceservice.model.PSStatementObject; +import com.sap.iot.edgeservices.persistenceservice.model.QueryInputList; +import com.sap.iot.edgeservices.persistenceservice.service.IPersistenceService; +import com.sap.iot.edgeservices.predictive.sample.PersistenceException; + +/** + * This is a helper class that allows the Calculation classes to connect to the database. + * + */ +public class PersistenceClient { + + private static final Logger LOGGER = LoggerFactory.getLogger(PersistenceClient.class); // logger + //////////////////// + // fields + //////////////////// + + private static final String AUTH_CREDENTIALS = "G3vpZHTbKbYH^8}"; // an example password used to authenticate - + // must be securely stored + private final IPersistenceService persistenceService; + private final BundleContext bundleContext; + private final String token; + + //////////////////// + // Constructors + //////////////////// + + public PersistenceClient(IPersistenceService persistenceService, BundleContext bundleContext) + throws PersistenceException { + this.persistenceService = persistenceService; + this.bundleContext = bundleContext; + // we attempt the token access here as if this fails, then the engine should not run + this.token = this.getPersistenceAccessToken(); + } + + //////////////////// + // public methods + //////////////////// + + /** + * This function will get the token from the persistence service + * + * @return token + * @throws PersistenceException + * if the credentials are invalid, returns type: + * PersistenceException.CODE_AUTHENTICATION_INVALID_CREDENTIALS + */ + private String 
getPersistenceAccessToken() + throws PersistenceException { + if (this.token == null) { + + String newToken; + LOGGER.debug("(HIDE IN PRODUCTION) bundleCanonicalName = {}", getPersistenceUsername()); + LOGGER.debug("(HIDE IN PRODUCTION) password = {}", AUTH_CREDENTIALS); + + newToken = persistenceService.RegisterBundleForAccess(getPersistenceUsername(), + AUTH_CREDENTIALS.toCharArray()); + if (newToken == null) { + throw new PersistenceException(PersistenceException.CODE_AUTHENTICATION_INVALID_CREDENTIALS); + } + return newToken; + } else { + return this.token; + } + } + + /** + * Executes DML against the persistence service. DML (data manipulation language) covers queries, updates, and + * deletes of data. + * + * @param sql + * the query/update/delete to execute + * @param parameters + * the parameters for the query + * @return a statement Object of the result set or rows changed. + */ + public PSStatementObject executeQuery(String sql, QueryInputList parameters) { + return persistenceService.executeSQL(token, sql, parameters); + } + + /** + * Executes DML against the persistence service. DML (data manipulation language) covers queries, updates, and + * deletes of data. No variable parameters are allowed. + * + * @param sql + * the query/update/delete to execute + * @return a statement Object of the result set or rows changed. 
+ */ + public PSStatementObject executeQuery(String sql) { + return persistenceService.ExecuteSQL(token, sql); + } + + /** + * Primarily for queries that will return just a single value, this function will extract that value + * + * @param statementObject + * a PSStatementObject with a result set + * @return the string value of the first column of the first row of the result set + * @throws PersistenceException + * if there is no result set then this will throw PersistenceException.CODE_SINGLETON_NOT_FOUND + */ + public String getFirstRowFirstColumn(PSStatementObject statementObject) + throws PersistenceException { + return getValue(statementObject, 0, 0); + } + + /** + * Return a specific column and row value + * + * @param statementObject + * a PSStatementObject with a result set + * @param row + * 0-based index + * @param column + * 0-based index + * @return the string value of the column and row of the result set + * @throws PersistenceException + * if there is no result set then this will throw PersistenceException.CODE_SINGLETON_NOT_FOUND + */ + private String getValue(PSStatementObject statementObject, int row, int column) + throws PersistenceException { + if (statementObject.hasResultList() && !statementObject.getResultList().isEmpty()) { + List<List<PSDataObject>> rows = statementObject.getResultList(); + List<PSDataObject> columns = rows.get(row); + if (!columns.isEmpty()) { + return columns.get(column).getValue().toString(); + } + } + throw new PersistenceException(PersistenceException.CODE_SINGLETON_NOT_FOUND); + } + + //////////////////// + // private methods + //////////////////// + + private String getPersistenceUsername() { + LOGGER.debug(" getPersistenceUsername: bundleContext: {}", this.bundleContext); + // adding a version number so that in development registering can be done + // without a new database since currently there is no way to delete or update + // a bundle/password combo + return getBundleCanonicalName(this.bundleContext) + ".v1"; + } + + /** + * return this bundle's 
name + * + * @return a string that represent the name of the bundle + */ + private String getBundleCanonicalName(BundleContext bundleContext) { + String bundleCanonicalName = bundleContext.getBundle().getSymbolicName(); + LOGGER.info("Bundle started with name: {}", bundleCanonicalName); + return bundleCanonicalName; + } + +} diff --git a/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/ConfigurationFields.java b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/ConfigurationFields.java new file mode 100644 index 0000000..bbacf3b --- /dev/null +++ b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/ConfigurationFields.java @@ -0,0 +1,13 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is". + * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample.proxies; + +/** + * Configuration fields (policy service) + */ +public enum ConfigurationFields { + configFile, // NOSONAR + configFingerprint // NOSONAR +} diff --git a/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/CustomConfiguration.java b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/CustomConfiguration.java new file mode 100644 index 0000000..a87ce03 --- /dev/null +++ b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/CustomConfiguration.java @@ -0,0 +1,107 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is". 
+ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample.proxies; + +import org.apache.commons.lang.StringUtils; + +public class CustomConfiguration { + private String predictionSensorTypeAlternateId; // target sensortype alternate id + private String capabilityAlternateId; // received capability alternate id + private String predictionSensorAlternateId; // target sensor alternate id + private String predictionCapabilityAlternateId; // target capability alternate id + private String predictionIndexCapabilityAlternateId; // target index capability alternate id + private String edgePlatformRestEndpoint; // endpoint of the edge platform rest to ingest data + private Float plantColorOutOfRangeLimit; // maximum allowed distance; beyond it the value is an outlier + private Float plantScalingForOutOfRange; // weight for the outliers used in the computation of the indexes + private Long analysisFrequency; // frequency to invoke the persistence client to fetch measures + private String pmmlFileContentAsString; // pmml model file converted to an escaped string + + // default constructor + public CustomConfiguration() { + super(); + } + + /** + * Populate the missing objects + * + * @param defaultConfiguration + * default configuration object (no null inside) + */ + public void mergeMissingValues(CustomConfiguration defaultConfiguration) { + if (StringUtils.isEmpty(predictionSensorTypeAlternateId)) { + predictionSensorTypeAlternateId = defaultConfiguration.getPredictionSensorTypeAlternateId(); + } + if (StringUtils.isEmpty(capabilityAlternateId)) { + capabilityAlternateId = defaultConfiguration.getCapabilityAlternateId(); + } + if (StringUtils.isEmpty(predictionSensorAlternateId)) { + predictionSensorAlternateId = defaultConfiguration.getPredictionSensorAlternateId(); + } + if (StringUtils.isEmpty(predictionCapabilityAlternateId)) { + predictionCapabilityAlternateId = 
defaultConfiguration.getPredictionCapabilityAlternateId(); + } + if (StringUtils.isEmpty(predictionIndexCapabilityAlternateId)) { + predictionIndexCapabilityAlternateId = defaultConfiguration.getPredictionIndexCapabilityAlternateId(); + } + if (StringUtils.isEmpty(edgePlatformRestEndpoint)) { + edgePlatformRestEndpoint = defaultConfiguration.getEdgePlatformRestEndpoint(); + } + if (plantColorOutOfRangeLimit == null) { + plantColorOutOfRangeLimit = defaultConfiguration.getPlantColorOutOfRangeLimit(); + } + if (plantScalingForOutOfRange == null) { + plantScalingForOutOfRange = defaultConfiguration.getPlantScalingForOutOfRange(); + } + if (analysisFrequency == null) { + analysisFrequency = defaultConfiguration.getAnalysisFrequency(); + } + if (StringUtils.isEmpty(pmmlFileContentAsString)) { + pmmlFileContentAsString = defaultConfiguration.getPmmlFileContentAsString(); + } + } + + /** + * getters and setters + */ + + public Long getAnalysisFrequency() { + return analysisFrequency; + } + + public String getPredictionSensorTypeAlternateId() { + return predictionSensorTypeAlternateId; + } + + public String getCapabilityAlternateId() { + return capabilityAlternateId; + } + + public String getPredictionSensorAlternateId() { + return predictionSensorAlternateId; + } + + public String getPredictionCapabilityAlternateId() { + return predictionCapabilityAlternateId; + } + public String getPredictionIndexCapabilityAlternateId() { + return predictionIndexCapabilityAlternateId; + } + + public String getEdgePlatformRestEndpoint() { + return edgePlatformRestEndpoint; + } + + public Float getPlantColorOutOfRangeLimit() { + return plantColorOutOfRangeLimit; + } + + public Float getPlantScalingForOutOfRange() { + return plantScalingForOutOfRange; + } + + public String getPmmlFileContentAsString() { + return pmmlFileContentAsString; + } +} diff --git a/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/DataCollection.java 
b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/DataCollection.java new file mode 100644 index 0000000..5d87c78 --- /dev/null +++ b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/DataCollection.java @@ -0,0 +1,46 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is". + * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample.proxies; + +import java.util.ArrayList; +import java.util.List; + +public class DataCollection<T> { + private List<MultidimensionalValue<T>> measures = new ArrayList<>(); // list of measures + + /** + * @param index + * measure number + * @param key + * property key + * @return value + */ + public T getMeasures(int index, String key) { + if (measures == null || index < 0 || index >= measures.size()) { + return null; + } + return measures.get(index).getMeasure(key); + } + + /** + * @param mapValues + * put a new measurement into the list + */ + public void add(MultidimensionalValue<T> mapValues) { + measures.add(mapValues); + } + + /** + * getters and setters + */ + public List<MultidimensionalValue<T>> getMeasures() { + return measures; + } + + public void setMeasures(List<MultidimensionalValue<T>> measures) { + this.measures = measures; + } + +} diff --git a/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/MultidimensionalValue.java b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/MultidimensionalValue.java new file mode 100644 index 0000000..4abca75 --- /dev/null +++ b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/MultidimensionalValue.java @@ -0,0 +1,48 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. 
+ * The sample is not intended for production use. Provided "as is". + * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample.proxies; + +import java.util.HashMap; +import java.util.Map; + +import org.apache.commons.lang.StringUtils; + +public class MultidimensionalValue<T> { + private Map<String, T> measures = new HashMap<>(); // map of property -> value + + /** + * @param key + * property key + * @return the value + */ + public T getMeasure(String key) { + if (StringUtils.isEmpty(key)) { + return null; + } + return measures.get(key); + } + + /** + * put a value into the map + * + * @param prop + * property + * @param f + * value + */ + public void put(String prop, T f) { + measures.put(prop, f); + } + + // getters and setters + public Map<String, T> getMeasures() { + return measures; + } + + public void setMeasures(Map<String, T> measures) { + this.measures = measures; + } + +} diff --git a/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/utilities/ConfigurationHandler.java b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/utilities/ConfigurationHandler.java new file mode 100644 index 0000000..142e6e3 --- /dev/null +++ b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/utilities/ConfigurationHandler.java @@ -0,0 +1,190 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is".
+ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample.utilities; + +import java.io.File; +import java.io.IOException; +import java.io.InputStream; +import java.nio.charset.Charset; +import java.nio.file.Files; +import java.nio.file.Paths; + +import org.apache.commons.io.FileUtils; +import org.apache.commons.io.IOUtils; +import org.apache.commons.lang.StringUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import com.fasterxml.jackson.databind.DeserializationFeature; +import com.fasterxml.jackson.databind.ObjectMapper; +import com.sap.iot.edgeservices.predictive.sample.proxies.CustomConfiguration; + +public class ConfigurationHandler { + private static final Logger LOGGER = LoggerFactory.getLogger(ConfigurationHandler.class); // logger + private static final String BASE_PATH = "./../edgeservices/"; // base path for custom configuration, same as other + // services + private static final String UNIFORM_PATH_SEPARATOR = "/"; // linux/windows valid file separator + + private static ObjectMapper mapper = new ObjectMapper() + .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false) + .configure(DeserializationFeature.FAIL_ON_MISSING_CREATOR_PROPERTIES, false); // json object mapper + private static String lastFingerprint = null; // last used fingerprint + + // Constructors + private ConfigurationHandler() { + super(); + } + + /** + * load an existing configuration + * + * @param defaultConfiguration + * fallback configuration used to fill any missing values + * @param serviceName + * name of current service + * @return the existing configuration (if any) + */ + public static CustomConfiguration loadConfigurationFromDisk(CustomConfiguration defaultConfiguration, + String serviceName) { + if (defaultConfiguration == null) { + defaultConfiguration = loadDefaultConfiguration(); + } + // existing file paths + String jsonFile = BASE_PATH + serviceName + UNIFORM_PATH_SEPARATOR + serviceName + ".json"; + String fingerprintFile = BASE_PATH + serviceName + UNIFORM_PATH_SEPARATOR + serviceName +
"_fingerprint.txt"; + CustomConfiguration fromFile = null; + String content = null; + String fingerprint = null; + File path = new File(jsonFile); + if (!path.exists()) { + LOGGER.info("Configuration file does not exist: {}", jsonFile); + return null; + } + try { + byte[] contentBytes = Files.readAllBytes(Paths.get(jsonFile)); + content = new String(contentBytes, Charset.defaultCharset()); + } catch (IOException e) { + LOGGER.error("Unable to read configuration from file: {} due to {}", jsonFile, e.getMessage(), e); + } + // if there is no file, there is also no need to load the fingerprint + if (!StringUtils.isEmpty(content)) { + try { + byte[] contentBytes = Files.readAllBytes(Paths.get(fingerprintFile)); + fingerprint = new String(contentBytes, Charset.defaultCharset()); + } catch (IOException e) { + LOGGER.error("Unable to read configuration from file: {} due to {}", fingerprintFile, e.getMessage(), + e); + } + // convert to a POJO + fromFile = extractCustomConfiguration(content); + } + // populate missing values + if (fromFile != null) { + fromFile.mergeMissingValues(defaultConfiguration); + } else { + LOGGER.error("Unable to extract POJO configuration"); + return defaultConfiguration; + } + // set the fingerprint + lastFingerprint = fingerprint; + return fromFile; + } + + /** + * write a configuration to disk + * + * @param serviceName + * the service name + * @param content + * string with the content of the configuration + * @param fingerprint + * current fingerprint + * @return the written configuration object + */ + public static CustomConfiguration writeConfigurationToDisk(String serviceName, String content, String fingerprint) { + // convert the string to a POJO + CustomConfiguration conf = extractCustomConfiguration(content); + if (conf == null) { + // configuration not valid + return null; + } + // build the path and make the dirs + String basePath = BASE_PATH + serviceName; + File path = new File(basePath); + if (!path.exists()) { + boolean
created = path.mkdirs(); + if (!created) { + LOGGER.error("Unable to create the path tree: {}", basePath); + return null; + } + } + // write configuration to json file + try { + String filename = serviceName + ".json"; + File jsonFile = new File(basePath + UNIFORM_PATH_SEPARATOR + filename); + FileUtils.writeStringToFile(jsonFile, content, Charset.defaultCharset().name()); + } catch (IOException e) { + LOGGER.error("Unable to write the file: {}.json due to {}", serviceName, e.getMessage(), e); + return null; + } + // persist fingerprint + try { + String filename = serviceName + "_fingerprint.txt"; + File fingerprintFile = new File(basePath + UNIFORM_PATH_SEPARATOR + filename); + FileUtils.writeStringToFile(fingerprintFile, fingerprint, Charset.defaultCharset().name()); + } catch (IOException e) { + LOGGER.error("Unable to write the file: {}_fingerprint.txt due to {}", serviceName, e.getMessage(), e); + return null; + } + // set reference for the last fingerprint + lastFingerprint = fingerprint; + return conf; + } + + /** + * convert the configuration from string to object + * + * @param content + * json string of the configuration + * @return configuration object + */ + private static CustomConfiguration extractCustomConfiguration(String content) { + try { + return mapper.readValue(content, CustomConfiguration.class); + } catch (IOException e) { + LOGGER.error("Unable to read configuration {}", e.getMessage(), e); + } + return null; + } + + /** + * @return default configuration object + */ + public static CustomConfiguration loadDefaultConfiguration() { + // load default file from classloader + InputStream stream = ConfigurationHandler.class.getClassLoader() + .getResourceAsStream("defaultConfiguration.json"); + if (stream == null) { + LOGGER.error("No default configuration file"); + return null; + } + String content = null; + CustomConfiguration config = null; + try { + // convert the stream to a string + content = IOUtils.toString(stream, 
Charset.defaultCharset().name()); + } catch (IOException e) { + LOGGER.error("Unable to read configuration file {}", e.getMessage(), e); + } + // if the file is potentially valid extract the POJO + if (!StringUtils.isEmpty(content)) { + config = extractCustomConfiguration(content); + } + return config; + } + + // getters + public static String getLastFingerprint() { + return lastFingerprint; + } +} diff --git a/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/utilities/DataStreamer.java b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/utilities/DataStreamer.java new file mode 100644 index 0000000..ebaf762 --- /dev/null +++ b/predictive-pmml/src/main/java/com/sap/iot/edgeservices/predictive/sample/utilities/DataStreamer.java @@ -0,0 +1,134 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is". 
+ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample.utilities; + +import java.io.OutputStream; +import java.net.HttpURLConnection; +import java.net.URL; +import java.net.URLConnection; +import java.nio.charset.StandardCharsets; +import java.util.ArrayList; +import java.util.List; +import java.util.Map; + +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class DataStreamer { + + private static final Logger LOGGER = LoggerFactory.getLogger(DataStreamer.class); // logger + // packet format for IOTService + private static final String IOTS_PACKET_FORMAT = "{\"sensorTypeAlternateId\":\"%s\",\"capabilityAlternateId\":\"%s\",\"sensorAlternateId\":\"%s\",\"measures\":[%s]}"; + + // constructor + private DataStreamer() { + super(); + } + + // send the results back into IOTS using a different sensorType/capability for processing again + public static void streamResults(boolean isCloudEdge, String measuresUrl, String device, + String sensorTypeAlternateId, String capabilityAlternateId, String sensorAlternateId, Map<String, Float> val) { + if (!isCloudEdge) { + LOGGER.debug("On-premise version is not sending data to SAP Cloud Platform Internet of Things"); + return; + } + if (val == null || val.isEmpty()) { + LOGGER.error("No value to send"); + } else { + LOGGER.info("Sending data to streaming...
{}{}", measuresUrl, device); + // obtain a json string + String jsonVal = mapToJsonString(val); + // format the payload + String jsonPayload = String.format(IOTS_PACKET_FORMAT, sensorTypeAlternateId, capabilityAlternateId, + sensorAlternateId, jsonVal); + + LOGGER.info("Sending data: {}", jsonPayload); + byte[] byteArrayPayload = jsonPayload.getBytes(StandardCharsets.UTF_8); + int payloadLength = byteArrayPayload.length; + + try { + // create a connection to IOTS + URL url = new URL(measuresUrl + device); + URLConnection con = url.openConnection(); + HttpURLConnection http = (HttpURLConnection) con; + + // set the properties of the post + http.setRequestMethod("POST"); // PUT is another valid option + http.setDoOutput(true); + http.setFixedLengthStreamingMode(payloadLength); + http.setRequestProperty("Content-Type", "application/json; charset=UTF-8"); + + // connect and send data + http.connect(); + try (OutputStream os = http.getOutputStream()) { + os.write(byteArrayPayload); + } + } catch (Exception e) { + LOGGER.error("Could not stream transformed results back to streaming: {}", e.getMessage(), e); + } + } + } + + // send the results back into IOTS using a different sensorType/capability for processing again + public static void streamResult(boolean isCloudEdge, String measuresUrl, String device, + String sensorTypeAlternateId, String capabilityAlternateId, String sensorAlternateId, Float val) { + if (!isCloudEdge) { + LOGGER.debug("On-premise version is not sending data to SAP Cloud Platform Internet of Things"); + return; + } + if (val == null) { + LOGGER.error("No value to send"); + } else { + LOGGER.info("Sending data to streaming... 
{}{}", measuresUrl, device); + + // format the payload + String jsonPayload = String.format(IOTS_PACKET_FORMAT, sensorTypeAlternateId, capabilityAlternateId, + sensorAlternateId, "[" + val + "]"); + + LOGGER.info("Sending data: {}", jsonPayload); + byte[] byteArrayPayload = jsonPayload.getBytes(StandardCharsets.UTF_8); + int payloadLength = byteArrayPayload.length; + + try { + // create a connection to IOTS + URL url = new URL(measuresUrl + device); + URLConnection con = url.openConnection(); + HttpURLConnection http = (HttpURLConnection) con; + + // set the properties of the post + http.setRequestMethod("POST"); // PUT is another valid option + http.setDoOutput(true); + http.setFixedLengthStreamingMode(payloadLength); + http.setRequestProperty("Content-Type", "application/json; charset=UTF-8"); + + // connect and send data + http.connect(); + try (OutputStream os = http.getOutputStream()) { + os.write(byteArrayPayload); + } + } catch (Exception e) { + LOGGER.error("Could not stream transformed results back to streaming: {}", e.getMessage(), e); + } + } + } + + private static String mapToJsonString(Map<String, Float> doubles) { + String json = "{"; + List<String> tmpJson = new ArrayList<>(doubles.size()); + // convert each entry to a "key":"value" JSON fragment + for (Map.Entry<String, Float> val : doubles.entrySet()) { + String jsonEntry = "\""; + // escape embedded quotes and remove the unsupported parentheses + jsonEntry += val.getKey().replace("\"", "\\\"").replace("(", "").replace(")", ""); + jsonEntry += "\":\""; + jsonEntry += val.getValue(); + jsonEntry += "\""; + tmpJson.add(jsonEntry); + } + json += String.join(",", tmpJson); + json += "}"; + return json; + } +} diff --git a/predictive-pmml/src/main/resources/defaultConfiguration.json b/predictive-pmml/src/main/resources/defaultConfiguration.json new file mode 100644 index 0000000..f028d8d --- /dev/null +++ b/predictive-pmml/src/main/resources/defaultConfiguration.json @@ -0,0 +1,12 @@ +{ + "predictionSensorTypeAlternateId": "255", +
"capabilityAlternateId": "color", + "predictionSensorAlternateId": "color sensor", + "predictionCapabilityAlternateId": "color prediction", + "predictionIndexCapabilityAlternateId": "validity color score", + "edgePlatformRestEndpoint": "http://localhost:8699/measures/", + "plantColorOutOfRangeLimit": "100", + "plantScalingForOutOfRange": "1.25", + "analysisFrequency": 10000, + "pmmlFileContentAsString": "" +} \ No newline at end of file diff --git a/predictive-pmml/src/main/resources/knn-color-model.pmml b/predictive-pmml/src/main/resources/knn-color-model.pmml new file mode 100644 index 0000000..fc944f7 --- /dev/null +++ b/predictive-pmml/src/main/resources/knn-color-model.pmml @@ -0,0 +1,918 @@ + + +
+ + 2019-10-22T13:57:10Z +
+ + PMMLPipeline(steps=[('estimator', KNeighborsClassifier(algorithm='auto', leaf_size=30, metric='minkowski', + metric_params=None, n_jobs=None, n_neighbors=3, p=2, + weights='uniform'))]) + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + Red + 205 + 92 + 92 + + + Red + 240 + 128 + 128 + + + Red + 250 + 128 + 114 + + + Red + 233 + 150 + 122 + + + Red + 255 + 160 + 122 + + + Red + 220 + 20 + 60 + + + Red + 255 + 0 + 0 + + + Red + 178 + 34 + 34 + + + Red + 139 + 0 + 0 + + + Pink + 255 + 192 + 203 + + + Pink + 255 + 182 + 193 + + + Pink + 255 + 105 + 180 + + + Pink + 255 + 20 + 147 + + + Pink + 199 + 21 + 133 + + + Pink + 219 + 112 + 147 + + + Orange + 255 + 160 + 122 + + + Orange + 255 + 127 + 80 + + + Orange + 255 + 99 + 71 + + + Orange + 255 + 69 + 0 + + + Orange + 255 + 140 + 0 + + + Orange + 255 + 165 + 0 + + + Yellow + 255 + 215 + 0 + + + Yellow + 255 + 255 + 0 + + + Yellow + 255 + 255 + 224 + + + Yellow + 255 + 250 + 205 + + + Yellow + 250 + 250 + 210 + + + Yellow + 255 + 239 + 213 + + + Yellow + 255 + 228 + 181 + + + Yellow + 255 + 218 + 185 + + + Yellow + 238 + 232 + 170 + + + Yellow + 240 + 230 + 140 + + + Yellow + 189 + 183 + 107 + + + Purple + 230 + 230 + 250 + + + Purple + 216 + 191 + 216 + + + Purple + 221 + 160 + 221 + + + Purple + 238 + 130 + 238 + + + Purple + 218 + 112 + 214 + + + Purple + 255 + 0 + 255 + + + Purple + 255 + 0 + 255 + + + Purple + 186 + 85 + 211 + + + Purple + 147 + 112 + 219 + + + Purple + 102 + 51 + 153 + + + Purple + 138 + 43 + 226 + + + Purple + 148 + 0 + 211 + + + Purple + 153 + 50 + 204 + + + Purple + 139 + 0 + 139 + + + Purple + 128 + 0 + 128 + + + Purple + 75 + 0 + 130 + + + Purple + 106 + 90 + 205 + + + Purple + 72 + 61 + 139 + + + Purple + 123 + 104 + 238 + + + Green + 173 + 255 + 47 + + + Green + 127 + 255 + 0 + + + Green + 124 + 252 + 0 + + + Green + 0 + 255 + 0 + + + Green + 50 + 205 + 50 + + + Green + 152 + 251 + 152 + + + Green + 144 + 238 + 144 + + + Green + 0 + 250 + 154 + + + Green + 0 
+ 255 + 127 + + + Green + 60 + 179 + 113 + + + Green + 46 + 139 + 87 + + + Green + 34 + 139 + 34 + + + Green + 0 + 128 + 0 + + + Green + 0 + 100 + 0 + + + Green + 154 + 205 + 50 + + + Green + 107 + 142 + 35 + + + Green + 128 + 128 + 0 + + + Green + 85 + 107 + 47 + + + Green + 102 + 205 + 170 + + + Green + 143 + 188 + 139 + + + Green + 32 + 178 + 170 + + + Green + 0 + 139 + 139 + + + Green + 0 + 128 + 128 + + + Blue + 0 + 255 + 255 + + + Blue + 0 + 255 + 255 + + + Blue + 224 + 255 + 255 + + + Blue + 175 + 238 + 238 + + + Blue + 127 + 255 + 212 + + + Blue + 64 + 224 + 208 + + + Blue + 72 + 209 + 204 + + + Blue + 0 + 206 + 209 + + + Blue + 95 + 158 + 160 + + + Blue + 70 + 130 + 180 + + + Blue + 176 + 196 + 222 + + + Blue + 176 + 224 + 230 + + + Blue + 173 + 216 + 230 + + + Blue + 135 + 206 + 235 + + + Blue + 135 + 206 + 250 + + + Blue + 0 + 191 + 255 + + + Blue + 30 + 144 + 255 + + + Blue + 100 + 149 + 237 + + + Blue + 123 + 104 + 238 + + + Blue + 65 + 105 + 225 + + + Blue + 0 + 0 + 255 + + + Blue + 0 + 0 + 205 + + + Blue + 0 + 0 + 139 + + + Blue + 0 + 0 + 128 + + + Blue + 25 + 25 + 112 + + + Brown + 255 + 248 + 220 + + + Brown + 255 + 235 + 205 + + + Brown + 255 + 228 + 196 + + + Brown + 255 + 222 + 173 + + + Brown + 245 + 222 + 179 + + + Brown + 222 + 184 + 135 + + + Brown + 210 + 180 + 140 + + + Brown + 188 + 143 + 143 + + + Brown + 244 + 164 + 96 + + + Brown + 218 + 165 + 32 + + + Brown + 184 + 134 + 11 + + + Brown + 205 + 133 + 63 + + + Brown + 210 + 105 + 30 + + + Brown + 139 + 69 + 19 + + + Brown + 160 + 82 + 45 + + + Brown + 165 + 42 + 42 + + + Brown + 128 + 0 + 0 + + + White + 255 + 255 + 255 + + + White + 255 + 250 + 250 + + + White + 240 + 255 + 240 + + + White + 245 + 255 + 250 + + + White + 240 + 255 + 255 + + + White + 240 + 248 + 255 + + + White + 248 + 248 + 255 + + + White + 245 + 245 + 245 + + + White + 255 + 245 + 238 + + + White + 245 + 245 + 220 + + + White + 253 + 245 + 230 + + + White + 255 + 250 + 240 + + + White + 255 + 255 + 240 + + + White + 
250 + 235 + 215 + + + White + 250 + 240 + 230 + + + White + 255 + 240 + 245 + + + White + 255 + 228 + 225 + + + Gray + 220 + 220 + 220 + + + Gray + 211 + 211 + 211 + + + Gray + 192 + 192 + 192 + + + Gray + 169 + 169 + 169 + + + Gray + 128 + 128 + 128 + + + Gray + 105 + 105 + 105 + + + Gray + 119 + 136 + 153 + + + Gray + 112 + 128 + 144 + + + Gray + 47 + 79 + 79 + + + Gray + 0 + 0 + 0 + + + + + + + + + + + + +
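The PMML document above serializes a scikit-learn KNeighborsClassifier (n_neighbors=3, Minkowski metric with p=2, i.e. Euclidean distance) over labeled RGB training rows. As a rough illustration of what evaluating such a model means — this is not the sample's actual PMML evaluation code, the class name is hypothetical, and only a small subset of the training rows is used — a plain-Java 3-NN majority vote looks like:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal 3-NN color classifier sketch mirroring the logic encoded in
// knn-color-model.pmml: rank training rows by Euclidean distance to the
// query RGB triple and take a majority vote among the k nearest labels.
public class KnnColorSketch {

    // a small subset of the labeled RGB instances from the PMML inline table
    private static final Object[][] TRAINING = {
            { "Red", 255f, 0f, 0f }, { "Red", 178f, 34f, 34f }, { "Red", 139f, 0f, 0f },
            { "Green", 0f, 255f, 0f }, { "Green", 0f, 128f, 0f }, { "Green", 34f, 139f, 34f },
            { "Blue", 0f, 0f, 255f }, { "Blue", 0f, 0f, 205f }, { "Blue", 25f, 25f, 112f } };

    // squared Euclidean distance; monotone in the p=2 Minkowski metric, so the
    // neighbor ranking is identical and the square root can be skipped
    private static float distance(float r, float g, float b, Object[] row) {
        float dr = r - (Float) row[1];
        float dg = g - (Float) row[2];
        float db = b - (Float) row[3];
        return dr * dr + dg * dg + db * db;
    }

    // classify an RGB triple by majority vote among the k nearest training rows
    public static String classify(final float r, final float g, final float b, int k) {
        List<Object[]> rows = new ArrayList<>(Arrays.asList(TRAINING));
        rows.sort(Comparator.comparingDouble(row -> distance(r, g, b, row)));
        Map<String, Integer> votes = new HashMap<>();
        for (int i = 0; i < k && i < rows.size(); i++) {
            votes.merge((String) rows.get(i)[0], 1, Integer::sum);
        }
        String best = null;
        for (Map.Entry<String, Integer> e : votes.entrySet()) {
            if (best == null || e.getValue() > votes.get(best)) {
                best = e.getKey();
            }
        }
        return best;
    }

    public static void main(String[] args) {
        System.out.println(classify(250f, 10f, 10f, 3)); // lands near the Red rows
        System.out.println(classify(10f, 10f, 230f, 3)); // lands near the Blue rows
    }
}
```

The service itself presumably loads the model from the pmmlFileContentAsString configuration value and delegates evaluation to a PMML engine; the sketch only shows why nearby RGB training rows determine the predicted label.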
diff --git a/predictive-python/LICENSE b/predictive-python/LICENSE new file mode 100644 index 0000000..8973bf7 --- /dev/null +++ b/predictive-python/LICENSE @@ -0,0 +1,41 @@ +SAP SAMPLE CODE LICENSE AGREEMENT + +Please scroll down and read the following SAP Sample Code License Agreement carefully ("Agreement"). By downloading, installing, or otherwise using the SAP sample code or any materials that accompany the sample code documentation (collectively, the "Sample Code"), You agree that this Agreement forms a legally binding agreement between You ("You" or "Your") and SAP SE, for and on behalf of itself and its subsidiaries and affiliates (as defined in Section 15 of the German Stock Corporation Act), and You agree to be bound by all of the terms and conditions stated in this Agreement. If You are trying to access or download the Sample Code on behalf of Your employer or as a consultant or agent of a third party (either "Your Company"), You represent and warrant that You have the authority to act on behalf of and bind Your Company to the terms of this Agreement and everywhere in this Agreement that refers to 'You' or 'Your' shall also include Your Company. If You do not agree to these terms, do not attempt to access or use the Sample Code. + +1. LICENSE: Subject to the terms of this Agreement, SAP grants You a non-exclusive, non-transferable, non-sublicensable, revocable, royalty-free, limited license to use, copy, and modify the Sample Code solely for Your internal business purposes. + +2. 
RESTRICTIONS: You must not use the Sample Code to: (a) impair, degrade or reduce the performance or security of any SAP products, services or related technology (collectively, "SAP Products"); (b) enable the bypassing or circumventing of SAP's license restrictions and/or provide users with access to the SAP Products to which such users are not licensed; or (c) permit mass data extraction from an SAP Product to a non-SAP Product, including use, modification, saving or other processing of such data in the non-SAP Product. Further, You must not: (i) provide or make the Sample Code available to any third party other than your authorized employees, contractors and agents (collectively, “Representatives”) and solely to be used by Your Representatives for Your own internal business purposes; ii) remove or modify any marks or proprietary notices from the Sample Code; iii) assign this Agreement, or any interest therein, to any third party; (iv) use any SAP name, trademark or logo without the prior written authorization of SAP; or (v) use the Sample Code to modify an SAP Product or decompile, disassemble or reverse engineer an SAP Product (except to the extent permitted by applicable law). You are responsible for any breach of the terms of this Agreement by You or Your Representatives. + +3. INTELLECTUAL PROPERTY: SAP or its licensors retain all ownership and intellectual property rights in and to the Sample Code and SAP Products. In exchange for the right to use, copy and modify the Sample Code provided under this Agreement, You covenant not to assert any intellectual property rights in or to any of Your products, services, or related technology that are based on or incorporate the Sample Code against any individual or entity in respect of any current or future SAP Products. + +4. SAP AND THIRD PARTY APIS: The Sample Code may include API (application programming interface) calls to SAP and third-party products or services. 
The access or use of the third-party products and services to which the API calls are directed may be subject to additional terms and conditions between you and SAP or such third parties. You (and not SAP) are solely responsible for understanding and complying with any additional terms and conditions that apply to the access or use of those APIs and/or third-party products and services. SAP does not grant You any rights in or to these APIs, products or services under this Agreement. + +5. FREE AND OPEN SOURCE COMPONENTS: The Sample Code may include third party free or open source components ("FOSS Components"). You may have additional rights in such FOSS Components that are provided by the third party licensors of those components. + +6. THIRD PARTY DEPENDENCIES: The Sample Code may require third party software dependencies ("Dependencies") for the use or operation of the Sample Code. These Dependencies may be identified by SAP in Maven POM files, documentation or by other means. SAP does not grant You any rights in or to such Dependencies under this Agreement. You are solely responsible for the acquisition, installation and use of such Dependencies. + +7. WARRANTY: +a) If You are located outside the US or Canada: AS THE SAMPLE CODE IS PROVIDED TO YOU FREE OF CHARGE, SAP DOES NOT GUARANTEE OR WARRANT ANY FEATURES OR QUALITIES OF THE SAMPLE CODE OR GIVE ANY UNDERTAKING WITH REGARD TO ANY OTHER QUALITY. NO SUCH WARRANTY OR UNDERTAKING SHALL BE IMPLIED BY YOU FROM ANY DESCRIPTION IN THE SAMPLE CODE OR ANY OTHER MATERIALS, COMMUNICATION OR ADVERTISEMENT. IN PARTICULAR, SAP DOES NOT WARRANT THAT THE SAMPLE CODE WILL BE AVAILABLE UNINTERRUPTED, ERROR FREE, OR PERMANENTLY AVAILABLE. ALL WARRANTY CLAIMS RESPECTING THE SAMPLE CODE ARE SUBJECT TO THE LIMITATION OF LIABILITY STIPULATED IN SECTION 8 BELOW. 
+b) If You are located in the US or Canada: THE SAMPLE CODE IS LICENSED TO YOU "AS IS", WITHOUT ANY WARRANTY, ESCROW, TRAINING, MAINTENANCE, OR SERVICE OBLIGATIONS WHATSOEVER ON THE PART OF SAP. SAP MAKES NO EXPRESS OR IMPLIED WARRANTIES OR CONDITIONS OF SALE OF ANY TYPE WHATSOEVER, INCLUDING BUT NOT LIMITED TO IMPLIED WARRANTIES OF MERCHANTABILITY AND OF FITNESS FOR A PARTICULAR PURPOSE. IN PARTICULAR, SAP DOES NOT WARRANT THAT THE SAMPLE CODE WILL BE AVAILABLE UNINTERRUPTED, ERROR FREE, OR PERMANENTLY AVAILABLE. YOU ASSUME ALL RISKS ASSOCIATED WITH THE USE OF THE SAMPLE CODE, INCLUDING WITHOUT LIMITATION RISKS RELATING TO QUALITY, AVAILABILITY, PERFORMANCE, DATA LOSS, AND UTILITY IN A PRODUCTION ENVIRONMENT. +c) For all locations: SAP DOES NOT MAKE ANY REPRESENTATIONS OR WARRANTIES IN RESPECT OF THIRD PARTY DEPENDENCIES, APIS, PRODUCTS AND SERVICES, INCLUDING BUT NOT LIMITED TO IMPLIED WARRANTIES OF MERCHANTABILITY AND OF FITNESS FOR A PARTICULAR PURPOSE. IN PARTICULAR, SAP DOES NOT WARRANT THAT THIRD-PARTY DEPENDENCIES, APIS, PRODUCTS AND SERVICES WILL BE AVAILABLE, ERROR FREE, INTEROPERABLE WITH THE SAMPLE CODE, SUITABLE FOR ANY PARTICULAR PURPOSE OR NON-INFRINGING. YOU ASSUME ALL RISKS ASSOCIATED WITH THE USE OF THIRD PARTY DEPENDENCIES, APIS, PRODUCTS AND SERVICES, INCLUDING WITHOUT LIMITATION RISKS RELATING TO QUALITY, AVAILABILITY, PERFORMANCE, DATA LOSS, UTILITY IN A PRODUCTION ENVIRONMENT, AND NON-INFRINGEMENT. IN NO EVENT WILL SAP BE LIABLE DIRECTLY OR INDIRECTLY IN RESPECT OF ANY USE OF THIRD PARTY DEPENDENCIES, APIS, PRODUCTS AND SERVICES BY YOU. + +8. LIMITATION OF LIABILITY: +a) If You are located outside the US or Canada: IRRESPECTIVE OF THE LEGAL REASONS, SAP SHALL ONLY BE LIABLE FOR DAMAGES UNDER THIS AGREEMENT IF SUCH DAMAGE (I) CAN BE CLAIMED UNDER THE GERMAN PRODUCT LIABILITY ACT OR (II) IS CAUSED BY INTENTIONAL MISCONDUCT OF SAP OR (III) CONSISTS OF PERSONAL INJURY. 
IN ALL OTHER CASES, NEITHER SAP NOR ITS EMPLOYEES, AGENTS AND SUBCONTRACTORS SHALL BE LIABLE FOR ANY KIND OF DAMAGE OR CLAIMS HEREUNDER. +b) If You are located in the US or Canada: IN NO EVENT SHALL SAP BE LIABLE TO YOU, YOUR COMPANY OR TO ANY THIRD PARTY FOR ANY DAMAGES IN AN AMOUNT IN EXCESS OF $100 ARISING IN CONNECTION WITH YOUR USE OF OR INABILITY TO USE THE SAMPLE CODE OR IN CONNECTION WITH SAP'S PROVISION OF OR FAILURE TO PROVIDE SERVICES PERTAINING TO THE SAMPLE CODE, OR AS A RESULT OF ANY DEFECT IN THE SAMPLE CODE. THIS DISCLAIMER OF LIABILITY SHALL APPLY REGARDLESS OF THE FORM OF ACTION THAT MAY BE BROUGHT AGAINST SAP, WHETHER IN CONTRACT OR TORT, INCLUDING WITHOUT LIMITATION ANY ACTION FOR NEGLIGENCE. YOUR SOLE REMEDY IN THE EVENT OF BREACH OF THIS AGREEMENT BY SAP OR FOR ANY OTHER CLAIM RELATED TO THE SAMPLE CODE SHALL BE TERMINATION OF THIS AGREEMENT. NOTWITHSTANDING ANYTHING TO THE CONTRARY HEREIN, UNDER NO CIRCUMSTANCES SHALL SAP OR ITS LICENSORS BE LIABLE TO YOU OR ANY OTHER PERSON OR ENTITY FOR ANY SPECIAL, INCIDENTAL, CONSEQUENTIAL, OR INDIRECT DAMAGES, LOSS OF GOOD WILL OR BUSINESS PROFITS, WORK STOPPAGE, DATA LOSS, COMPUTER FAILURE OR MALFUNCTION, ANY AND ALL OTHER COMMERCIAL DAMAGES OR LOSS, OR EXEMPLARY OR PUNITIVE DAMAGES. + +9. INDEMNITY: You will fully indemnify, hold harmless and defend SAP against law suits based on any claim: (a) that any of Your products, services or related technology that are based on or incorporate the Sample Code infringes or misappropriates any patent, copyright, trademark, trade secrets, or other proprietary rights of a third party, or (b) related to Your alleged violation of the terms of this Agreement. + +10. EXPORT: The Sample Code is subject to German, EU and US export control regulations. 
You confirm that: a) You will not use the Sample Code for, and will not allow the Sample Code to be used for, any purposes prohibited by German, EU and US law, including, without limitation, for the development, design, manufacture or production of nuclear, chemical or biological weapons of mass destruction; b) You are not located in Cuba, Iran, Sudan, Iraq, North Korea, Syria, nor any other country to which the United States has prohibited export or that has been designated by the U.S. Government as a "terrorist supporting" country (any, an "US Embargoed Country"); c) You are not a citizen, national or resident of, and are not under the control of, a US Embargoed Country; d) You will not download or otherwise export or re-export the Sample Code, directly or indirectly, to a US Embargoed Country nor to citizens, nationals or residents of a US Embargoed Country; e) You are not listed on the United States Department of Treasury lists of Specially Designated Nationals, Specially Designated Terrorists, and Specially Designated Narcotic Traffickers, nor listed on the United States Department of Commerce Table of Denial Orders or any other U.S. government list of prohibited or restricted parties and f) You will not download or otherwise export or re-export the Sample Code, directly or indirectly, to persons on the above-mentioned lists. + +11. SUPPORT: SAP does not offer support for the Sample Code. + +12. TERM AND TERMINATION: You may terminate this Agreement by destroying all copies of the Sample Code in Your possession or control. SAP may terminate Your license to use the Sample Code immediately if You fail to comply with any of the terms of this Agreement, or, for SAP's convenience by providing you with ten (10) days written notice of termination. In case of termination or expiration of this Agreement, You must immediately destroy all copies of the Sample Code in your possession or control. 
In the event Your Company is acquired (by merger, purchase of stock, assets or intellectual property or exclusive license), or You become employed, by a direct competitor of SAP, then this Agreement and all licenses granted to You in this Agreement shall immediately terminate upon the date of such acquisition or change of employment. + +13. LAW/VENUE: +a) If You are located outside the US or Canada: This Agreement is governed by and construed in accordance with the laws of Germany without reference to its conflicts of law principles. You and SAP agree to submit to the exclusive jurisdiction of, and venue in, the courts located in Karlsruhe, Germany in any dispute arising out of or relating to this Agreement or the Sample Code. The United Nations Convention on Contracts for the International Sale of Goods shall not apply to this Agreement. +b) If You are located in the US or Canada: This Agreement shall be governed by and construed in accordance with the laws of the State of New York, USA without reference to its conflicts of law principles. You and SAP agree to submit to the exclusive jurisdiction of, and venue in, the courts located in New York, New York, USA in any dispute arising out of or relating to this Agreement or the Sample Code. The United Nations Convention on Contracts for the International Sale of Goods shall not apply to this Agreement. + +14. MISCELLANEOUS: This Agreement is the complete agreement between the parties respecting the Sample Code. This Agreement supersedes all prior or contemporaneous agreements or representations with regards to the Sample Code. If any term of this Agreement is found to be invalid or unenforceable, the surviving provisions shall remain effective. SAP's failure to enforce any right or provisions stipulated in this Agreement will not constitute a waiver of such provision, or any other provision of this Agreement. 
+ + +v1.0-071618 diff --git a/predictive-python/NOTICE new file mode 100644 index 0000000..d3119b2 --- /dev/null +++ b/predictive-python/NOTICE @@ -0,0 +1 @@ +Copyright (c) 2020 SAP SE or an SAP affiliate company. All rights reserved. diff --git a/predictive-python/README.md new file mode 100644 index 0000000..c975764 --- /dev/null +++ b/predictive-python/README.md @@ -0,0 +1,243 @@ +# Predictive Service (Python) Sample + +## Overview +The implemented scenario is documented [here](https://blogs.sap.com/2019/11/05/implement-predictive-analytics-at-the-edge/). + +## Product Documentation + +Product Documentation for SAP Edge Services is available as follows: + +[SAP Edge Services, cloud edition](https://help.sap.com/viewer/p/EDGE_SERVICES) + +### Description + +On a configurable interval, this sample executes a prediction based on the KNN algorithm to identify whether the measured color is an expected color. + +The prediction values (both the per-measurement predictions and a global validity index) are then fed back into the IoT Services Gateway Edge via REST as a different capability. This capability is then visible in the IoT Services Cockpit. + +### Deploying this sample + +This sample is packaged as an OSGi bundle. It is deployed to SAP Edge Services, cloud edition using a Custom Service defined within the Policy Service of SAP Edge Services. + +## Requirements + +The following must be installed for this sample: +1. Java JDK 1.8 or above (https://www.java.com/en/download/) +2. Apache Maven (https://maven.apache.org/download.cgi) +3. Git command line tool (https://git-scm.com/downloads) +4. SAP Edge Services (Cloud or On-premise edition) +5. Java ZeroMQ Libraries (https://github.com/zeromq/jeromq/releases) + +### SAP Edge Services, cloud edition + +For the cloud edition, a working IoT Services Gateway Edge (REST) is required, with the SAP Edge Services Persistence Service installed.
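The feedback into IoT Services described above is a plain REST POST of a JSON envelope. Based on the IOTS_PACKET_FORMAT template used by DataStreamer earlier in this diff and the identifiers in defaultConfiguration.json, a posted packet for the prediction capability plausibly looks like the following (the measure values themselves are illustrative):

```json
{
  "sensorTypeAlternateId": "255",
  "capabilityAlternateId": "color prediction",
  "sensorAlternateId": "color sensor",
  "measures": [{ "label": "Green", "neighbor1": "1.0", "neighbor2": "1.0", "neighbor3": "0.0" }]
}
```

The three alternate-ID fields mirror the data model created in the steps below; the measure object carries the properties of the target capability.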
+
+The following needs to be set up on IoT Services as a data model, so that the predictive module can analyze the data correctly and send the results back into the system. To create the entries, log in to the IoT Services cockpit with the same tenant that your gateway uses.
+
+1. Create the capabilities
+- **capabilityAlternateId:** color
+- **properties:**
+
+| Property Name | Property Type |
+|:-------------: |:-------------: |
+| R | float |
+| G | float |
+| B | float |
+---
+- **capabilityAlternateId:** color prediction
+- **properties:**
+
+| Property Name | Property Type |
+|:-------------: |:-------------: |
+| label | string |
+| neighbor1 | float |
+| neighbor2 | float |
+| neighbor3 | float |
+---
+- **capabilityAlternateId:** validity color score
+- **properties:**
+
+| Property Name | Property Type |
+|:-------------: |:-------------: |
+| index | float |
+
+2. Create the sensor type
+- **sensorType name:** color sensor type
+- **sensorTypeAlternateId:** 255
+
+3. Add all the capabilities into the **_color sensor type_** Sensor Type
+
+## Download and Installation
+
+### Download the sample app
+```shell
+git clone https://github.com/SAP/iot-edge-services-samples.git
+cd iot-edge-services-samples
+cd predictive-python
+```
+
+### Download the SAP Edge Services dependency bundles and add them to Maven
+
+#### SAP Edge Services Persistence Service
+
+1. Ensure that, from the Policy Service, the Persistence Service is installed on your gateway.
+2. Access the files of the device running the IoT Services Gateway Edge.
+3. cd /gateway_folder/custombundles
+4. Copy the file PersistenceService-3.1912.0.jar to the project root of this sample.
+   NOTE: the version number may change, in which case the version number in the pom.xml file will need to be updated.
+5.
From the root directory of this sample, execute the following command:
+```shell
+mvn install:install-file -Dfile=PersistenceService-3.1912.0.jar -DgroupId=com.sap.iot.edgeservices -DartifactId=PersistenceService -Dversion=3.1912.0 -Dpackaging=jar
+```
+   NOTE: if the version number has changed, substitute 3.1912.0 in the above command with the appropriate version number as found in the filename.
+
+#### SAP Edge Services Configuration Service
+
+1. Ensure that, from the Policy Service, the Configuration Service is installed on your gateway.
+2. Access the files of the device running the IoT Services Gateway Edge.
+3. cd /gateway_folder/custombundles
+4. Copy the file ConfigService-3.1912.0.jar to the project root of this sample.
+   NOTE: the version number may change, in which case the version number in the pom.xml file will need to be updated.
+5. From the root directory of this sample, execute the following command:
+```shell
+mvn install:install-file -Dfile=ConfigService-3.1912.0.jar -DgroupId=com.sap.iot.edgeservices -DartifactId=ConfigService -Dversion=3.1912.0 -Dpackaging=jar
+```
+   NOTE: if the version number has changed, substitute 3.1912.0 in the above command with the appropriate version number as found in the filename.
+
+### Customize the source
+
+You can change the PMML model and some configuration parameters dynamically. Take the file
+src\main\resources\defaultConfiguration.json
+as a reference, then create and deploy a new configuration for this service within the Policy Service. In the body of the configuration, put a JSON object containing the parameters that you would like to change. The change is not incremental.
+
+#### SAP Edge Services, cloud edition
+
+By default, the sample works directly with SAP Edge Services, cloud edition and nothing needs to be changed.
+
+#### SAP Edge Services, on-premise edition
+
+This example, with some modifications, can work with SAP Edge Services, on-premise edition.
Some of these modifications are already controlled with the flag _**CLOUD_EDGE_SERVICES**_.
+
+Edit the file
+```text
+src\main\java\com\sap\iot\edgeservices\predictive\sample\custom\PredictValues.java
+```
+In the definition, set **_CLOUD_EDGE_SERVICES = false_** (line 39):
+```java
+    private static final Boolean CLOUD_EDGE_SERVICES = false; // SET TO false for ON-PREMISE
+```
+
+### Compile and Package
+
+1. Open a shell / command prompt (on Windows, as Administrator) and navigate to the `predictive-python` directory.
+2. Edit the provided pom.xml and ensure that the version numbers of the Persistence Service and ConfigService dependencies match the jar files you installed. If they do not match, change the numbers in the pom.xml:
+```xml
+    <dependency>
+      <groupId>com.sap.iot.edgeservices</groupId>
+      <artifactId>PersistenceService</artifactId>
+      <version>3.1912.0</version>
+      <scope>provided</scope>
+    </dependency>
+    <dependency>
+      <groupId>com.sap.iot.edgeservices</groupId>
+      <artifactId>ConfigService</artifactId>
+      <version>3.1912.0</version>
+      <scope>provided</scope>
+    </dependency>
+```
+3. Run the following command to compile and build the package:
+```shell
+mvn clean install
+```
+4. Verify that the file PredictiveModel-1.0.0.jar was created in the /target folder.
+
+### Satisfy the dependencies
+
+The following inherited dependencies must be satisfied by installing the OSGi versions of the following jar files:
+- jeromq-0.5.1.jar
+- jnacl-1.0.1-SNAPSHOT.jar
+
+### Deploy
+
+#### SAP Edge Services, cloud edition
+
+1. Using the SAP Edge Services Policy Service, navigate to the Services list and create a new custom service.
+2. Use "RGBSERVICE" for the event topic field (or whatever you have defined at line 56 of the file src\main\java\com\sap\iot\edgeservices\predictive\sample\PredictiveModuleActivator.java).
+3. Use the file /target/PredictiveModel-1.0.0.jar.
+4. Save it.
+5. Go to the Gateways and Groups of Gateways list and search for your gateway.
+6. Deploy the created custom service.
+
+### Deploy Configurations
+
+If needed, you can create and use a custom configuration for the service within the Policy Service.
The body of the configuration is a JSON object; these are the default values:
+```json
+{
+    "predictionSensorTypeAlternateId": "255",
+    "capabilityAlternateId": "color",
+    "predictionSensorAlternateId": "color sensor",
+    "predictionCapabilityAlternateId": "color prediction",
+    "predictionIndexCapabilityAlternateId": "validity color score",
+    "edgePlatformRestEndpoint": "http://localhost:8699/measures/",
+    "plantColorOutOfRangeLimit": "100",
+    "plantScalingForOutOfRange": "1.25",
+    "analysisFrequency": 10000,
+    "predictionOutputFields": [
+        "neighbor(1)",
+        "neighbor(2)",
+        "neighbor(3)"
+    ],
+    "brokerConnectionAddressPort": "127.0.0.1:5555",
+    "pythonRuntimeExecutionCommand": "cmd /c python",
+    "pythonScriptPath": "./trainModel.py"
+}
+```
+If a new configuration is uploaded, the old configuration is discarded (it is not incremental). Unspecified values are replaced with the default values.
+
+## Run
+
+### SAP Edge Services, cloud edition
+
+1. Use a supported method to send data to the IoT Services Gateway Edge. For example, send data to the SAP IoT Services Gateway Edge using a tool like Postman:
+```text
+URL: http://:8699/measures/colordevice
+HEADERS: Content-type: application/json
+BODY: {
+    "capabilityAlternateId": "color",
+    "sensorTypeAlternateId": "255",
+    "sensorAlternateId": "color sensor",
+    "measures": [{
+        "R": "235",
+        "G": "64",
+        "B": "52"
+    }]
+}
+```
+To see that the predicted values are created correctly, read the measurements inside the other capabilities:
+
+2. Login to the IoT Services Cockpit
+3. Navigate to your gateway
+4. Select the color device
+5. Graph the results for
+```text
+sensorAlternateId: color sensor
+capabilityAlternateId: color prediction
+```
+and
+```text
+sensorAlternateId: color sensor
+capabilityAlternateId: validity color score
+```
+
+## How to obtain support
+
+These samples are provided on an "as-is" basis with detailed documentation on how to use them.
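+The POST request shown in the Run section can also be scripted. This is a hypothetical sketch using only the Python standard library: `GATEWAY_HOST` stands in for your gateway's address (the host is intentionally left as a placeholder above), and `build_measure` / `send_measure` are names invented for this illustration.
+
+```python
+import json
+import urllib.request
+
+# Placeholder: replace with the address of your IoT Services Gateway Edge.
+GATEWAY_HOST = "localhost"
+
+def build_measure(r, g, b):
+    """Build the measure payload for the 'color' capability, as in the Run section."""
+    return {
+        "capabilityAlternateId": "color",
+        "sensorTypeAlternateId": "255",
+        "sensorAlternateId": "color sensor",
+        "measures": [{"R": str(r), "G": str(g), "B": str(b)}],
+    }
+
+def send_measure(payload):
+    """POST the payload to the gateway's REST measures endpoint."""
+    req = urllib.request.Request(
+        "http://%s:8699/measures/colordevice" % GATEWAY_HOST,
+        data=json.dumps(payload).encode("utf-8"),
+        headers={"Content-type": "application/json"},
+    )
+    return urllib.request.urlopen(req)
+
+payload = build_measure(235, 64, 52)
+print(json.dumps(payload))
+# send_measure(payload)  # uncomment when a gateway is reachable
+```
+
+After sending a few measures, the predicted values appear under the `color prediction` and `validity color score` capabilities as described above.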
+ + +## Copyright and License + +Copyright (c) 2020 SAP SE or an SAP affiliate company. All rights reserved. + +License provided by [SAP SAMPLE CODE LICENSE AGREEMENT](https://github.com/SAP-samples/iot-edge-services-samples/blob/master/predictive-python/LICENSE) diff --git a/predictive-python/pom.xml b/predictive-python/pom.xml new file mode 100644 index 0000000..d499cb4 --- /dev/null +++ b/predictive-python/pom.xml @@ -0,0 +1,153 @@ + + + 4.0.0 + com.sap.iot.edgeservices.persistence.sample + PredictiveModel + 1.0.0 + + ${project.artifactId} + ${project.version} + + + + com.sap.iot.edgeservices + PersistenceService + 3.1912.0 + bundle + provided + + + com.sap.iot.edgeservices + ConfigService + 3.1912.0 + bundle + provided + + + org.osgi + org.osgi.core + 4.3.1 + provided + + + org.osgi + org.osgi.service.component.annotations + 1.3.0 + + + org.apache.logging.log4j + log4j-osgi + 2.9.0 + provided + + + org.slf4j + slf4j-api + 1.7.29 + provided + + + commons-lang + commons-lang + 2.6 + provided + + + org.osgi + org.osgi.service.event + 1.4.0 + provided + + + org.apache.commons + commons-math3 + 3.6.1 + compile + + + org.zeromq + jeromq + 0.5.1 + provided + + + org.zeromq + jnacl + 0.1.0 + provided + + + com.fasterxml.jackson.core + jackson-core + 2.9.7 + provided + + + com.fasterxml.jackson.core + jackson-databind + 2.9.10.3 + provided + + + commons-io + commons-io + 2.2 + provided + + + + + + org.apache.maven.plugins + maven-compiler-plugin + 3.3 + + 1.8 + 1.8 + + + + org.apache.maven.plugins + maven-jar-plugin + + + META-INF/MANIFEST.MF + + + + + + org.apache.felix + maven-bundle-plugin + true + + + + scr-metadata + + manifest + + + true + + + + + true + META-INF + + ${bundle.symbolicName} + + <_dsannotations>* + + <_metatypeannotations>* + + + + + + diff --git a/predictive-python/python model/red-colors.json b/predictive-python/python model/red-colors.json new file mode 100644 index 0000000..f1abffc --- /dev/null +++ b/predictive-python/python 
model/red-colors.json @@ -0,0 +1,74 @@ +[ + { + "label": "Red", + "data": [ + 205, + 92, + 92 + ] + }, + { + "label": "Red", + "data": [ + 240, + 128, + 128 + ] + }, + { + "label": "Red", + "data": [ + 250, + 128, + 114 + ] + }, + { + "label": "Red", + "data": [ + 233, + 150, + 122 + ] + }, + { + "label": "Red", + "data": [ + 255, + 160, + 122 + ] + }, + { + "label": "Red", + "data": [ + 220, + 20, + 60 + ] + }, + { + "label": "Red", + "data": [ + 255, + 0, + 0 + ] + }, + { + "label": "Red", + "data": [ + 178, + 34, + 34 + ] + }, + { + "label": "Red", + "data": [ + 139, + 0, + 0 + ] + } +] \ No newline at end of file diff --git a/predictive-python/python model/test.py b/predictive-python/python model/test.py new file mode 100644 index 0000000..b6908e2 --- /dev/null +++ b/predictive-python/python model/test.py @@ -0,0 +1,24 @@ +import json +import os.path +import time + +# Third-party libraries +import zmq + +context = zmq.Context() +socket = context.socket(zmq.REQ) +socket.connect("tcp://localhost:5555") + +socket.send(b"hello") + +message = socket.recv() +print(message) + +while True: + # Wait for next request from client + jsonStr = '{"measures":{"R":255.0, "G":125.0, "B":64}}' + socket.send(jsonStr.encode('ascii')) + # Do some 'work' + time.sleep(1) + message = socket.recv() + print("Received request: %s" % message) \ No newline at end of file diff --git a/predictive-python/python model/train_model.py b/predictive-python/python model/train_model.py new file mode 100644 index 0000000..a2cf9d2 --- /dev/null +++ b/predictive-python/python model/train_model.py @@ -0,0 +1,100 @@ +# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # +# Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. +# The sample is not intended for production use. Provided "as is". 
+# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
+
+'''
+This program implements a KNN algorithm to recognize the color of an input sample,
+in terms of RGB components. The model is trained on a red-scale dataset,
+so it is able to recognize which colours are red, and provides as output the Euclidean
+distances of the top 3 nearest neighbors.
+'''
+import json
+import os.path
+
+# Third-party libraries
+import sklearn.neighbors
+import zmq
+
+# This folder must contain the model data
+DATA_DIR = '.'
+TRAINING_DATASET_FILENAME = os.path.join(DATA_DIR, 'red-colors.json')
+
+def predict(array_data, labels, knn_classifier):
+    '''Predict the color name by its RGB components
+    Args:
+        array_data (list): a list of 3 integer elements: R, G, B. Each element must be in the 0..255 range.
+        labels (list): a list of labels that come from the dataset.
+        knn_classifier (object): The classifier used to make the prediction.
+    Returns:
+        A dictionary with neighbor names as keys and distances from the RGB point to each neighbor as values
+    '''
+    predicted = knn_classifier.predict(array_data)
+    print(predicted)
+    distances, indexes = knn_classifier.kneighbors(array_data)
+    print(distances)
+    print([labels[i] for i in indexes[0]])
+    print(indexes)
+    data = {}
+    data['label'] = predicted[0]
+    data['neighbor(1)'] = distances[0][0]
+    data['neighbor(2)'] = distances[0][1]
+    data['neighbor(3)'] = distances[0][2]
+    print(data)
+    return data
+
+def zmq_start_server(port):
+    '''
+    Start the ZMQ message bus at the specified port (bound to any available IP)
+
+    Args:
+        port (string): A string with the port used to bind the socket at the server side.
+ ''' + context = zmq.Context() + socket = context.socket(zmq.REP) + socket.bind("tcp://*:" + port) + return socket + +def main(): + ''' + The entry point of the predictive algorithm + ''' + f = open(TRAINING_DATASET_FILENAME) + dataset = json.load(f) + f.close() + points = [el['data'] for el in dataset] + labels = [el['label'] for el in dataset] + knn_classifier = sklearn.neighbors.KNeighborsClassifier(3) + knn_classifier.fit(points, labels) + + # Create the server + socket = zmq_start_server("5555") + # Process messages + while True: + # Wait for next request from client + message = socket.recv() + print("Received request: %s" % message) + if(message == b'hello'): + socket.send(b'hello') + continue + + # Parse Json + try: + objSample = json.loads(message.decode('utf-8')) + rgbSamples = [] + print(objSample) + rgbSamples.append([objSample['measures']['R'],objSample['measures']['G'],objSample['measures']['B']]) + # Do prediction + prediction = predict(rgbSamples, labels, knn_classifier) + + # Create json + jsonprediction = json.dumps(prediction) + print(jsonprediction) + # Send reply back to client + socket.send((jsonprediction.encode('utf-8'))) + except Exception as e: + print(e) + +if __name__ == '__main__': + # Run the main process + main() \ No newline at end of file diff --git a/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/Calculation.java b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/Calculation.java new file mode 100644 index 0000000..185f674 --- /dev/null +++ b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/Calculation.java @@ -0,0 +1,42 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is". 
+ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample; + +import com.sap.iot.edgeservices.predictive.sample.db.PersistenceClient; + +public abstract class Calculation +implements Runnable { + + protected final PersistenceClient persistenceClient; // persistence client used by the bundle + protected State state = State.NOT_INITIALIZED; // bundle state + + //////////////////// + // constructors + //////////////////// + + public Calculation(PersistenceClient persistenceClient) { + this.persistenceClient = persistenceClient; + initialize(); + } + + //////////////////// + // public abstract functions + //////////////////// + + public abstract void stopGracefully(); + + //////////////////// + // protected abstract functions + //////////////////// + + protected abstract void initialize(); + + public enum State { + NOT_INITIALIZED, + RUNNING, + ERROR + } + +} diff --git a/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/Engine.java b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/Engine.java new file mode 100644 index 0000000..24a3c03 --- /dev/null +++ b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/Engine.java @@ -0,0 +1,87 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is". 
+ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample; + +import java.util.concurrent.Executors; +import java.util.concurrent.ScheduledExecutorService; +import java.util.concurrent.ScheduledFuture; +import java.util.concurrent.TimeUnit; + +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +/** + * The Engine will continuously run the calculation + */ +public class Engine +extends Thread { + + private static final Logger LOGGER = LoggerFactory.getLogger(Engine.class); // logger + + //////////////////// + // class fields + //////////////////// + + private final Calculation calculation; // the parent interface to the calculation class + private final ScheduledExecutorService threadPool = Executors.newScheduledThreadPool(1); // thread scheduler for the + // calculation + private ScheduledFuture thread; // current calculation + private long calculationFrequencyMS; // scheduler frequency + + //////////////////// + // Constructors + //////////////////// + + /** + * Ctor for the engine. 
+ * + * @param calculation + * calculation object + * @param calculationFrequencyMS + * thread frequency + */ + Engine(Calculation calculation, long calculationFrequencyMS) { + LOGGER.debug("ctor - called"); + this.calculation = calculation; + this.calculationFrequencyMS = calculationFrequencyMS; + initialize(); + } + + //////////////////// + // Public methods + //////////////////// + + @Override + public void run() { + LOGGER.info("run - called"); + try { + thread = threadPool.scheduleAtFixedRate(calculation, 0, calculationFrequencyMS, TimeUnit.MILLISECONDS); + } catch (Exception e) { + LOGGER.error("Problem executing the calculation: {}", e.getMessage(), e); + } + + } + + /** + * stop the service + */ + void stopGracefully() { + LOGGER.info("stopGracefully - called"); + calculation.stopGracefully(); + thread.cancel(true); + } + + //////////////////// + // private methods + //////////////////// + + private void initialize() { + LOGGER.debug("initialize - called"); + + // the calculation class will create any tables it needs + // here is for anything the engine needs (profiling etc) + } + +} diff --git a/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/PersistenceException.java b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/PersistenceException.java new file mode 100644 index 0000000..6096ae7 --- /dev/null +++ b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/PersistenceException.java @@ -0,0 +1,63 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is". 
+ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample; + +public class PersistenceException +extends Exception { + public static final String CODE_AUTHENTICATION_INVALID_CREDENTIALS = "CODE_AUTHENTICATION_INVALID_CREDENTIALS"; // error + // code + // credentials + public static final String CODE_SINGLETON_NOT_FOUND = "CODE_SINGLETON_NOT_FOUND"; // error code missing singleton + private static final long serialVersionUID = -1230356990632230194L; // autogenerated id + private final String code; // current code + + // constructor + public PersistenceException(String code) { + super(); + this.code = code; + } + + /** + * @param message + * message of the exception + * @param cause + * cause of the exception + * @param code + * code of the exception + */ + public PersistenceException(String message, Throwable cause, String code) { + super(message, cause); + this.code = code; + } + + /** + * @param message + * message of the exception + * @param code + * code of the exception + */ + public PersistenceException(String message, String code) { + super(message); + this.code = code; + } + + /** + * @param cause + * cause of the exception + * @param code + * code of the exception + */ + public PersistenceException(Throwable cause, String code) { + super(cause); + this.code = code; + } + + /** + * @return the code + */ + public String getCode() { + return this.code; + } +} diff --git a/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/PredictiveModuleActivator.java b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/PredictiveModuleActivator.java new file mode 100644 index 0000000..03d5262 --- /dev/null +++ b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/PredictiveModuleActivator.java @@ -0,0 +1,286 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate 
company. All rights reserved. + * The sample is not intended for production use. Provided "as is". + * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample; + +import java.io.File; +import java.io.IOException; +import java.nio.charset.StandardCharsets; +import java.nio.file.Files; +import java.util.Dictionary; +import java.util.Hashtable; +import java.util.Optional; + +import org.apache.commons.lang.StringUtils; +import org.osgi.framework.BundleActivator; +import org.osgi.framework.BundleContext; +import org.osgi.framework.ServiceEvent; +import org.osgi.framework.ServiceListener; +import org.osgi.service.component.annotations.Activate; +import org.osgi.service.component.annotations.Component; +import org.osgi.service.component.annotations.Deactivate; +import org.osgi.service.component.annotations.Reference; +import org.osgi.service.component.annotations.ReferenceCardinality; +import org.osgi.service.component.annotations.ReferencePolicy; +import org.osgi.service.event.Event; +import org.osgi.service.event.EventConstants; +import org.osgi.service.event.EventHandler; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import com.sap.iot.edgeservices.configservice.service.IConfigStatusService; +import com.sap.iot.edgeservices.persistenceservice.service.IPersistenceService; +import com.sap.iot.edgeservices.predictive.sample.custom.PredictValues; +import com.sap.iot.edgeservices.predictive.sample.db.PersistenceClient; +import com.sap.iot.edgeservices.predictive.sample.proxies.ConfigurationFields; +import com.sap.iot.edgeservices.predictive.sample.proxies.CustomConfiguration; +import com.sap.iot.edgeservices.predictive.sample.utilities.ConfigurationHandler; + +/** + * This class is the entry point of the OSGi Bundle + * + * It also creates the Engine which is on a timer thread. The Engine is responsible for executing the Calculation. 
+ * + * The Calculation is an interface, where the implementing class has the actual logic that is executed. In this example, + * the CalculateAverages class implements Calculation, and is executed by the Engine every 5 seconds. + * + * The PersistenceClient provides access to the Persistence Service. + * + */ +@Component(immediate = true) +public class PredictiveModuleActivator +implements BundleActivator, ServiceListener, EventHandler { + + private static final Logger LOGGER = LoggerFactory.getLogger(PredictiveModuleActivator.class); // logger + private static final String EVENT_TOPIC = "RGBSERVICE"; // The Event Admin topic to subscribe to for config + + //////////////////// + // class fields + //////////////////// + private static volatile CustomConfiguration configuration; // Custom configuration object dynamically loaded + private static IConfigStatusService configStatusService; // Reference for the configuration service + private static IPersistenceService service; // handle to the Persistence Service which this sample depends on + private static BundleContext bundleContext; // bundle context of this bundle + private static Engine engine; // custom class that executes logic on a timer + private static PersistenceClient persistenceClient; // helper class to access the Persistence Service + private static volatile String lastSuccessfulFingerprint; // fingerprint of the configuration + // private IPersistenceService persistenceService; //persistence service ref + + /** + * remove persistence reference + */ + private static void removePersistenceReference() { + PredictiveModuleActivator.service = null; + } + + /** + * @param bundleContext + * bundle context + */ + private static void setBundleContext(BundleContext bundleContext) { + PredictiveModuleActivator.bundleContext = bundleContext; + } + + public static String getLastSuccessfulFingerprint() { + return lastSuccessfulFingerprint; + } + + /** + * @param lastSuccessfulFingerprint + * last fingerprint + */ + 
private static void setLastSuccessfulFingerprint(String lastSuccessfulFingerprint) { + PredictiveModuleActivator.lastSuccessfulFingerprint = lastSuccessfulFingerprint; + } + + public static CustomConfiguration getConfiguration() { + return configuration; + } + + /** + * @param conf + * currenct active configuration + */ + private static void setConfiguration(CustomConfiguration conf) { + PredictiveModuleActivator.configuration = conf; + } + + /** + * initialize the persistence client + */ + private static void initPersistenceService() { + LOGGER.debug("---- PersistenceSampleActivator.setPersistenceService"); + LOGGER.debug("---- service = {}", PredictiveModuleActivator.service); + LOGGER.debug("---- context = {}", PredictiveModuleActivator.bundleContext); + + try { + PredictiveModuleActivator.persistenceClient = new PersistenceClient(PredictiveModuleActivator.service, + PredictiveModuleActivator.bundleContext); + } catch (PersistenceException e) { + LOGGER.error("Could not get token for database. 
Engine not started due to {}", e.getMessage(), e); + LOGGER.error("Persistence sample is not running."); + return; + } + PredictValues predictValues = new PredictValues(PredictiveModuleActivator.persistenceClient); + PredictiveModuleActivator.engine = new Engine(predictValues, configuration.getAnalysisFrequency()); + PredictiveModuleActivator.engine.start(); + } + + /** + * initialize the configuration object + */ + private static void initConfiguration() { + LOGGER.debug("Configuration is using topic: {}", EVENT_TOPIC); + // load configuration from file or use a default configuration + CustomConfiguration defaultConfig = ConfigurationHandler.loadDefaultConfiguration(); + setConfiguration(ConfigurationHandler.loadConfigurationFromDisk(defaultConfig, EVENT_TOPIC)); + setLastSuccessfulFingerprint(ConfigurationHandler.getLastFingerprint()); + // fallback to default + if (configuration == null) { + LOGGER.debug("Starting with default configuration"); + setConfiguration(defaultConfig); + setLastSuccessfulFingerprint(null); + } + } + + //////////////////// + // public methods + //////////////////// + /* + * this function is called by OSGi when the bundle loads and starts + */ + @Activate + public void start(BundleContext bundleContext) + throws Exception { + LOGGER.debug("---- PersistenceSampleActivator.start"); + Dictionary properties = new Hashtable<>(); // NOSONAR + // Register this class to listen over Event Admin for activation requests with the topic EVENT_TOPIC + properties.put(EventConstants.EVENT_TOPIC, EVENT_TOPIC); + bundleContext.registerService(EventHandler.class, this, properties); + PredictiveModuleActivator.setBundleContext(bundleContext); + // init configuration + initConfiguration(); + // init persistence + initPersistenceService(); + LOGGER.info("---- {} initialization success", this.getClass()); + + } + + /* + * (non-Javadoc) + * + * @see org.osgi.framework.BundleActivator#stop(org.osgi.framework.BundleContext) + */ + @Deactivate + public void 
stop(BundleContext context) + throws Exception { + LOGGER.debug("---- PersistenceSampleActivator.stop"); + PredictiveModuleActivator.engine.stopGracefully(); + PredictiveModuleActivator.removePersistenceReference(); + PredictiveModuleActivator.setBundleContext(null); + } + + /** + * If the Persistence Service changes (the underlying bundle swaps out the implementation) then we could reconnect + * without change. This is beyond the scope of this sample. + */ + @Override + public void serviceChanged(ServiceEvent arg0) { + LOGGER.debug("---- PersistenceSample:PersistenceSampleActivator.serviceChanged - no operation performed."); + } + + /** + * @param event + * handle the event to get new configurations + */ + @Override + public void handleEvent(Event event) { + // Check to see if the event received conforms to a config activation event + // i.e. the event contains the config file to be activated and its associated fingerprint + if (event.getProperty(ConfigurationFields.configFile.name()) instanceof File && + event.getProperty(ConfigurationFields.configFingerprint.name()) instanceof String) { + File configFile = (File) event.getProperty(ConfigurationFields.configFile.name()); + String fingerprint = (String) event.getProperty(ConfigurationFields.configFingerprint.name()); + + // Return if the sent config file has already been activated + if (!StringUtils.isEmpty(lastSuccessfulFingerprint) && lastSuccessfulFingerprint.equals(fingerprint)) { + return; + } + + getConfigStatusService().ifPresent(cfgStatusService -> { + try { + String configFileContents = new String(Files.readAllBytes(configFile.toPath()), + StandardCharsets.UTF_8); + LOGGER.info("Config File Contents:\n{}", configFileContents); + // Set the lastSuccessfulFingerprint to this config file's fingerprint if the config file was + // successfully activated + // Call the activationStatus Declarative Service with the activation result (true or false), + // fingerprint, and a status message + if 
(ConfigurationHandler.writeConfigurationToDisk(EVENT_TOPIC, configFileContents, + fingerprint) != null) { + setLastSuccessfulFingerprint(fingerprint); + cfgStatusService.activationStatus(true, fingerprint, "Activation Succeeded"); + } else { + cfgStatusService.activationStatus(false, fingerprint, "Activation Failed"); + } + } catch (IOException e) { + LOGGER.error("Cannot read config file: {}", e.getMessage(), e); + cfgStatusService.activationStatus(false, fingerprint, "Cannot read config file: " + e.getMessage()); + } + }); + } + } + + /* + * When the Persistence Service is running and available, OSGi framework will call this function passing in the + * handle to the Persistence Service. + * + * This is considered to be the start of the OSGi bundle since we are only waiting on this service before it can + * start functioning. + */ + @Reference(service = IPersistenceService.class, cardinality = ReferenceCardinality.MANDATORY, policy = ReferencePolicy.STATIC) + public synchronized void setPersistenceService(IPersistenceService serviceRef) { + PredictiveModuleActivator.service = serviceRef; + } + + /** + * If this Persistence Service shuts down, then this function will be called. Then engine will be stopped. 
+ * + * @param service + * Persistence service instance + */ + public synchronized void unsetPersistenceService(IPersistenceService service) { + LOGGER.debug("---- PersistenceSample:PersistenceSampleActivator.unsetPersistenceService"); + if (PredictiveModuleActivator.service == service) { + PredictiveModuleActivator.service = null; + PredictiveModuleActivator.engine.stopGracefully(); + } + } + + /** + * @param arg + * remove the reference for the configuration status object + */ + void unsetConfigStatusService(IConfigStatusService arg) { + if (configStatusService == arg) { + configStatusService = null; // NOSONAR + } + } + + /** + * @return configuration status object + */ + private Optional getConfigStatusService() { + return Optional.ofNullable(configStatusService); + } + + /** + * @param arg + * inject configuration status object + */ + @Reference(service = IConfigStatusService.class, cardinality = ReferenceCardinality.MANDATORY, policy = ReferencePolicy.STATIC) + void setConfigStatusService(IConfigStatusService arg) { + configStatusService = arg; // NOSONAR + } +} diff --git a/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/custom/PredictValues.java b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/custom/PredictValues.java new file mode 100644 index 0000000..39d4eec --- /dev/null +++ b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/custom/PredictValues.java @@ -0,0 +1,283 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is". 
+ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample.custom; + +import java.io.IOException; +import java.util.HashMap; +import java.util.List; +import java.util.Map; +import java.util.concurrent.atomic.AtomicReference; + +import org.apache.commons.lang.StringUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import com.fasterxml.jackson.core.JsonProcessingException; +import com.fasterxml.jackson.core.type.TypeReference; +import com.fasterxml.jackson.databind.ObjectMapper; +import com.sap.iot.edgeservices.persistenceservice.model.PSStatementObject; +import com.sap.iot.edgeservices.persistenceservice.model.QueryInputList; +import com.sap.iot.edgeservices.predictive.sample.Calculation; +import com.sap.iot.edgeservices.predictive.sample.PredictiveModuleActivator; +import com.sap.iot.edgeservices.predictive.sample.db.PersistenceClient; +import com.sap.iot.edgeservices.predictive.sample.proxies.CustomConfiguration; +import com.sap.iot.edgeservices.predictive.sample.proxies.DataCollection; +import com.sap.iot.edgeservices.predictive.sample.proxies.MultidimensionalValue; +import com.sap.iot.edgeservices.predictive.sample.utilities.DataStreamer; + +public class PredictValues +extends Calculation { + private static final String PYTHON_CMD = "python"; + private static final String PYTHON_SCRIPT_PATH = "mypythonmodule.py"; + + //////////////////// + // Static fields + //////////////////// + private static final Logger LOGGER = LoggerFactory.getLogger(PredictValues.class); // logger + private static final Boolean CLOUD_EDGE_SERVICES = true; // SET TO false for ON-PREMISE + private static final ObjectMapper objectMapper = new ObjectMapper(); // Json mapper + + //////////////////// + // Class fields + //////////////////// + private String lastConfigurationFingerprint; // fingerprint of last configuration + private CustomConfiguration configuration; // parameters configuration + private 
String sensorTypeAlternateId; // the sensorTypeAlternateId that the engine will calculate on + + private QueryData queryData; // helper object to make persistence queries + private ZMQAdapter zmq; // message bus + private String mostRecentQueryTime; // last query timestamp + + //////////////////// + // constructors + //////////////////// + + public PredictValues(PersistenceClient persistenceClient) { + super(persistenceClient); + + LOGGER.debug("PredictValues:ctor - called"); + // propagate the parameter + QueryData.setCloudEdgeServices(CLOUD_EDGE_SERVICES); + + // properties that control how the Calculation will be done + if (CLOUD_EDGE_SERVICES) { + // cloud edition + sensorTypeAlternateId = "*"; // only for this sensorType + } else { + // on-premise edition + sensorTypeAlternateId = "color"; // only for this Sensor Profile + } + + // create the query helper here: resetQueryTime() would otherwise dereference a + // null queryData, since initialize() has not run yet + queryData = new QueryData(persistenceClient); + mostRecentQueryTime = queryData.resetQueryTime(); + } + + //////////////////// + // public methods + //////////////////// + + @Override + public void stopGracefully() { + LOGGER.debug("Invoked service STOP"); + // close communication + zmq.closeZmqContext(); + } + + /** + * automatically invoked each thread run + */ + @Override + public void run() { + LOGGER.debug("Invoked PredictValues thread"); + + // ensure we have initialized and in a good state + if (state != State.RUNNING) { + LOGGER.error("NOT RUNNING: PredictValues.state = {}", state); + return; + } + + // determine if any configuration changes have been sent + updateConfigurations(); + + // get the data and create a primitive array + Map> valuesByDevice = getSampleData(); + + // for each device that sent in a value, send out the max + valuesByDevice.forEach((device, values) -> { + LOGGER.debug("======================== Calculating prediction"); + + Float validPrediction = null; + // evaluate each measure + for (MultidimensionalValue rgbmap : values.getMeasures()) { + String measurement = null; + try { + measurement = objectMapper.writeValueAsString(rgbmap); + 
LOGGER.debug("json = {}", measurement); + } catch (JsonProcessingException e) { + LOGGER.error("Unable to serialize measurement to JSON due to {}", e.getMessage(), e); + } + // Send measurement + zmq.send(measurement); + String reply = zmq.receive(); + Map<String, Object> prediction = convertPrediction(reply); + validPrediction = checkPrediction(prediction, validPrediction, + configuration.getPredictionOutputFields()); + + // send the results back into IOT Service engine as a different capability + DataStreamer.streamResults(CLOUD_EDGE_SERVICES, configuration.getEdgePlatformRestEndpoint(), device, + configuration.getPredictionSensorTypeAlternateId(), + configuration.getPredictionCapabilityAlternateId(), configuration.getPredictionSensorAlternateId(), + prediction); + } + + // send the final result back into IOT Service engine as a different capability + DataStreamer.streamResult(CLOUD_EDGE_SERVICES, configuration.getEdgePlatformRestEndpoint(), device, + configuration.getPredictionSensorTypeAlternateId(), + configuration.getPredictionIndexCapabilityAlternateId(), configuration.getPredictionSensorAlternateId(), + validPrediction); + }); + } + + /** + * @param reply + * prediction from server + * @return converted prediction or null + */ + private Map<String, Object> convertPrediction(String reply) { + Map<String, Object> prediction = null; + TypeReference<Map<String, Object>> typeRef = new TypeReference<Map<String, Object>>() { + }; + + if (StringUtils.isEmpty(reply)) { + LOGGER.error("No reply from the server"); + } else { + try { + prediction = objectMapper.readValue(reply, typeRef); + } catch (IOException e) { + LOGGER.error("Unable to parse json string due to {}", e.getMessage(), e); + LOGGER.warn("The original response was:\n{}", reply); + } + } + return prediction; + } + + /** + * @param prediction + * the predicted value + * @param validPrediction + * valid prediction index + * @param outputFields + * pmml model output fields + * @return the updated index + */ + private Float checkPrediction(Map<String, Object> prediction, Float validPrediction, List<String> outputFields) { + Float 
actualPrediction = null; + AtomicReference<Float> distance = new AtomicReference<>(0f); + if (prediction != null) { + // each output field contributes to the overall index + outputFields.forEach(field -> { + LOGGER.debug("Field: {}", field); + // define an unacceptable value + float unacceptableThreshold = 2 * outputFields.size() * configuration.getPlantColorOutOfRangeLimit(); + float val = unacceptableThreshold; + try { + val = Float.parseFloat(String.valueOf(prediction.get(field))); + // check distance out of range + } catch (Exception e) { + LOGGER.warn("Unable to get predicted value (error: {})", e.getMessage(), e); + // put an unacceptable value + distance.updateAndGet(v -> v + unacceptableThreshold); + } + + if (val > configuration.getPlantColorOutOfRangeLimit()) { + // apply a malus for the aggregated index + LOGGER.warn("value {} is out of range {}", val, configuration.getPlantColorOutOfRangeLimit()); + val *= configuration.getPlantScalingForOutOfRange(); + LOGGER.debug("value now is {}", val); + } + float finalVal = val; + // increment the index + distance.updateAndGet(v -> v + finalVal); + }); + if (validPrediction == null) { + // return the average + actualPrediction = distance.get() / outputFields.size(); + } + // was not in range + else if ((distance.get() / outputFields.size()) > configuration.getPlantColorOutOfRangeLimit() || + validPrediction > configuration.getPlantColorOutOfRangeLimit()) { + // return the worst value + LOGGER.info("value {} is out of range {}, valid value was {}", distance.get() / outputFields.size(), + configuration.getPlantColorOutOfRangeLimit(), validPrediction); + actualPrediction = (distance.get() / outputFields.size()) > validPrediction + ? 
(distance.get() / outputFields.size()) : validPrediction; + } else { + // average of the average + actualPrediction = ((distance.get() / outputFields.size()) + validPrediction) / 2; + } + } + return actualPrediction; + } + + //////////////////// + // private methods + //////////////////// + + // this is called by the super class, Calculation + protected void initialize() { + // init auxiliary classes + queryData = new QueryData(persistenceClient); + // first update of the configuration + updateConfigurations(); + // create broker connection + zmq = new ZMQAdapter(configuration.getBrokerConnectionAddressPort()); + boolean started = zmq.runPair(PYTHON_CMD, PYTHON_SCRIPT_PATH); + + LOGGER.debug("PredictValues:initialize - called, socket status {}", started); + // if you want to store the result to a custom table, create a table to store them. + this.state = State.RUNNING; + } + + /** + * selects the sample data from the persistence database which is constantly being updated + * + * @return data collection per each device + */ + private Map> getSampleData() { + PSStatementObject stmt; + // build sql expression to get data + String sql = queryData.getSqlForMeasureValues(); + QueryInputList args = queryData.getSqlArgsForMeasureValues(sensorTypeAlternateId, + configuration.getCapabilityAlternateId(), mostRecentQueryTime); + // update the timestamp for the next run + mostRecentQueryTime = queryData.resetQueryTime(); + + Map> valuesByDevice = new HashMap<>(); + try { + // convert measure to a structured object + stmt = persistenceClient.executeQuery(sql, args); + valuesByDevice = queryData.getValuesAsFloatMapsByDevice(stmt, sensorTypeAlternateId, + configuration.getCapabilityAlternateId()); + } catch (Exception e) { + LOGGER.error(e.getMessage(), e); + } + return valuesByDevice; + } + + // get the update of the configuration + private void updateConfigurations() { + String fingerprint = PredictiveModuleActivator.getLastSuccessfulFingerprint(); + if (configuration == null || 
(lastConfigurationFingerprint != null && fingerprint != null && + !fingerprint.contentEquals(lastConfigurationFingerprint))) { + configuration = PredictiveModuleActivator.getConfiguration(); + lastConfigurationFingerprint = fingerprint; + // update zmq if required + if (zmq != null) { + zmq.closeSocket(); + zmq = new ZMQAdapter(configuration.getBrokerConnectionAddressPort()); + } + } + } + +} diff --git a/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/custom/QueryData.java b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/custom/QueryData.java new file mode 100644 index 0000000..b374eb5 --- /dev/null +++ b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/custom/QueryData.java @@ -0,0 +1,254 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is". 
+ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample.custom; + +import java.util.ArrayList; +import java.util.HashMap; +import java.util.List; +import java.util.Map; + +import org.apache.commons.lang.StringUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import com.sap.iot.edgeservices.persistenceservice.enums.QueryInputType; +import com.sap.iot.edgeservices.persistenceservice.model.PSStatementObject; +import com.sap.iot.edgeservices.persistenceservice.model.QueryInputItem; +import com.sap.iot.edgeservices.persistenceservice.model.QueryInputList; +import com.sap.iot.edgeservices.predictive.sample.PersistenceException; +import com.sap.iot.edgeservices.predictive.sample.db.PersistenceClient; +import com.sap.iot.edgeservices.predictive.sample.proxies.DataCollection; +import com.sap.iot.edgeservices.predictive.sample.proxies.MultidimensionalValue; + +public class QueryData { + private static final Logger LOGGER = LoggerFactory.getLogger(QueryData.class); // logger + private static Boolean cloudEdgeServices; // true for the Cloud Edition, false for On-Premise + private final PersistenceClient persistenceClient; // current persistence reference + + // constructor + QueryData(PersistenceClient persistenceClient) { + this.persistenceClient = persistenceClient; + } + + // setter + static void setCloudEdgeServices(Boolean cloudEdgeServices) { + QueryData.cloudEdgeServices = cloudEdgeServices; + } + + // reset the query time for the next query + String resetQueryTime() { + String mostRecentQueryTime = null; + try { + mostRecentQueryTime = persistenceClient + .getFirstRowFirstColumn(persistenceClient.executeQuery("SELECT NOW()")); + LOGGER.debug("new date is: {}", mostRecentQueryTime); + } catch (PersistenceException e1) { + LOGGER.error("Unable to update the time: {}", e1.getMessage(), e1); + } + return mostRecentQueryTime; + } + + /** + * @return the sql expression + */ + private String 
getSqlForMetadata() { + // NOTE: only top 1000 records are returned. If more data is expected, then + // query should be changed to use database aggregation instead of Java + String sql = "SELECT top 1000 m.PROP_ID, m.PROP_SEQ, m.TYPE_ID FROM EFPS.MEASURE_TYPE_PROPERTY m " + + " WHERE m.OBJECT_ID = ?"; + // Add PROFILE_ID only in case of OP Edition + if (!cloudEdgeServices) { + sql += " AND m.PROFILE_ID = ?"; + } + sql += " ORDER BY m.PROP_SEQ ASC"; + + LOGGER.debug("getSqlForMetadata: ============"); + LOGGER.debug(sql); + + return sql; + } + + /** + * @param profileId + * use a particular profile for the on-premise + * @param objectId + * capabilityAlternateId + * @return the parameters for the query + */ + private QueryInputList getSqlArgsForMetadata(String profileId, String objectId) { + // create list of parameters + List items = new ArrayList<>(); + items.add(new QueryInputItem(objectId, QueryInputType.String)); + if (!cloudEdgeServices) { + items.add(new QueryInputItem(profileId, QueryInputType.String)); + } + return new QueryInputList(items); + } + + /** + * @return the sql expression + */ + String getSqlForMeasureValues() { + // NOTE: only top 1000 records are returned. If more data is expected, then + // query should be changed to use database aggregation instead of Java + String sql = "SELECT top 1000 m.DEVICE_ADDRESS, CAST(m.MEASURE_VALUE AS VARCHAR(32)) MEASURE_VALUE, " + + " m.DATE_RECEIVED FROM EFPS.MEASURE m WHERE m.OBJECT_ID = ? 
AND m.DATE_RECEIVED > ?"; + // Add PROFILE_ID only in case of OP Edition + if (!cloudEdgeServices) { + sql += " AND m.PROFILE_ID = ?"; + } + sql += " ORDER BY m.DATE_RECEIVED DESC"; + + LOGGER.debug("getSqlForMeasureValues: ============"); + LOGGER.debug(sql); + + return sql; + } + + /** + * @param profileId + * use a particular profile for the on-premise + * @param objectId + * capabilityAlternateId + * @param sinceDate + * date parameter for the query + * @return the parameters for the query + */ + QueryInputList getSqlArgsForMeasureValues(String profileId, String objectId, String sinceDate) { + // create list of parameters + List<QueryInputItem> items = new ArrayList<>(); + items.add(new QueryInputItem(objectId, QueryInputType.String)); + items.add(new QueryInputItem(sinceDate, QueryInputType.String)); + if (!cloudEdgeServices) { + items.add(new QueryInputItem(profileId, QueryInputType.String)); + } + return new QueryInputList(items); + } + + /** + * @param statementObject + * the resultset + * @param sensorTypeAlternateId + * the sensor type alternate id + * @param capabilityAlternateId + * the capability type alternate id + * @return a collection with all the measurements for all the devices + */ + Map> getValuesAsFloatMapsByDevice(PSStatementObject statementObject, + String sensorTypeAlternateId, String capabilityAlternateId) { + List<String> properties = this.getMetadata(sensorTypeAlternateId, capabilityAlternateId); + Map> valuesByDevice = new HashMap<>(); + + LOGGER.debug("getValuesAsFloatMapsByDevice start-------------"); + if (!statementObject.hasResultList()) { + // no values + LOGGER.debug("ResultSet is empty"); + return valuesByDevice; + } + // for each result convert to data + statementObject.getResultList().forEach(row -> { + // result zero is the device + String device = row.get(0).getValue().toString(); + LOGGER.debug("device = {}", device); + DataCollection valueMap = valuesByDevice.get(device); + // create an entry in the map for each device + if (valueMap == null) { 
+ valueMap = new DataCollection<>(); + valuesByDevice.put(device, valueMap); + } + // result one is the value + String values = row.get(1).getValue().toString(); + LOGGER.debug("value = {}", values); + LOGGER.debug("{}:{}", device, values); + + // result two is the timestamp + String date = row.get(2).getValue().toString(); + LOGGER.debug("date = {}", date); + + MultidimensionalValue mapValues = extractFloatProperties(values, properties); + // add all the properties in the measurement collection object + valueMap.add(mapValues); + LOGGER.debug("value added"); + }); + + LOGGER.debug("getValuesAsFloatMapsByDevice end---------------"); + return valuesByDevice; + } + + /** + * @param values + * value to be parsed + * @param properties + * properties names + * @return the collection of properties / values + */ + private MultidimensionalValue extractFloatProperties(String values, List<String> properties) { + MultidimensionalValue mapValues = new MultidimensionalValue<>(); + // Split into properties + if (StringUtils.isEmpty(values)) { + LOGGER.debug("Empty values"); + return mapValues; + } + String[] valuesArray = values.split(" "); + // guard against receiving more values than declared properties + for (int i = 0; i < valuesArray.length && i < properties.size(); i++) { + String prop = properties.get(i); + // extract as float + try { + Float f = Float.valueOf(valuesArray[i]); + mapValues.put(prop, f); + } catch (NumberFormatException nfe) { + LOGGER.debug("Unable to parse the value: {} due to {}", values, nfe.getMessage(), nfe); + } + } + return mapValues; + } + + /** + * @param statementObject + * the resultset + * @return a list of string that are the metadata + */ + private List<String> getMetadataFromResultset(PSStatementObject statementObject) { + List<String> metadata = new ArrayList<>(); + LOGGER.debug("getMetadataFromResultset start-------------"); + if (statementObject.hasResultList()) { + statementObject.getResultList().forEach(row -> { + // value zero contains metadata + String type = row.get(0).getValue().toString(); + metadata.add(type); + LOGGER.debug("type = {}", type); + 
}); + } else { + LOGGER.debug(" ResultSet is empty"); + } + + LOGGER.debug("getMetadataFromResultset end---------------"); + return metadata; + } + + /** + * @param sensorTypeAlternateId + * the sensor type alternate id + * @param capabilityAlternateId + * the capability type alternate id + * @return a list of string that are the metadata + */ + private List getMetadata(String sensorTypeAlternateId, String capabilityAlternateId) { + PSStatementObject stmt; + // build sql to get metadata + String sql = getSqlForMetadata(); + QueryInputList args = getSqlArgsForMetadata(sensorTypeAlternateId, capabilityAlternateId); + + List types = new ArrayList<>(); + try { + stmt = persistenceClient.executeQuery(sql, args); + // convert raw data + types = getMetadataFromResultset(stmt); + } catch (Exception e) { + LOGGER.error("Unable to get metadata due to: {}", e.getMessage(), e); + } + return types; + } +} diff --git a/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/custom/ZMQAdapter.java b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/custom/ZMQAdapter.java new file mode 100644 index 0000000..eee009a --- /dev/null +++ b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/custom/ZMQAdapter.java @@ -0,0 +1,135 @@ +package com.sap.iot.edgeservices.predictive.sample.custom; + +import org.apache.commons.lang.StringUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; +import org.zeromq.SocketType; +import org.zeromq.ZContext; +import org.zeromq.ZMQ; + +public class ZMQAdapter { + private static final Logger LOGGER = LoggerFactory.getLogger(ZMQAdapter.class); // logger + private static final String SERVER_HELLO_MESSAGE = "hello"; // server hello + private ZContext context; // zmq context + private ZMQ.Socket socket; // zmq socket + private String addressAndPort; // details for tcp connection + private Process pair; // external process reference + + // Constructor + ZMQAdapter(String 
addressAndPort) { + this.addressAndPort = addressAndPort; + try { + // Socket to talk to clients + context = new ZContext(); + initSocket(); + } catch (Exception e) { + LOGGER.error("Unable to correctly initialize the message bus connection due to {}", e.getMessage(), e); + } + } + + /** + * Init socket + */ + private void initSocket() { + if (context != null) { + socket = context.createSocket(SocketType.REQ); + socket.setReceiveTimeOut(10000); + socket.connect("tcp://" + addressAndPort); + } else { + LOGGER.error("No context for the socket"); + } + } + + /** + * Close the connection and the external server process entirely + */ + void closeZmqContext() { + // close existing socket + closeSocket(); + // close external process + if (pair != null) { + pair.destroyForcibly(); + } + } + + /** + * Close the socket connection + */ + void closeSocket() { + if (socket != null) { + socket.disconnect("tcp://" + addressAndPort); + socket.close(); + socket = null; + } + } + + /** + * @return received message from the socket, or null if nothing was received + */ + String receive() { + String message = null; + if (socket != null) { + byte[] reply = socket.recv(0); + // recv returns null when the receive timeout expires + if (reply != null) { + message = new String(reply, ZMQ.CHARSET); + LOGGER.debug("Received: [{}]", message); + } else { + LOGGER.error("No reply received before the timeout"); + } + } else { + LOGGER.error("Socket not initialized"); + } + return message; + } + + /** + * @param measurement + * parameter to be sent to the server + */ + void send(String measurement) { + if (!StringUtils.isEmpty(measurement) && socket != null) { + socket.send(measurement.getBytes(ZMQ.CHARSET), 0); + LOGGER.debug("Sent: [{}]", measurement); + } else { + LOGGER.error("Socket not initialized or empty measurement"); + } + } + + /** + * Ping the server to check that it is responding on the message bus, or start it as an external process + * + * @param process + * command to be sent to start the process (platform dependent) + * @param args + * arguments to start the process + * @return whether the server is started and responding + */ + boolean runPair(String process, String args) { + boolean started = false; + try { + started = checkHello(); + } catch (Exception e) { + 
LOGGER.debug("Server not responding: {}", e.getMessage(), e); + try { + pair = new ProcessBuilder(process, args).start(); + // wait for the process to start + if (pair.isAlive()) { + Thread.sleep(10000); + } + started = checkHello(); + } catch (Exception ex) { + LOGGER.error("Unable to start external process: {}", ex.getMessage(), ex); + } + } + return started; + } + + /** + * @return true if the server answered the hello message + */ + private boolean checkHello() { + send(SERVER_HELLO_MESSAGE); + String msg = receive(); + // receive() may return null, so use the null-safe check + if (StringUtils.isEmpty(msg) || !msg.contentEquals(SERVER_HELLO_MESSAGE)) { + return false; + } + LOGGER.debug("Server response: {}", msg); + return true; + } +} diff --git a/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/db/PersistenceClient.java b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/db/PersistenceClient.java new file mode 100644 index 0000000..8bd0513 --- /dev/null +++ b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/db/PersistenceClient.java @@ -0,0 +1,168 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is". 
+ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample.db; + +import java.util.List; + +import org.osgi.framework.BundleContext; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import com.sap.iot.edgeservices.persistenceservice.model.PSDataObject; +import com.sap.iot.edgeservices.persistenceservice.model.PSStatementObject; +import com.sap.iot.edgeservices.persistenceservice.model.QueryInputList; +import com.sap.iot.edgeservices.persistenceservice.service.IPersistenceService; +import com.sap.iot.edgeservices.predictive.sample.PersistenceException; + +/** + * This is a helper class that allows the Calculation classes to connect to the database. + * + */ +public class PersistenceClient { + + private static final Logger LOGGER = LoggerFactory.getLogger(PersistenceClient.class); // logger + //////////////////// + // fields + //////////////////// + + private static final String AUTH_CREDENTIALS = "G3vpZHTbKbYH^8}"; // an example password - must be securely stored to authenticate + private final IPersistenceService persistenceService; + private final BundleContext bundleContext; + private final String token; + + //////////////////// + // Constructors + //////////////////// + + public PersistenceClient(IPersistenceService persistenceService, BundleContext bundleContext) + throws PersistenceException { + this.persistenceService = persistenceService; + this.bundleContext = bundleContext; + // we attempt the token access here as if this fails, then the engine should not run + this.token = this.getPersistenceAccessToken(); + } + + //////////////////// + // public methods + //////////////////// + + /** + * This function will get the token from the Persistence Service + * + * @return token + * @throws PersistenceException + * if the credentials are invalid, returns type: + * PersistenceException.CODE_AUTHENTICATION_INVALID_CREDENTIALS + */ + private String 
getPersistenceAccessToken() + throws PersistenceException { + if (this.token == null) { + + String newToken; + LOGGER.debug("(HIDE IN PRODUCTION) bundleCanonicalName = {}", getPersistenceUsername()); + LOGGER.debug("(HIDE IN PRODUCTION) password = {}", AUTH_CREDENTIALS); + + newToken = persistenceService.RegisterBundleForAccess(getPersistenceUsername(), + AUTH_CREDENTIALS.toCharArray()); + if (newToken == null) { + throw new PersistenceException(PersistenceException.CODE_AUTHENTICATION_INVALID_CREDENTIALS); + } + return newToken; + } else { + return this.token; + } + } + + /** + * Executes DML against the Persistence Service. DML (Data Manipulation Language) covers queries, updates, and + * deletes of data + * + * @param sql + * the query/update/delete to execute + * @param parameters + * the parameters for the query + * @return a statement Object of the result set or rows changed. + */ + public PSStatementObject executeQuery(String sql, QueryInputList parameters) { + return persistenceService.executeSQL(token, sql, parameters); + } + + /** + * Executes DML against the Persistence Service. DML (Data Manipulation Language) covers queries, updates, and + * deletes of data. No variable parameters are allowed. + * + * @param sql + * the query/update/delete to execute + * @return a statement Object of the result set or rows changed. 
+ */ + public PSStatementObject executeQuery(String sql) { + return persistenceService.ExecuteSQL(token, sql); + } + + /** + * Primarily for queries that will return just a single value, this function will extract that value + * + * @param statementObject + * a PSStatementObject with a result set + * @return the string value of the first column of the first row of the result set + * @throws PersistenceException + * if there is no result set then this will throw PersistenceException.CODE_SINGLETON_NOT_FOUND + */ + public String getFirstRowFirstColumn(PSStatementObject statementObject) + throws PersistenceException { + return getValue(statementObject, 0, 0); + } + + /** + * Return a specific column and row value + * + * @param statementObject + * a PSStatementObject with a result set + * @param row + * 0-based index + * @param column + * 0-based index + * @return the string value of the column and row of the result set + * @throws PersistenceException + * if there is no result set then this will throw PersistenceException.CODE_SINGLETON_NOT_FOUND + */ + private String getValue(PSStatementObject statementObject, int row, int column) + throws PersistenceException { + if (statementObject.hasResultList() && !statementObject.getResultList().isEmpty()) { + List> rows = statementObject.getResultList(); + List columns = rows.get(row); + if (!columns.isEmpty()) { + return columns.get(column).getValue().toString(); + } + } + throw new PersistenceException(PersistenceException.CODE_SINGLETON_NOT_FOUND); + } + + //////////////////// + // private methods + //////////////////// + + private String getPersistenceUsername() { + LOGGER.debug(" getPersistenceUsername: bundleContext: {}", this.bundleContext); + // adding a version number so that in development registering can be done + // without a new database since currently there is no way to delete or update + // a bundle/password combo + return getBundleCanonicalName(this.bundleContext) + ".v1"; + } + + /** + * return this bundle's 
name + * + * @return a string that represents the name of the bundle + */ + private String getBundleCanonicalName(BundleContext bundleContext) { + String bundleCanonicalName = bundleContext.getBundle().getSymbolicName(); + LOGGER.info("Bundle started with name: {}", bundleCanonicalName); + return bundleCanonicalName; + } + +} diff --git a/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/ConfigurationFields.java b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/ConfigurationFields.java new file mode 100644 index 0000000..bbacf3b --- /dev/null +++ b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/ConfigurationFields.java @@ -0,0 +1,13 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is". + * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample.proxies; + +/** + * Configuration fields (policy service) + */ +public enum ConfigurationFields { + configFile, // NOSONAR + configFingerprint // NOSONAR +} diff --git a/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/CustomConfiguration.java b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/CustomConfiguration.java new file mode 100644 index 0000000..5674164 --- /dev/null +++ b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/CustomConfiguration.java @@ -0,0 +1,134 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is". 
+ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample.proxies; + +import java.util.List; + +import org.apache.commons.lang.StringUtils; + +public class CustomConfiguration { + private String predictionSensorTypeAlternateId; // target sensortype alternate id + private String capabilityAlternateId; // received capability alternate id + private String predictionSensorAlternateId; // target sensor alternate id + private String predictionCapabilityAlternateId; // target capability alternate id + private String predictionIndexCapabilityAlternateId; // target index capability alternate id + private String edgePlatformRestEndpoint; // endpoint of the edge platform rest to ingest data + private Float plantColorOutOfRangeLimit; // maximum allowed distance; beyond it the value is an outlier + private Float plantScalingForOutOfRange; // weight for the outliers used in the computation of the indexes + private Long analysisFrequency; // frequency to invoke the persistence client to fetch measures + private List<String> predictionOutputFields; // pmml model output fields + private String brokerConnectionAddressPort; // address and port of the message broker + private String pythonRuntimeExecutionCommand; // command used to launch the Python runtime + private String pythonScriptPath; // path of the Python script + + // default constructor + public CustomConfiguration() { + super(); + } + + /** + * Populate the missing objects + * + * @param defaultConfiguration + * default configuration object (no null inside) + */ + public void mergeMissingValues(CustomConfiguration defaultConfiguration) { + if (StringUtils.isEmpty(predictionSensorTypeAlternateId)) { + predictionSensorTypeAlternateId = defaultConfiguration.getPredictionSensorTypeAlternateId(); + } + if (StringUtils.isEmpty(capabilityAlternateId)) { + capabilityAlternateId = 
defaultConfiguration.getCapabilityAlternateId(); + } + if (StringUtils.isEmpty(predictionSensorAlternateId)) { + predictionSensorAlternateId = defaultConfiguration.getPredictionSensorAlternateId(); + } + if (StringUtils.isEmpty(predictionCapabilityAlternateId)) { + predictionCapabilityAlternateId = defaultConfiguration.getPredictionCapabilityAlternateId(); + } + if (StringUtils.isEmpty(predictionIndexCapabilityAlternateId)) { + predictionIndexCapabilityAlternateId = defaultConfiguration.getPredictionIndexCapabilityAlternateId(); + } + if (StringUtils.isEmpty(edgePlatformRestEndpoint)) { + edgePlatformRestEndpoint = defaultConfiguration.getEdgePlatformRestEndpoint(); + } + if (plantColorOutOfRangeLimit == null) { + plantColorOutOfRangeLimit = defaultConfiguration.getPlantColorOutOfRangeLimit(); + } + if (plantScalingForOutOfRange == null) { + plantScalingForOutOfRange = defaultConfiguration.getPlantScalingForOutOfRange(); + } + if (analysisFrequency == null) { + analysisFrequency = defaultConfiguration.getAnalysisFrequency(); + } + if (predictionOutputFields == null || predictionOutputFields.isEmpty()) { + predictionOutputFields = defaultConfiguration.getPredictionOutputFields(); + } + if (StringUtils.isEmpty(brokerConnectionAddressPort)) { + brokerConnectionAddressPort = defaultConfiguration.getBrokerConnectionAddressPort(); + } + if (StringUtils.isEmpty(pythonRuntimeExecutionCommand)) { + pythonRuntimeExecutionCommand = defaultConfiguration.getPythonRuntimeExecutionCommand(); + } + if (StringUtils.isEmpty(pythonScriptPath)) { + pythonScriptPath = defaultConfiguration.getPythonScriptPath(); + } + } + + /** + * getters and setters + */ + + public Long getAnalysisFrequency() { + return analysisFrequency; + } + + public String getPredictionSensorTypeAlternateId() { + return predictionSensorTypeAlternateId; + } + + public String getCapabilityAlternateId() { + return capabilityAlternateId; + } + + public String getPredictionSensorAlternateId() { + return 
predictionSensorAlternateId; + } + + public String getPredictionCapabilityAlternateId() { + return predictionCapabilityAlternateId; + } + + public String getPredictionIndexCapabilityAlternateId() { + return predictionIndexCapabilityAlternateId; + } + + public String getEdgePlatformRestEndpoint() { + return edgePlatformRestEndpoint; + } + + public Float getPlantColorOutOfRangeLimit() { + return plantColorOutOfRangeLimit; + } + + public Float getPlantScalingForOutOfRange() { + return plantScalingForOutOfRange; + } + + public List<String> getPredictionOutputFields() { + return predictionOutputFields; + } + + public String getBrokerConnectionAddressPort() { + return brokerConnectionAddressPort; + } + + public String getPythonRuntimeExecutionCommand() { + return pythonRuntimeExecutionCommand; + } + + public String getPythonScriptPath() { + return pythonScriptPath; + } +} diff --git a/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/DataCollection.java b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/DataCollection.java new file mode 100644 index 0000000..5d87c78 --- /dev/null +++ b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/DataCollection.java @@ -0,0 +1,46 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is".
+ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample.proxies; + +import java.util.ArrayList; +import java.util.List; + +public class DataCollection<T> { + private List<MultidimensionalValue<T>> measures = new ArrayList<>(); // list of measures + + /** + * @param index + * measure number + * @param key + * property key + * @return value + */ + public T getMeasures(int index, String key) { + if (measures == null || index < 0 || index >= measures.size()) { + return null; + } + return measures.get(index).getMeasure(key); + } + + /** + * @param mapValues + * put a new measurement into the list + */ + public void add(MultidimensionalValue<T> mapValues) { + measures.add(mapValues); + } + + /** + * getters and setters + */ + public List<MultidimensionalValue<T>> getMeasures() { + return measures; + } + + public void setMeasures(List<MultidimensionalValue<T>> measures) { + this.measures = measures; + } + +} diff --git a/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/MultidimensionalValue.java b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/MultidimensionalValue.java new file mode 100644 index 0000000..4abca75 --- /dev/null +++ b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/proxies/MultidimensionalValue.java @@ -0,0 +1,48 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is".
+ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample.proxies; + +import java.util.HashMap; +import java.util.Map; + +import org.apache.commons.lang.StringUtils; + +public class MultidimensionalValue<T> { + private Map<String, T> measures = new HashMap<>(); // map of property name to value + + /** + * @param key + * property key + * @return the value + */ + public T getMeasure(String key) { + if (StringUtils.isEmpty(key)) { + return null; + } + return measures.get(key); + } + + /** + * put a value into the map + * + * @param prop + * property + * @param f + * value + */ + public void put(String prop, T f) { + measures.put(prop, f); + } + + // getters and setters + public Map<String, T> getMeasures() { + return measures; + } + + public void setMeasures(Map<String, T> measures) { + this.measures = measures; + } + +} diff --git a/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/utilities/ConfigurationHandler.java b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/utilities/ConfigurationHandler.java new file mode 100644 index 0000000..b19455b --- /dev/null +++ b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/utilities/ConfigurationHandler.java @@ -0,0 +1,190 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is".
+ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample.utilities; + +import java.io.File; +import java.io.IOException; +import java.io.InputStream; +import java.nio.charset.Charset; +import java.nio.file.Files; +import java.nio.file.Paths; + +import org.apache.commons.io.FileUtils; +import org.apache.commons.io.IOUtils; +import org.apache.commons.lang.StringUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import com.fasterxml.jackson.databind.DeserializationFeature; +import com.fasterxml.jackson.databind.ObjectMapper; +import com.sap.iot.edgeservices.predictive.sample.proxies.CustomConfiguration; + +public class ConfigurationHandler { + private static final Logger LOGGER = LoggerFactory.getLogger(ConfigurationHandler.class); // logger + private static final String BASE_PATH = "./../edgeservices/"; // base path for custom configuration, same as the other + // services + private static final String UNIFORM_PATH_SEPARATOR = "/"; // linux/windows valid file separator + + private static ObjectMapper mapper = new ObjectMapper() + .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false) + .configure(DeserializationFeature.FAIL_ON_MISSING_CREATOR_PROPERTIES, false); // json object mapper + private static String lastFingerprint = null; // last used fingerprint + + // Constructors + private ConfigurationHandler() { + super(); + } + + /** + * load an existing configuration + * + * @param defaultConfiguration + * default configuration used to fill any missing values (loaded from the classpath when null) + * @param serviceName + * name of the current service + * @return the existing configuration (if any) + */ + public static CustomConfiguration loadConfigurationFromDisk(CustomConfiguration defaultConfiguration, + String serviceName) { + if (defaultConfiguration == null) { + defaultConfiguration = loadDefaultConfiguration(); + } + // existing file paths + String jsonFile = BASE_PATH + serviceName + UNIFORM_PATH_SEPARATOR + serviceName + ".json"; + String fingerprintFile = BASE_PATH + serviceName + UNIFORM_PATH_SEPARATOR + serviceName +
"_fingerprint.txt"; + CustomConfiguration fromFile = null; + String content = null; + String fingerprint = null; + File path = new File(jsonFile); + if (!path.exists()) { + LOGGER.info("Configuration file does not exists: {}", jsonFile); + return null; + } + try { + byte[] contentBytes = Files.readAllBytes(Paths.get(jsonFile)); + content = new String(contentBytes, Charset.defaultCharset()); + } catch (IOException e) { + LOGGER.error("Unable to read configuration from file: {} due to {}", jsonFile, e.getMessage(), e); + } + // if there is no file there is also no needs to load the fingerprint + if (!StringUtils.isEmpty(content)) { + try { + byte[] contentBytes = Files.readAllBytes(Paths.get(fingerprintFile)); + fingerprint = new String(contentBytes, Charset.defaultCharset()); + } catch (IOException e) { + LOGGER.error("Unable to read configuration from file: {} due to {}", fingerprintFile, e.getMessage(), + e); + } + // convert to a POJO + fromFile = extractCustomConfiguration(content); + } + // populate missing values + if (fromFile != null) { + fromFile.mergeMissingValues(defaultConfiguration); + } else { + LOGGER.error("Unable to extract POJO configuration"); + return defaultConfiguration; + } + // set the fingerprint + lastFingerprint = fingerprint; + return fromFile; + } + + /** + * write a configuration into the disk + * + * @param serviceName + * the service name + * @param content + * string with the content of the configuration + * @param fingerprint + * current fingerprint + * @return the written configuration object + */ + public static CustomConfiguration writeConfigurationToDisk(String serviceName, String content, String fingerprint) { + // convert the string to a POJO + CustomConfiguration conf = extractCustomConfiguration(content); + if (conf == null) { + // configuration not valid + return null; + } + // build the path and make the dirs + String basePath = BASE_PATH + serviceName; + File path = new File(basePath); + if (!path.exists()) { + boolean 
created = path.mkdirs(); + if (!created) { + LOGGER.error("Unable to create the path tree: {}", basePath); + return null; + } + } + // write configuration to json file + try { + String filename = serviceName + ".json"; + File jsonFile = new File(basePath + UNIFORM_PATH_SEPARATOR + filename); + FileUtils.writeStringToFile(jsonFile, content, Charset.defaultCharset().name()); + } catch (IOException e) { + LOGGER.error("Unable to write the file: {}.json due to {}", serviceName, e.getMessage(), e); + return null; + } + // persist fingerprint + try { + String filename = serviceName + "_fingerprint.txt"; + File fingerprintFile = new File(basePath + UNIFORM_PATH_SEPARATOR + filename); + FileUtils.writeStringToFile(fingerprintFile, fingerprint, Charset.defaultCharset().name()); + } catch (IOException e) { + LOGGER.error("Unable to write the file: {}_fingerprint.txt due to {}", serviceName, e.getMessage(), e); + return null; + } + // set reference for the last fingerprint + lastFingerprint = fingerprint; + return conf; + } + + /** + * convert the configuration from string to object + * + * @param content + * json string of the configuration + * @return configuration object + */ + private static CustomConfiguration extractCustomConfiguration(String content) { + try { + return mapper.readValue(content, CustomConfiguration.class); + } catch (IOException e) { + LOGGER.error("Unable to read configuration {}", e.getMessage(), e); + } + return null; + } + + /** + * @return default configuration object + */ + public static CustomConfiguration loadDefaultConfiguration() { + // load default file from classloader + InputStream stream = ConfigurationHandler.class.getClassLoader() + .getResourceAsStream("defaultConfiguration.json"); + if (stream == null) { + LOGGER.error("No default configuration file"); + return null; + } + String content = null; + CustomConfiguration config = null; + try { + // convert the stream to a string + content = IOUtils.toString(stream, 
Charset.defaultCharset().name()); + } catch (IOException e) { + LOGGER.error("Unable to read configuration file {}", e.getMessage(), e); + } + // if the file is potentially valid extract the POJO + if (!StringUtils.isEmpty(content)) { + config = extractCustomConfiguration(content); + } + return config; + } + + // getters + public static String getLastFingerprint() { + return lastFingerprint; + } +} diff --git a/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/utilities/DataStreamer.java b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/utilities/DataStreamer.java new file mode 100644 index 0000000..ebaf762 --- /dev/null +++ b/predictive-python/src/main/java/com/sap/iot/edgeservices/predictive/sample/utilities/DataStreamer.java @@ -0,0 +1,134 @@ +/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * + * Copyright (c) 2020 SAP SE or an affiliate company. All rights reserved. + * The sample is not intended for production use. Provided "as is". 
+ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */ +package com.sap.iot.edgeservices.predictive.sample.utilities; + +import java.io.OutputStream; +import java.net.HttpURLConnection; +import java.net.URL; +import java.net.URLConnection; +import java.nio.charset.StandardCharsets; +import java.util.ArrayList; +import java.util.List; +import java.util.Map; + +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class DataStreamer { + + private static final Logger LOGGER = LoggerFactory.getLogger(DataStreamer.class); // logger + // packet format for IOTService + private static final String IOTS_PACKET_FORMAT = "{\"sensorTypeAlternateId\":\"%s\",\"capabilityAlternateId\":\"%s\",\"sensorAlternateId\":\"%s\",\"measures\":[%s]}"; + + // constructor + private DataStreamer() { + super(); + } + + // send the results back into IOTS using a different sensorType/capability for processing again + public static void streamResults(boolean isCloudEdge, String measuresUrl, String device, + String sensorTypeAlternateId, String capabilityAlternateId, String sensorAlternateId, Map<String, Float> val) { + if (!isCloudEdge) { + LOGGER.debug("On-premise version is not sending data to SAP Cloud Platform Internet of Things"); + return; + } + if (val == null || val.isEmpty()) { + LOGGER.error("No value to send"); + } else { + LOGGER.info("Sending data to streaming...
{}{}", measuresUrl, device); + // obtain a json string + String jsonVal = mapToJsonString(val); + // format the payload + String jsonPayload = String.format(IOTS_PACKET_FORMAT, sensorTypeAlternateId, capabilityAlternateId, + sensorAlternateId, jsonVal); + + LOGGER.info("Sending data: {}", jsonPayload); + byte[] byteArrayPayload = jsonPayload.getBytes(StandardCharsets.UTF_8); + int payloadLength = byteArrayPayload.length; + + try { + // create a connection to IOTS + URL url = new URL(measuresUrl + device); + URLConnection con = url.openConnection(); + HttpURLConnection http = (HttpURLConnection) con; + + // set the properties of the post + http.setRequestMethod("POST"); // PUT is another valid option + http.setDoOutput(true); + http.setFixedLengthStreamingMode(payloadLength); + http.setRequestProperty("Content-Type", "application/json; charset=UTF-8"); + + // connect and send data + http.connect(); + try (OutputStream os = http.getOutputStream()) { + os.write(byteArrayPayload); + } + } catch (Exception e) { + LOGGER.error("Could not stream transformed results back to streaming: {}", e.getMessage(), e); + } + } + } + + // send the results back into IOTS using a different sensorType/capability for processing again + public static void streamResult(boolean isCloudEdge, String measuresUrl, String device, + String sensorTypeAlternateId, String capabilityAlternateId, String sensorAlternateId, Float val) { + if (!isCloudEdge) { + LOGGER.debug("On-premise version is not sending data to SAP Cloud Platform Internet of Things"); + return; + } + if (val == null) { + LOGGER.error("No value to send"); + } else { + LOGGER.info("Sending data to streaming... 
{}{}", measuresUrl, device); + + // format the payload + String jsonPayload = String.format(IOTS_PACKET_FORMAT, sensorTypeAlternateId, capabilityAlternateId, + sensorAlternateId, "[" + val + "]"); + + LOGGER.info("Sending data: {}", jsonPayload); + byte[] byteArrayPayload = jsonPayload.getBytes(StandardCharsets.UTF_8); + int payloadLength = byteArrayPayload.length; + + try { + // create a connection to IOTS + URL url = new URL(measuresUrl + device); + URLConnection con = url.openConnection(); + HttpURLConnection http = (HttpURLConnection) con; + + // set the properties of the post + http.setRequestMethod("POST"); // PUT is another valid option + http.setDoOutput(true); + http.setFixedLengthStreamingMode(payloadLength); + http.setRequestProperty("Content-Type", "application/json; charset=UTF-8"); + + // connect and send data + http.connect(); + try (OutputStream os = http.getOutputStream()) { + os.write(byteArrayPayload); + } + } catch (Exception e) { + LOGGER.error("Could not stream transformed results back to streaming: {}", e.getMessage(), e); + } + } + } + + private static String mapToJsonString(Map doubles) { + String json = "{"; + List tmpJson = new ArrayList<>(doubles.size()); + // Convert single string + for (Map.Entry val : doubles.entrySet()) { + String jsonEntry = "\""; + // escape the invalid characters and remove the unsupported chars + jsonEntry += val.getKey().replaceAll("\"", "\\\"").replaceAll("[(]", "").replaceAll("[)]", ""); + jsonEntry += "\":\""; + jsonEntry += doubles.get(val.getKey()); + jsonEntry += "\""; + tmpJson.add(jsonEntry); + } + json += String.join(",", tmpJson); + json += "}"; + return json; + } +} diff --git a/predictive-python/src/main/resources/defaultConfiguration.json b/predictive-python/src/main/resources/defaultConfiguration.json new file mode 100644 index 0000000..839b32c --- /dev/null +++ b/predictive-python/src/main/resources/defaultConfiguration.json @@ -0,0 +1,19 @@ +{ + "predictionSensorTypeAlternateId": "255", + 
"capabilityAlternateId": "color", + "predictionSensorAlternateId": "color sensor", + "predictionCapabilityAlternateId": "color prediction", + "predictionIndexCapabilityAlternateId": "validity color score", + "edgePlatformRestEndpoint": "http://localhost:8699/measures/", + "plantColorOutOfRangeLimit": "100", + "plantScalingForOutOfRange": "1.25", + "analysisFrequency": 10000, + "predictionOutputFields": [ + "neighbor(1)", + "neighbor(2)", + "neighbor(3)" + ], + "brokerConnectionAddressPort": "127.0.0.1:5555", + "pythonRuntimeExecutionCommand": "cmd /c python", + "pythonScriptPath": "./trainModel.py" +} \ No newline at end of file