8/27/2019 Unit Testing Template For ETL
Before we learn anything about ETL Testing, it's important to learn about Business Intelligence and Data Warehousing. Let's get started. What is BI?
Business Intelligence is the process of collecting raw data or business data and turning it into information that is useful and more meaningful. The raw data is the record of the daily transactions of an organization, such as interactions with customers, administration of finances, management of employees and so on. This data is used for reporting, analysis, data mining, data quality and interpretation, and predictive analysis. What is a Data Warehouse? A data warehouse is a database that is designed for query and analysis rather than for transaction processing.
The data warehouse is constructed by integrating data from multiple heterogeneous sources. It enables a company or organization to consolidate data from several sources and separates the analysis workload from the transaction workload. Data is turned into high-quality information to meet all enterprise reporting requirements for all levels of users. ETL stands for Extract-Transform-Load, and it is the process by which data is loaded from the source system into the data warehouse. Data is extracted from an OLTP database, transformed to match the data warehouse schema and loaded into the data warehouse database. Many data warehouses also incorporate data from non-OLTP systems such as text files, legacy systems and spreadsheets. Let's see how it works. For example, consider a retail store which has different departments like sales, marketing, logistics etc.
Each of them handles customer information independently, and the way they store that data is quite different. The sales department stores it by customer name, while the marketing department stores it by customer id.
Now if they want to check the history of a customer and want to know which products he or she bought as a result of different marketing campaigns, it would be very tedious. The solution is to use a data warehouse to store information from the different sources in a uniform structure using ETL.
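As an illustration of that consolidation step, here is a minimal sketch in Python. Everything in it is hypothetical: the department feeds, the field names, and the name-to-id lookup are invented purely to show the idea of mapping two differently-keyed sources onto one shared customer record.

```python
# Sketch: unify two departmental feeds into one warehouse-style record layout.
# All field names, sample rows and the lookup table are hypothetical.

sales_rows = [  # sales stores customers by name
    {"customer_name": "Ada Lovelace", "product": "Laptop"},
]
marketing_rows = [  # marketing stores customers by id
    {"customer_id": 42, "campaign": "Spring Sale"},
]
id_by_name = {"Ada Lovelace": 42}  # lookup built during the transform phase

def unify(sales_rows, marketing_rows, id_by_name):
    """Map both feeds onto one uniform structure keyed by customer_id."""
    unified = {}
    for r in sales_rows:
        cid = id_by_name[r["customer_name"]]
        unified.setdefault(cid, {"customer_id": cid})["product"] = r["product"]
    for r in marketing_rows:
        cid = r["customer_id"]
        unified.setdefault(cid, {"customer_id": cid})["campaign"] = r["campaign"]
    return list(unified.values())

print(unify(sales_rows, marketing_rows, id_by_name))
```

The result is one record per customer, regardless of how each department stored its data, which is exactly what makes the customer-history query above tractable.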
ETL can transform dissimilar data sets into a unified structure. BI tools can later be used to derive meaningful insights and reports from this data. The following diagram gives you the road map of the ETL process. Extract: extract relevant data. Transform: transform data to DW (Data Warehouse) format. Build keys: a key is one or more data attributes that uniquely identify an entity.
Various types of keys are primary keys, alternate keys, foreign keys, composite keys and surrogate keys. The data warehouse owns these keys and never allows any other entity to assign them.
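A minimal sketch of how a warehouse might issue its own surrogate keys (the class name and natural-key values below are hypothetical, not from any real warehouse product):

```python
# Sketch: the warehouse owns surrogate keys and never lets a source system assign them.
class SurrogateKeyGenerator:
    def __init__(self):
        self._next = 1
        self._by_natural = {}  # natural key -> surrogate key

    def key_for(self, natural_key):
        """Return the existing surrogate key for this entity, or issue a new one."""
        if natural_key not in self._by_natural:
            self._by_natural[natural_key] = self._next
            self._next += 1
        return self._by_natural[natural_key]

gen = SurrogateKeyGenerator()
print(gen.key_for("CUST-001"))  # newly issued key
print(gen.key_for("CUST-002"))  # next key
print(gen.key_for("CUST-001"))  # same entity, same surrogate key
```

The point of the design is that the surrogate key is stable and source-independent: two systems can identify the same customer differently, yet the warehouse still sees one entity.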
Cleansing of data: After the data is extracted, it moves into the next phase, of cleaning and conforming the data. Cleaning identifies and fixes errors and omissions in the data. Conforming means resolving the conflicts between incompatible data so that it can be used in an enterprise data warehouse.
In addition to these, this system creates metadata that is used to diagnose source system problems and improve data quality. Load: load data into the DW (Data Warehouse). Build aggregates: creating an aggregate means summarizing and storing data available in a fact table in order to improve the performance of end-user queries.

What is ETL Testing?

ETL testing is done to ensure that the data that has been loaded from a source to the destination after business transformation is accurate.
It also involves the verification of data at the various middle stages between source and destination. ETL stands for Extract-Transform-Load.

ETL Testing Process

Like other testing processes, ETL testing goes through different phases. ETL testing is performed in five stages:
Identifying data sources and requirements.
Data acquisition.
Implementing business logic and dimensional modelling.
Building and populating data.
Building reports.

Types of ETL Testing

Production Validation Testing: Also known as "table balancing" or "production reconciliation", this type of ETL testing is done on data as it is being moved into production systems. To support your business decisions, the data in your production systems has to be in the correct order. Data Validation Option provides the ETL testing automation and management capabilities needed to ensure that production systems are not compromised by the data.

Source to Target Testing (Validation Testing): This type of testing is carried out to validate whether the transformed data values are the expected data values.

Application Upgrades: This type of ETL test can be generated automatically, saving substantial test development time.
This type of testing checks whether the data extracted from an older application or repository is exactly the same as the data in the new repository or application.

Metadata Testing: Metadata testing includes data type checks, data length checks and index/constraint checks.
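A metadata test of this kind can be sketched by comparing declared column types and lengths between source and target. In the sketch below, sqlite3 stands in for both systems, and the table definitions, including the deliberate length mismatch, are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_customer (id INTEGER, name VARCHAR(50))")
conn.execute("CREATE TABLE tgt_customer (id INTEGER, name VARCHAR(20))")  # deliberate mismatch

def column_types(conn, table):
    """Return {column_name: declared_type} using SQLite's table metadata."""
    return {row[1]: row[2] for row in conn.execute(f"PRAGMA table_info({table})")}

src = column_types(conn, "src_customer")
tgt = column_types(conn, "tgt_customer")

# A metadata test fails when the declared type/length differs between systems.
mismatches = {c: (src[c], tgt[c]) for c in src if src[c] != tgt.get(c)}
print(mismatches)  # flags the length difference on "name"
```

In a real project the same comparison would be driven from the mapping sheet and the catalog views of the actual source and target databases.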
Data Completeness Testing: Done to verify that all the expected data is loaded into the target from the source. Tests that can be run include comparing and validating counts, aggregates and actual data between the source and target for columns with simple or no transformation.

Data Accuracy Testing: Done to ensure that the data is accurately loaded and transformed as expected.

Data Transformation Testing: In many cases transformation testing cannot be achieved by writing one source query and comparing the output with the target. Multiple SQL queries may need to be run for each row to verify the transformation rules.
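A row-by-row transformation check of the kind just described might look like the following sketch. The transformation rule (full_name built from first and last name), the tables and the seeded defect are all hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, first TEXT, last TEXT)")
conn.execute("CREATE TABLE tgt (id INTEGER, full_name TEXT)")
conn.executemany("INSERT INTO src VALUES (?, ?, ?)",
                 [(1, "Ada", "Lovelace"), (2, "Grace", "Hopper")])
conn.executemany("INSERT INTO tgt VALUES (?, ?)",
                 [(1, "Ada Lovelace"), (2, "Grace Hoper")])  # seeded defect

# Hypothetical transformation rule: full_name = first || ' ' || last.
# Re-apply the rule to each source row and compare with the loaded target row.
failures = []
for sid, first, last in conn.execute("SELECT * FROM src"):
    expected = f"{first} {last}"
    (actual,) = conn.execute("SELECT full_name FROM tgt WHERE id = ?", (sid,)).fetchone()
    if actual != expected:
        failures.append((sid, expected, actual))

print(failures)  # [(2, 'Grace Hopper', 'Grace Hoper')]
```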
Data Quality Testing: Data quality tests include syntax and reference tests, done to avoid errors caused by, say, a bad date or order number during a business process. Syntax tests report dirty data based on invalid characters, character patterns, incorrect upper or lower case, etc. Reference tests check the data against the data model, for example a customer id. Data quality testing also includes number checks, date checks, precision checks, data checks, null checks, etc.

Incremental ETL Testing: This testing is done to check the data integrity of old and new data when new data is added.
Incremental testing verifies that inserts and updates are processed as expected during the incremental ETL process.

GUI/Navigation Testing: This testing is done to check the navigation or GUI aspects of the front-end reports.
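The syntax and null checks described under Data Quality Testing can be sketched as follows; the id format rule and the sample rows are invented purely for illustration:

```python
import re

# Sketch of two simple data quality tests: a syntax test for an invalid
# character pattern, and a null check on a required column.
rows = [
    {"customer_id": "C001", "order_date": "2019-08-27"},
    {"customer_id": "C0@2", "order_date": "2019-08-28"},   # invalid character
    {"customer_id": "C003", "order_date": None},           # null required field
]

ID_PATTERN = re.compile(r"^C\d{3}$")  # hypothetical customer-id format

def quality_errors(rows):
    """Return (row_index, reason) for every data quality violation found."""
    errors = []
    for i, r in enumerate(rows):
        if r["order_date"] is None:
            errors.append((i, "null order_date"))
        if not ID_PATTERN.match(r["customer_id"]):
            errors.append((i, "bad customer_id syntax"))
    return errors

print(quality_errors(rows))
# [(1, 'bad customer_id syntax'), (2, 'null order_date')]
```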
How to Create an ETL Test Case

ETL testing is a concept which can be applied to different tools and databases in the information management industry. The objective of ETL testing is to ensure that the data that has been loaded from a source to a destination after business transformation is accurate. It also involves the verification of data at the various middle stages between source and destination. While performing ETL testing, two documents will always be used by an ETL tester:

ETL mapping sheets: An ETL mapping sheet contains all the information about the source and destination tables, including each and every column and their look-ups in reference tables. ETL testers need to be comfortable with SQL queries, as ETL testing may involve writing big queries with multiple joins to validate data at any stage of ETL.
ETL mapping sheets provide significant help while writing queries for data verification.

DB Schema of Source and Target: It should be kept handy to verify any detail in the mapping sheets.

ETL Test Scenarios and Test Cases

Mapping doc validation: Verify the mapping doc to check whether the corresponding ETL information is provided. A change log should be maintained in every mapping doc.

Validation: Validate the source and target table structures against the corresponding mapping doc. Source and target data types should be the same. The lengths of data types in both source and target should be equal. Verify that data field types and formats are specified. The source data type length should not be less than the target data type length. Validate the names of columns in the tables against the mapping doc.

Constraint validation: Ensure the constraints are defined for each specific table as expected.

Data consistency issues: The data type and length for a particular attribute may vary across files or tables even though the semantic definition is the same; watch for misuse of integrity constraints.

Completeness issues: Ensure that all expected data is loaded into the target table.
Compare record counts between source and target.

Automation of ETL Testing

The general methodology of ETL testing is to use SQL scripting or to "eyeball" the data. These approaches are time-consuming, error-prone and seldom provide complete test coverage. To accelerate testing, improve coverage, reduce costs and improve the defect detection ratio in production and development environments, automation is the need of the hour. One such tool is Informatica.
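The count-and-aggregate reconciliation that such tools automate can also be scripted directly. A minimal sketch, with sqlite3 standing in for both the source and target systems (table names and data are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_orders (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE tgt_orders (id INTEGER, amount REAL)")
data = [(1, 10.0), (2, 20.5), (3, 30.0)]
conn.executemany("INSERT INTO src_orders VALUES (?, ?)", data)
conn.executemany("INSERT INTO tgt_orders VALUES (?, ?)", data[:2])  # simulate a lost row

def reconcile(conn, src, tgt):
    """Compare row counts and a checksum-style aggregate between two tables."""
    def stats(table):
        return conn.execute(f"SELECT COUNT(*), TOTAL(amount) FROM {table}").fetchone()
    (sc, ss), (tc, ts) = stats(src), stats(tgt)
    return {"count_match": sc == tc, "sum_match": ss == ts,
            "source": (sc, ss), "target": (tc, ts)}

print(reconcile(conn, "src_orders", "tgt_orders"))
# Flags the missing row: counts 3 vs 2, amount totals 60.5 vs 30.5
```

In practice the two connections would point at the real source and target databases, and the check would run per table from the mapping sheet.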
A good template maintains test artifact consistency for the test team and makes it easy for all stakeholders to understand the test cases. Writing test cases in a standard format lessens the test effort and the error rate. A standard format is also desirable when your test cases are reviewed by experts. The template chosen for your project depends on your test policy. Many organizations create test cases in Microsoft Excel, while some use Microsoft Word. Some even use test management tools like HP ALM to document their test cases. Irrespective of the test case documentation method chosen, any good test case template must have the following fields:
Test Priority (Low/Medium/High): It is useful while executing the test.
Name of the Module: The name of the main module or sub-module being tested.
Test Designed by: Tester's name.
Date of test designed: Date when the test was designed.
Test Executed by: Who executed the test.
Date of the Test Execution: Date when the test is to be executed.
Name or Test Title: Title of the test case.
Description/Summary of Test: The summary or purpose of the test, in brief.
Pre-condition: Any requirement that needs to be met before execution of this test case; list all pre-conditions.
Dependencies: Any dependencies on test requirements or other test cases.
Test Steps: All the test steps, in detail and in the order in which they are to be executed; provide as much detail as you can.
Test Data: Test data used as input for the test case; deliver different data sets with precise values.
Expected Results: The expected result, including any error or message that should appear on screen.
Post-Condition: The state of the system after running the test case.
Actual Result: Filled in with the actual result after test execution.
Status (Fail/Pass): Mark this field as failed if the actual result does not match the expected result.
Notes: Any special conditions not covered by the fields above.

Optionally you can have the following fields depending on the project requirements:

Link / Defect ID: Include a link to, or the number of, the defect if the test status is fail.
Keywords / Test Type: Used to classify tests based on test type, e.g. usability, functional, business rules, etc.
Requirements: Requirements for which this test case is being written.
References / Attachments: Useful for complicated test scenarios; give the actual path of the document or diagram.
Automation (Yes/No): To track automation status when test cases are automated.
Almost all IT companies today depend heavily on data flow, as a large amount of information is made available for access and one can get everything that is required. This is where the concepts of ETL and ETL Testing come into the picture. ETL is an abbreviation of Extraction, Transformation, and Loading. Presently, ETL Testing is performed using SQL scripting or spreadsheets, which may be a time-consuming and error-prone approach.
In this article, we will have detailed discussions on several concepts, viz. ETL, the ETL process, ETL testing and the different approaches used for it, along with the most popular ETL testing tools. What You Will Learn:
ETL Testing Concepts

#1) As mentioned previously, ETL stands for Extraction, Transformation, and Loading, which are considered the three prime database functions. Extraction: reading data from the database. Transformation: converting the extracted data into the required form to store in another database. Loading: writing the data into the target database.

#2) ETL is used to transfer or migrate data from one database to another, to prepare data marts or data warehouses. The following diagram elaborates the ETL process in a precise way.

ETL Testing Process

The ETL Testing process is similar to other testing processes and includes some stages. They are:
Identifying business requirements. Test planning. Designing test cases and test data. Test execution and bug reporting. Summarizing reports. Test closure.

Types of ETL Testing

ETL Testing can be classified into the following categories according to the testing process that is followed.
#1) Production Validation Testing: It is also called table balancing or production reconciliation. It is performed on data before or while it is being moved into the production system, in the correct order.

#2) Source To Target Testing: This type of ETL Testing is performed to validate the data values after data transformation.
#3) Application Upgrade: It is used to check whether the data extracted from an older application or repository is exactly the same as the data in the new application or repository.

#4) Data Transformation Testing: Multiple SQL queries are required to be run for each and every row to verify the data transformation standards.
#5) Data Completeness Testing: This type of testing is performed to verify that the expected data is loaded at the appropriate destination as per the predefined standards. I would also like to compare ETL Testing with Database Testing, but before that let us have a look at the types of ETL Testing with respect to database testing. Given below are the types of ETL Testing with respect to Database Testing: 1) Constraint Testing: Testers should test whether the data is mapped accurately from source to destination; while checking this, testers need to focus on some key checks (constraints). They are:
NOT NULL. UNIQUE. Primary Key. Foreign Key. Check.
NULL. Default.

2) Duplicate Check Testing: Source and target tables contain huge amounts of data with frequently repeated values; in such cases testers use database queries to find such duplication.

3) Navigation Testing: Navigation concerns the GUI of an application. A user finds an application friendly when he gets easy and relevant navigation throughout the entire system. The tester must focus on avoiding irrelevant navigation from the user's point of view.
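The duplicate check in point 2 is typically a GROUP BY query on the column that should be unique. A sketch using sqlite3, with a made-up table and a deliberately duplicated key:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tgt_customer (customer_id INTEGER, name TEXT)")
conn.executemany("INSERT INTO tgt_customer VALUES (?, ?)",
                 [(1, "Ada"), (2, "Grace"), (2, "Grace")])  # duplicate row

# A typical duplicate-check query: group on the key that should be unique
# and report any value occurring more than once.
dupes = conn.execute("""
    SELECT customer_id, COUNT(*) FROM tgt_customer
    GROUP BY customer_id HAVING COUNT(*) > 1
""").fetchall()
print(dupes)  # [(2, 2)]
```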
4) Initialization Testing: Performed to check the combination of hardware and software requirements along with the platform on which the application is installed.

5) Attribute Check Testing: Performed to verify whether all the attributes of both the source and target systems are the same.

From the above listing one may consider that ETL Testing is quite similar to Database Testing, but in fact ETL Testing is concerned with Data Warehouse Testing, not Database Testing. There are several other respects in which ETL Testing differs from Database Testing. Let's have a quick look at what they are. The primary goal of Database Testing is to check whether the data follows the rules and standards of the data model, whereas ETL Testing checks whether data is moved or mapped as expected. Database Testing focuses on maintaining primary key-foreign key relationships, while ETL Testing verifies that data is transformed as per the requirement or expectation and is the same at the source and target systems. Database Testing recognizes missing data, whereas ETL Testing determines duplicate data. Database Testing is used for data integration and ETL Testing for enterprise business intelligence reporting.
These are some major differences which make ETL Testing different from Database Testing. Given below is a list of ETL bug types:

Calculation bugs: Final output is wrong due to a mathematical error.
Input/output bugs: Accepts invalid values and rejects valid values.
H/W bugs: A device is not responding due to hardware issues.
User interface bugs: Related to the GUI of an application.
Load condition bugs: Denies multiple users.

How to Create Test Cases in ETL Testing

The primary goal of ETL testing is to ensure that the extracted and transformed data is loaded accurately from the source to the destination system. ETL testing uses two documents: #1) ETL Mapping Sheets: This document contains information about the source & destination tables and their references. The mapping sheet helps in creating the big SQL queries used while performing ETL Testing.
#2) Database schema for Source and Destination tables: It should be kept updated in the mapping sheet with the database schema to perform data validation.

List of Most Popular ETL Testing Tools

Like automation testing, ETL Testing can also be automated. Automated ETL Testing reduces the time consumed during the testing process and helps to maintain accuracy. A few ETL testing automation tools are used to perform ETL Testing more effectively and rapidly. Given below is the list of ETL Testing Tools: Informatica Data Validation.
QuerySurge. ICEDQ.
Datagaps ETL Validator. QualiDI.
Talend Open Studio for Data Integration. Codoid’s ETL Testing Services. Data Centric Testing.
SSISTester. TestBench. GTL QAceGen.
Zuzena Automated Testing Service. DbFit. AnyDbTest. 99 Percentage ETL Testing.

#1) Informatica Data Validation

Informatica Data Validation is a GUI-based ETL testing tool which is used to Extract, Transform and Load (ETL). The testing includes a comparison of tables before and after data migration.
This type of testing ensures data integrity, i.e. that the volume of data is correctly loaded and is in the expected format in the destination system. Key Features: The Informatica Validation tool is a comprehensive ETL testing tool which does not require any programming skill. It provides automation during ETL testing, which ensures that the data is delivered correctly and is in the expected format in the destination system. It helps to complete data validation and reconciliation in the testing and production environments. It reduces the risk of introducing errors during transformation and prevents bad data from being loaded into the destination system.
Informatica Data Validation is useful in Development, Testing and Production environments where it is necessary to validate data integrity before moving into the production system. 50 to 90% of cost and effort can be saved using the Informatica Data Validation tool. Informatica Data Validation provides a complete solution for data validation along with data integrity. It reduces programming effort and business risk due to an intuitive user interface and built-in operators. It identifies and prevents data quality issues and provides greater business productivity. A free trial is available alongside the paid service, reducing the time and cost required for data validation.
Visit official site here: #2) QuerySurge

QuerySurge is a tool specifically built for testing Big Data and data warehouses. It ensures that the data extracted and loaded from the source system to the destination system is correct and is in the expected format. Any issues or differences are identified very quickly by QuerySurge.
Key Features: QuerySurge is an automated tool for Big Data Testing and ETL Testing. It improves data quality and accelerates testing cycles. It validates data using a Query Wizard. It saves time & cost by automating manual efforts and scheduling tests for a specific time. QuerySurge supports ETL Testing across various platforms like IBM, Oracle, Microsoft, and SAP.
It helps to build test scenarios and test suites, along with configurable reports, without specific knowledge of SQL. It generates email reports through an automated process. Reusable query snippets allow generating reusable code.
It provides a collaborative view of data health. QuerySurge can be integrated with HP ALM, TFS, and IBM Rational Quality Manager. It verifies, converts, and upgrades data through the ETL process. It is a commercial tool that connects source and target data and also supports real-time progress of test scenarios. Visit official site here: #3) iCEDQ

iCEDQ is an automated ETL testing tool specifically designed for the issues faced in data-centric projects like data warehouses, data migration etc. iCEDQ performs verification, validation, and reconciliation between the source and destination systems. It ensures that the data is intact after migration and prevents bad data from being loaded into the target system.
Key Features:. iCEDQ is a unique ETL Testing tool which compares millions of rows of databases or files. It helps to identify the exact row and column which contains data issue. It sends alerts and notifications to the subscribed users after execution. It supports regression testing. iCEDQ supports various databases and can read data from any database. iCEDQ connects with a relational database, any JDBC compliant database, flat files etc.
Based on unique columns in the database, iCEDQ compares the data in memory. It can be integrated with HP ALM – Test Management Tool. iCEDQ is designed for ETL Testing, Data Migration Testing and Data Quality Verification. Identifies data integration errors without any custom code. Supports rule engine for ETL process, collaborative efforts and organized QA process. It is a commercial tool with 30 days trial and provides custom reports with alerts and notifications.
Visit official site here: #4) Datagaps ETL Validator

The ETL Validator tool is designed for ETL Testing and Big Data Testing.
It is a solution for data integration projects. The testing of such data integration projects includes various data types, huge volumes, and various source platforms. ETL Validator helps to overcome such challenges using automation, which further helps to reduce cost and minimize effort. ETL Validator has an inbuilt ETL engine which compares millions of records from various databases or flat files. ETL Validator is a data testing tool specifically designed for automated data warehouse testing. Key Features: Visual Test Case Builder with drag and drop capability.
ETL Validator has features of Query Builder which writes the test cases without manually typing any queries. Compare aggregate data such as count, sum, distinct count etc.
Simplifies the comparison of database schemas across various environments, including data type, index, length, etc. ETL Validator supports various platforms such as Hadoop, XML, flat files etc. It supports email notification, web reporting etc. It can be integrated with HP ALM, which enables sharing of test results across various platforms. ETL Validator is used to check Data Validity, Data Accuracy and also to perform Metadata Testing.
Checks Referential Integrity, Data Integrity, Data Completeness and Data Transformation. It is a commercial tool with a 30-day trial; it requires zero custom programming and improves business productivity. Visit official site here: #5) QualiDI

QualiDI is an automated testing platform which offers end-to-end testing and ETL Testing. It automates ETL Testing and improves the effectiveness of ETL Testing.
It also reduces the testing cycle and improves data quality. QualiDI identifies bad data and non-compliant data very easily. QualiDI reduces the regression cycle and the effort of data validation.
Key Features: QualiDI creates automated test cases and also provides support for automated data comparison. It offers data traceability and test case traceability. It has a centralized repository for requirements, test cases, and test results. It can be integrated with HPQC, Hadoop etc. QualiDI identifies defects at an early stage, which in turn reduces cost.
It supports email notifications. It supports the continuous integration process. It supports Agile development and rapid delivery of sprints. QualiDI manages complex BI testing cycles, eliminates human error and maintains data quality. Visit official site: #6) Talend Open Studio for Data Integration

Talend Open Studio for Data Integration is an open source tool which makes ETL Testing easier. It includes all ETL Testing functionality and an additional continuous delivery mechanism. With the help of the Talend Data Integration tool, a user can run ETL jobs on remote servers with a variety of operating systems.
ETL Testing ensures that data is transformed from the source system to the target without any data loss and thereby adhering to transformation rules. Key Features:.
Talend Data Integration supports any type of relational database, flat files etc. Its integrated GUI simplifies the design and development of ETL processes. Talend Data Integration has inbuilt data connectors with more than 900 components. It detects business ambiguity and inconsistency in transformation rules quickly.
It supports remote job execution. It identifies defects at an early stage to reduce cost. It provides quantitative and qualitative metrics based on ETL best practices. Context switching is possible between ETL development, ETL testing, and ETL production environments. Real-time data flow tracking along with detailed execution statistics. Visit official site here: #7) Codoid's ETL Testing Services

Codoid's ETL and data warehouse testing service includes data migration and data validation from the source to the target system.
ETL Testing ensures that there is no data error, no bad data or data loss while loading data from source to the target system. It quickly identifies any data errors or any other general errors occurred during the ETL process. Key Features:. Codoid’s ETL Testing service ensures data quality in the data warehouse and data completeness validation from the source to the target system. ETL Testing and data validation ensure that the business information transformed from source to target system is accurate and reliable.
The automated testing process performs data validation during and post data migration and prevents any data corruption. Data validation includes count, aggregates and spot checks between the target and actual data.
The automated testing process verifies whether data types, data lengths and indexes are accurately transformed and loaded into the target system. Data quality testing prevents data errors, bad data or any syntax issues. Visit official site here: #8) Data-Centric Testing

The Data Centric Testing tool performs robust data validation to avoid any glitches such as data loss or data inconsistency during data transformation. It compares data between systems and ensures that the data loaded into the target system exactly matches the source system in terms of data volume, data type, format, etc. Key Features: Data Centric Testing is built to perform ETL Testing and data warehouse testing.
Data Centric Testing is the largest and oldest testing practice. It offers ETL Testing, data migration and reconciliation. It supports various relational databases, flat files etc. Efficient data validation with 100% data coverage. Data Centric Testing also supports comprehensive reporting. The automated process of data validation generates SQL queries, which results in a reduction of cost and effort.
It offers a comparison between heterogeneous databases like Oracle & SQL Server and ensures that the data in both systems is in the correct format. Visit official site here: #9) SSISTester

SSISTester is a framework which helps in the unit and integration testing of SSIS packages. It also helps to create ETL processes in a test-driven environment, which in turn helps to identify errors in the development process. A number of packages are created while implementing ETL processes, and these need to be tested during unit testing.
Integration tests are also "live tests". Key Features: Unit tests create and verify tests and, once execution completes, perform a clean-up job.
Integration test verifies that all packages are satisfied post execution of the unit test. Tests are created in a simple way as the user creates it in Visual Studio. Real-time debugging of a test is possible using SSISTester. Monitoring of test execution with user-friendly GUI. Test results are exported in HTML format. It removes external dependencies by using fake source and destination addresses.
For the creation of tests, it supports any .NET language. Visit official site here: #10) TestBench

TestBench is a database management and verification tool. It is a unique solution which addresses all issues related to the database. User-managed data rollback improves testing productivity and accuracy. It also helps to reduce environment downtime. TestBench reports all inserted, updated and deleted transactions performed in a test environment and captures the status of the data before and after each transaction.
Key Features: It always maintains data confidentiality to protect data. It has a restoration point for an application when a user wants to return to a specific point. It improves decision-making knowledge. It customizes data sets to improve test efficiency. It helps achieve maximum test coverage and helps to reduce time and money.
Data privacy rule ensures that the live data is not available in the test environment. Results are compared with various databases.
Results include differences in tables & operations performed on tables. TestBench analyzes the relationships between tables and maintains referential integrity between tables. Visit official site here: Some more to the list: #11) GTL QAceGen

QAceGen is specifically designed to generate complex test data, automate ETL regression suites and validate the business logic of applications. QAceGen generates test data based on the business rules defined in the ETL specification. It creates each scenario, including data generation and data validation statements.
Visit official site here: #12) Zuzena Automated Testing Service Zuzena is an automated testing service developed for data warehouse testing. It is used to execute large projects such as data warehousing, business intelligence and it manages data and executes integration and regression test suite.
It automatically manages ETL execution and result evaluation. It has a wide range of metrics which monitors QA objectives and team performance. Visit official site: #13) DbFit DbFit is an open source testing tool which is released under GPL license. It writes unit and integration tests for any database code. These tests are easy to maintain and can be executed directly from the browser. These tests are written using tables and are executed using the command line or Java IDE. It supports major databases like Oracle, MySQL, DB2, SQL Server, PostgreSQL, etc.
Visit official site here: #14) AnyDbTest

AnyDbTest is an automated unit testing tool specifically designed for DBAs or database developers. AnyDbTest writes test cases in XML and allows using an Excel spreadsheet as a source of test cases.
Standard assertions are supported, such as SetEqual, StrictEqual, IsSupersetOf, RecordCountEqual, Overlaps, etc. It supports various types of databases like MySQL, Oracle, SQL Server, etc. Testing can include more than one database, i.e. the source database can be an Oracle server and the target database into which data needs to be loaded can be SQL Server. Visit official site here: #15) 99 Percentage ETL Testing

99 Percentage ETL Testing ensures data integrity and production reconciliation for any database system. It maintains the ETL mapping sheet and validates source and target database mapping of rows and columns.
It also maintains the DB schema of the source and target databases. It supports production validation testing, data completeness, and data transformation testing. Visit official site here:

Points to Remember

While performing ETL testing, several factors have to be kept in mind by the testers.
Some of them are listed below: Apply suitable business transformation logic.
Execute backend data-driven tests. Create and execute absolute test cases, test plans, and test harness. Assure accuracy of data transformation, scalability and performance.
Make sure the ETL application reports invalid values.
Unit tests should be created against targeted standards.

Conclusion

ETL Testing is not only a tester's duty; it also involves developers, business analysts, database administrators (DBAs) and even the users.
The ETL Testing process has become vital, as it is required for making strategic decisions at regular time intervals. ETL Testing is considered Enterprise Testing, as it requires good knowledge of the SDLC, SQL queries, ETL procedures etc.
Let us know if we have missed out any tool on to the above list and also suggest the ones that you use for ETL Testing in your daily routine.
I am quite new to PlatformIO and am trying to start my own library, but have run into a problem. The library is available here: It depends on the Embedded Template Library (ETL), and I am trying to set up unit testing.
To run my test I tried: platformio ci --board=nodemcuv2 --lib='.' Test/testmain.cpp But I get an error that seems to be ETL related.

You need to create a platformio.ini file in the root of your repo:

; PlatformIO Project Configuration File
;
; Build options: build flags, source filter
; Upload options: custom upload port, speed and extra flags
; Library options: dependencies, extra library storages
; Advanced options: extra scripting
;
; Please visit documentation for the other options and examples

[env:nodemcuv2]
platform = espressif8266
board = nodemcuv2
framework = arduino
lib_deps = Embedded Template Library

[env:native]
platform = native
lib_deps = Embedded Template Library

Then run pio test in the project dir.