Monday, 10 September 2018

Ways to Analyse/Understand Quality Assurance Practices

For the first time in my learning dump I would like to share how I did a case study to understand Quality Assurance (QA) practices in my current organization. Unlike tech giants, where software engineers do both development and testing, my current org still follows a DEV & QA team hierarchy with a developer-to-QA ratio of roughly 5:1 (there might be N arguments for or against any particular ratio; let me not get into that). All that was expected of me was to understand the current practices followed in the organization. As a fairly new member of the org, it was challenging to figure out who all the QA folks were and to get some time from them (not all, though), as I could sense a slender fear factor and lots of questions when I reached out. (Back to the topic.)

To understand QA practices, I first needed to list the areas I had to cover before having a discussion with any of the QA engineers. The following are the two areas to which I wanted to scope my discussions:

  • Testing Methods
  • Testing Strategies 
Testing Methods

In testing methods I concentrated on how we are doing the testing. Initially I was in a tight corner about whether I really needed to understand or even speak about testing methods, but after speaking with a couple of folks I became firmly convinced that I should discuss the testing methodologies followed in their respective projects. The areas I covered in testing methods can be grouped into two:

  • Box Approach
  • Static & Dynamic Approach

Box Approach

One might wonder why anyone would be so interested in whether a tester is doing black-, white-, or grey-box testing. But I wanted to understand whether the QA folks grasped the real essence of what they had been testing. During my discussions, all those into automation said up front that they were doing white-box testing; on further questions such as "do we really go through the code delivered by developers?", they shifted their answer from white box to grey box. Functional test engineers, meanwhile, said it's black box and that they are moving towards white box, i.e. automation. What I understood from all this is that most QA engineers hold the assertion that doing automation means doing white-box testing.

Static & Dynamic Approach

There are more than a few hundred tools that help us ensure we are delivering a quality product, and each of these tools has its own approach to identifying bugs. Some belong to static analysis, but most belong to dynamic analysis. Just as there is a predominant skew in the number of tools available in the market, I could see the same skew in the mindset of QA engineers: most of them are focused on dynamic testing alone. Static analysis is one area we could stress so that we avoid a significant number of hidden bugs in the code.


Testing Strategies

After a brief discussion on testing methods, my next step in understanding QA practices was to dig further into all the strategies being followed. I had some areas in mind that seemed must-covers to understand testing; the following are the areas I wished to cover in my discussions:

  • Test Levels
  • Test Coverage
  • Testing Tools
  • Security Testing
  • Load / Stress Testing
  • Risks & Mitigation
  • Test Schedule
  • Regression Approach
  • Test Status Collections & Reporting
  • Test Records
  • Requirements Traceability Matrix
  • Test Summary
Test Levels

In test levels I wanted to see how we are actually engaging with our testing at the following levels:

  • Unit Testing
  • Functional Testing
  • Integration Testing
Unit Testing

The reason I wished to touch upon unit testing is that most of us were practicing grey-box or black-box testing. With no surprise, all the QA folks concentrated on other levels of testing and had a firm belief that UTs are to be handled by developers, with no need for QA to engage. My intention, however, was to see whether QA are really aware of the code coverage of the application through UTs (more about code coverage in a little while).

Functional Testing

Functional testing is the major level of testing concentrated on by all the QA folks across the org. The method or way in which that testing is carried out has been covered under Testing Methods, i.e. in the first few paragraphs of this blog.

Integration Testing

Integration testing can be viewed in two aspects: the first deals with upstream systems and the second with downstream systems. Most of the time, upstream integration issues or bugs are uncovered during functional testing. In the case of downstream systems, however, we might not be aware of the consequences of updating existing flows. There is a common view among most QA folks that it's the responsibility of the downstream system's team to take care of integration issues. Ideally, it's the responsibility of the upstream system to make sure there are no issues when upgrading the flows, or at least to inform downstream teams of any expected outage or breakage when there is a change.

Test Coverage

Test coverage is one metric that QA folks perceive as an area to be covered by developers with unit tests. In reality the metric has to be maintained by the QA team as well, and it includes dynamic coverage; not many are really aware of how to get code coverage dynamically, and this is an area that needs strong awareness.

Testing Tools

A new testing tool might be built or released by the time you are reading this blog, but however many tools are rolled out, we should do a legitimate study or analysis before adding any of them to the stack of tools and technologies already in use.

Security Testing

Not many QA folks are aware of tools that help with testing for security loopholes, or of the kinds of security issues that might pop up in their projects. I firmly believe that everyone in the organization should have a certain knowledge of security testing practices.

Load Testing / Stress Testing

A testing area that is also seen as a specialist's job. And a few so-called specialists have the presumption that the load generated is directly proportional to the number of users configured or used in the settings.

Requirement Traceability Matrix
Test Schedule
Test Records
Test Summary
Risk & Mitigations

All the above areas are becoming endangered as we have moved to the so-called agile life cycle.



Wednesday, 29 August 2018

Quick overview on Schema Change Management / Migration Tool

I have gathered and collated some information on the schema change management/migration tool Liquibase; please go through it.

Wednesday, 1 August 2018

Swagger 2 ASCII DOC Markup Converter

Recently I was given a task to document my application's API in AsciiDoc format. Thanks to Swagger, which helped me generate the API doc, as my application was built in Spring Boot. But it doesn't stop there: I needed to convert the Swagger-generated API doc to AsciiDoc.

By the time I wanted to generate my API doc, my application already had 30+ paths, each with its own CRUD REST operations, leading to a little more than 100 REST endpoints. Converting all of that JSON-format API doc to AsciiDoc by hand would have been a nightmare.

I was wondering whether I should quickly write some code to convert the JSON to AsciiDoc or handle it by some other means. Before starting anything, as with other problems, I thought for a while: is this problem specific to me, or has someone else faced it before?

As usual, the problem is generic and most of us have faced it. So I looked for the best way others had solved it, and after a while I came to know about swagger2markup, whose main objective is to convert Swagger docs into AsciiDoc, and it's simple to use.

The following is the snippet, the chisel that broke the iceberg standing in front of me:

import java.nio.file.Path;
import java.nio.file.Paths;
import io.github.swagger2markup.Swagger2MarkupConverter;

// local file where I stored my Swagger API doc
Path localSwaggerFile = Paths.get("swagger.json");
// the directory in which I need to store the AsciiDoc output
Path outputDirectory = Paths.get("build/asciidoc");

// Magic wand that did all the tricks in no time :)
Swagger2MarkupConverter.from(localSwaggerFile)
        .build()
        .toFolder(outputDirectory);


Tuesday, 24 April 2018

Gathering MetaData of A Table through the JDBC

When we are dealing with an ORM we don't even turn our eyes towards the table metadata. But one fine day you may wake up to handle tables through their metadata: the tables can be accessed only through the JDBC interfaces, and no other layer is crafted for you to do CRUD operations. Yes, I faced that very scenario, and the following are the few snippets that really helped me gather metadata from the DB schema.

The following are the few aspects of metadata I'm interested in:
  • Table organization
    • Column Name
    • Data type of a column
  • Constraints 
    • Non Null-able constraints 
    • Check Constraints 
  • Primary Key
  • Child Tables Meta-data
I had the following POJOs holding the data I required:

public class ColumnMetaData {

    private String columnName;
    private String dataType;
    private boolean nullable;
    private boolean autoIncrement;
}

public class TableMetaData {

    private String tableName;
    private Map<String, ColumnMetaData> columns;
    private String primaryKey;
    private boolean nonIDPrimaryKey;
    private Set<String> nonNullableColumns;
    private Map<String, ChildTableMetaData> childTables;
}
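The ChildTableMetaData class referenced in TableMetaData above is never shown in the post; the following is a hypothetical sketch inferred from the setters used later in the exported-keys snippet (the field names are my inference, not necessarily the author's actual class):

```java
// Hypothetical reconstruction of ChildTableMetaData, inferred from the
// setters used in the exported-keys snippet; the original class may differ.
public class ChildTableMetaData {

    private String tableName;     // child (FK-side) table name
    private String fkColumnName;  // foreign key column in the child table
    private String pkColumnName;  // referenced primary key column in the parent

    public String getTableName() { return tableName; }
    public void setTableName(String tableName) { this.tableName = tableName; }

    public String getFkColumnName() { return fkColumnName; }
    public void setFkColumnName(String fkColumnName) { this.fkColumnName = fkColumnName; }

    public String getPkColumnName() { return pkColumnName; }
    public void setPkColumnName(String pkColumnName) { this.pkColumnName = pkColumnName; }
}
```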

And the following are classes that I have used from java.sql package:

private Connection connection;
private DatabaseMetaData metadata;

The above objects are set up accordingly:

connection = jdbcTemplate.getDataSource().getConnection();
metadata = connection.getMetaData();

I'll now run through the code snippets that helped me collect the data I was interested in.

Table Organization & Nullable Constraints: 

ResultSet columnsMetaData = metadata.getColumns(null, "VIVEK", "DEMO", null); **
 
while (columnsMetaData.next()) {

    ColumnMetaData metaData = new ColumnMetaData();
    String columnName = columnsMetaData.getString("COLUMN_NAME");
    metaData.setColumnName(columnName);
    // TYPE_NAME holds the database-specific type name; DATA_TYPE is an int from java.sql.Types
    metaData.setDataType(columnsMetaData.getString("TYPE_NAME"));
    // NULLABLE is an int constant, not a boolean
    metaData.setNullable(columnsMetaData.getInt("NULLABLE") != DatabaseMetaData.columnNoNulls);
    // nonNullableColumns is processed/used in TableMetaData
    if (!metaData.isNullable()) {
        nonNullableColumns.add(metaData.getColumnName());
    }
}

Since I'm aware of exactly which data I want to read from the ResultSet, I fetched those columns directly with the typed getters. You may need to inspect the ResultSet's own metadata (ResultSetMetaData) if you are interested in something else.
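One detail worth calling out: getColumns reports NULLABLE as one of three int constants defined on DatabaseMetaData, not as a boolean. A tiny helper makes the mapping explicit; treating columnNullableUnknown as nullable below is my conservative assumption, not a JDBC rule:

```java
import java.sql.DatabaseMetaData;

public class NullabilityMapper {

    // Maps the NULLABLE int from DatabaseMetaData.getColumns() to a boolean.
    // Only columnNoNulls definitively forbids nulls; columnNullableUnknown
    // is treated as nullable here (a conservative assumption).
    public static boolean isNullable(int jdbcNullable) {
        return jdbcNullable != DatabaseMetaData.columnNoNulls;
    }
}
```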

Primary Key:

ResultSet tablePrimaryKey = metadata.getPrimaryKeys(null, "VIVEK", "DEMO"); **

while (tablePrimaryKey.next()) {

    primaryKey = tablePrimaryKey.getString("COLUMN_NAME");
    log.debug("{} is primary key for the table {}", primaryKey, table);

    // as we don't support composite primary keys, take the first column only
    break;
}


Child Table MetaData:
 
ResultSet exportedKeys = metadata.getExportedKeys(null, "VIVEK", "DEMO"); **

Map<String, ChildTableMetaData> childTablesMetaData = new HashMap<>();
while (exportedKeys.next()) {

    ChildTableMetaData childTableMetaData = new ChildTableMetaData();
    String childTableName = exportedKeys.getString("FKTABLE_NAME");
    childTableMetaData.setTableName(childTableName);
    childTableMetaData.setFkColumnName(exportedKeys.getString("FKCOLUMN_NAME"));
    childTableMetaData.setPkColumnName(exportedKeys.getString("PKCOLUMN_NAME"));
    childTablesMetaData.put(childTableName, childTableMetaData);
} 


** In these snippets "VIVEK" is the schema I'm connecting to and "DEMO" is the table name for which I'm collecting the data.

Friday, 20 April 2018

JDBC ResultSet to JSON transformation.


With a bunch of ORM frameworks out there (especially for JVM languages), and each of us sticking to our favorite ORM in the applications we develop, whenever there is a need to handle data at the JDBC level even small stuff like converting a ResultSet to JSON seems to be a complex task. Here I'll give a gist of how to convert a ResultSet to a JSON object.

While querying data through JDBC we look for either one tuple or a list of tuples (i.e. one or N rows). In other words, we query for either a Map (key representing the column name and value representing the actual value in the table) or a List of Maps. In technical terms we would invoke queryForMap or queryForList. The result can then be transformed into a Map, from which we can easily produce a JSON object.

The following is the code I implemented to convert the result list to JSON:

List<Map<String, Object>> mapperList = new ArrayList<>();
// result of queryForList(...), which is already a List of Maps
List<Map<String, Object>> transformObject = (List<Map<String, Object>>) resultSet;

// the result might be empty, so validate it before processing
if (transformObject.isEmpty()) {
    log.warn("No Results found");
    throw new NoEntityFoundException("Data not found");
}

// iterate through each row in the result (basically a Map)
transformObject.forEach(result -> {

    Map<String, Object> transformMap = new HashMap<>();
    transformData(result, transformMap);
    mapperList.add(transformMap);
});
// print the transformed data
System.out.println(objectMapper.writeValueAsString(mapperList));
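The transformData helper used above is from my code base and isn't shown; as a rough sketch of what such a helper might do (my assumption about its intent, not the original implementation), it could copy each column across while stringifying java.sql temporal types so they serialize predictably:

```java
import java.sql.Timestamp;
import java.util.Map;

public class RowTransformer {

    // Hypothetical sketch of a transformData helper: copies each column
    // into the target map, converting java.sql temporal types to strings
    // so the JSON output stays predictable. The real helper may do more.
    public static void transformData(Map<String, Object> row,
                                     Map<String, Object> target) {
        for (Map.Entry<String, Object> entry : row.entrySet()) {
            Object value = entry.getValue();
            if (value instanceof Timestamp || value instanceof java.sql.Date) {
                target.put(entry.getKey(), value.toString());
            } else {
                target.put(entry.getKey(), value);
            }
        }
    }
}
```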


The following is the ObjectMapper configuration:

ObjectMapper objectMapper = new ObjectMapper();
//as we don't need to send NULL values in the JSON response
objectMapper.setSerializationInclusion(JsonInclude.Include.NON_NULL);
objectMapper.configure(SerializationFeature.WRITE_NULL_MAP_VALUES, false);