The Benefits of Software Composition Analysis

Software Composition Analysis

It was only a few years ago that IT departments were able to handle vulnerabilities manually. Today, disruptive technologies and dynamic approaches to development have changed the digital landscape. Security is no longer the sole responsibility of the IT department. The borders of the security perimeter have blurred, obscuring the visibility of digital spaces.

10 Tips for a Successful Cloud Migration


Migrating applications, data, and other workloads to the cloud is now commonplace among businesses of all sizes. A 2018 report on cloud computing trends revealed that 92 percent of companies now use public cloud services.
However, cloud migration remains a complex undertaking with several challenges, and getting it right is important. Read on to find out the challenges of cloud migration and ten tips for building a smart cloud migration plan.

Understanding Database Cloud Migration

Cloud computing is the new normal. Every time you use a service or an application through an internet connection, you are using the cloud. Since most companies nowadays host part or all of their databases in the cloud, you may wonder what the best way to do it is. Migrating your database to the cloud can help you manage your workload by making your data available and easily scalable. Read on to learn about database cloud migration and tips to do it right.

Average disk access read/write time in DB2

Through DB2 snapshot monitoring we can get the average disk access time, in milliseconds, that a DB2 instance is experiencing. These times are crucial for detecting an I/O problem with the DB2 instance.

As a rule of thumb, a value close to 2-3 ms is good, while more than 10 ms can indicate problems.

Avg ms/write:

select trunc(decimal(sum(pool_write_time)) /
       decimal(sum(pool_data_writes) + sum(pool_index_writes)), 3)
from sysibmadm.snaptbsp

 

Avg ms/read:
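The read query is missing from the source; by analogy with the write query above, a sketch against the same SYSIBMADM.SNAPTBSP snapshot view would divide the accumulated pool read time by the total number of physical data and index reads (the column names below follow the DB2 snapshot monitor naming, but verify them against your DB2 version):

select trunc(decimal(sum(pool_read_time)) /
       decimal(sum(pool_data_p_reads) + sum(pool_index_p_reads)), 3)
from sysibmadm.snaptbsp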

DB2 Write Suspend

When taking a snapshot from a storage array, if the server is running a DB2 instance, there is no certainty that the snapshot contains a consistent copy of the database.

To take a snapshot and ensure a consistent copy, DB2 allows putting the database into "write suspend" mode: disk writes are suspended and the database works from the buffer pool in memory. Queries continue to run, but writes are performed only in memory until writes are resumed.
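As a sketch, the sequence on the DB2 command line would look like this (SET WRITE SUSPEND and SET WRITE RESUME are standard DB2 commands, but the database name MYDB is a placeholder):

# Connect to the database (MYDB is a placeholder name)
db2 connect to MYDB

# Suspend write operations to disk; the database keeps
# serving queries and buffers writes in memory
db2 set write suspend for database

# ... take the storage-array snapshot here ...

# Resume normal write activity once the snapshot completes
db2 set write resume for database

Note that a database copy taken from such a snapshot typically needs to be initialized with db2inidb (for example, db2inidb MYDB as snapshot) before it can be used.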

How to know the DB2 connection port

There may be other methods, but this short article shows a simple way to find the port that a DB2 server listens on.

First, we get the name of the TCP/IP service:

> db2 get dbm cfg | grep SVCENAME

Capture the result:

TCP/IP Service name (SVCENAME) = db2TRP

Look at /etc/services:

> cat /etc/services | grep db2TRP

db2TRP 5912/tcp # DB2 Communication Port

 

The listening port is 5912!

Using Machine Learning/AI to Boost the Supply Chain: 5 Use Cases

This article will discuss how supply chains are being improved through the use of innovative technologies before highlighting five uses of artificial intelligence and machine learning in supply chains.

When you finish reading, you’ll understand why many industry analysts have described AI technologies as disruptive innovations with the potential to alter and improve operations across entire supply chains.

Open Source for Big Data: An Overview

This article will describe the relevance of open source software to big data before describing five interesting and useful open source big data tools and projects.

Big data workloads are those that involve the processing, storage, and analysis of large amounts of unstructured data to derive business value from that data. Traditional computing approaches and data processing software weren’t powerful enough to cope with big data, which typically inundates organizational IT systems on a daily basis.

The widespread adoption of big data analytics workloads over the past few years has been driven, in part, by the open source model, which has made frameworks, database programs, and other tools available to use and modify for those who want to delve into these big data workloads.