Infosec Train’s SOC Analyst Training Program

A Security Operations Center (SOC) is a centralized unit within an organization that houses the cybersecurity professionals who analyze, monitor, design, and manage the organization’s security posture. SOC teams are made up of security managers, security engineers, and SOC analysts. This team is responsible for protecting the organization’s infrastructure from cyberthreats.

Recent reports have revealed an increase in cyberattacks and data breaches. These incidents have caused significant financial and reputational damage for businesses. Organizations require human experts to monitor security infrastructure and identify ongoing or potential security threats.
This section will cover everything you need to know about Infosec Train’s new SOC Analyst training program. Before we move on, let’s first understand the job description of the SOC Analyst.
What is a SOC analyst?
SOC Analysts are the first line of defense. They monitor systems and respond to security incidents. There are three levels of SOC analysts: Tier 1, Tier 2, and Tier 3.
Tier 1: Tier 1 or L1 SOC analysts are triage specialists. They monitor incoming incidents and determine their severity and urgency. They raise trouble tickets to alert Tier 2 SOC analysts, and they also manage and configure the security monitoring tools.
Tier 2: These security analysts respond to security threats. They are responsible for identifying infected systems and determining the extent of the attack. They are responsible for collecting data and formulating remediation and recovery strategies.
Tier 3: Tier 3 or L3 SOC analysts review vulnerability assessment reports. They use advanced threat intelligence techniques to identify security threats in the system. They can also conduct penetration tests to identify vulnerable endpoints within the network.
Infosec Train’s SOC Analyst training program
InfoSec Train’s SOC Analyst training program has been carefully designed by industry veterans and experts. It is designed to equip Tier 1, Tier 2, and Tier 3 SOC analysts with the skills and knowledge necessary to perform successful SOC operations. It begins at the intermediate level and guides you through advanced digital forensics and incident response concepts. You will gain hands-on experience with the most recent tools and technologies used by SOC analysts to combat advanced cyber threats.
This comprehensive training program will allow you to:
Understand the SOC operations, workflows, and processes necessary to build a successful SOC team
Gain exposure to a variety of SOC tools, including the ELK Stack, IBM QRadar, Splunk, AlienVault OSSIM, and many others
Strengthen your digital forensics concepts, including live forensics and post-incident investigation
Interpret operational, strategic, and tactical threat intelligence
Learn how to deal with advanced persistent threats
Major tools covered in this course
The following infographic shows the most commonly used SOC analysis tools that you will learn during the training course.

Domains of the training program
These are the four domains you will learn during this training course.
1. Security Operations Center
2. Digital Forensics
3. Incident Response
4. Threat Intelligence
Here are the details of each domain and the tool exposure each provides:
Domain 1: Security Operations Center

This domain provides deep insight into security operations center functions and how to build a successful SOC team. It will help you understand Security Information and Event Management (SIEM), which is the heart of a SOC team, and it gives detailed information about SIEM architecture and guidelines. Other subtopics include:
Introduction to QRadar
Splunk in depth
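To make the SIEM idea concrete: at its core, a SIEM correlates individual log events into alerts. The toy Python sketch below illustrates that concept only; it is not code from QRadar or Splunk, and the event field names and threshold are assumptions made for the example.

```python
from collections import defaultdict

def failed_login_sources(events, threshold=3):
    """Toy SIEM-style correlation rule: flag source IPs with a
    brute-force pattern (threshold or more failed logins).
    The "src_ip"/"event" fields are an invented schema."""
    counts = defaultdict(int)
    for e in events:
        if e.get("event") == "login_failed":
            counts[e["src_ip"]] += 1
    return {ip for ip, n in counts.items() if n >= threshold}

# Sample event stream: three failures from one IP, one success from another.
events = [
    {"src_ip": "10.0.0.5", "event": "login_failed"},
    {"src_ip": "10.0.0.5", "event": "login_failed"},
    {"src_ip": "10.0.0.5", "event": "login_failed"},
    {"src_ip": "10.0.0.9", "event": "login_ok"},
]
suspicious = failed_login_sources(events)
```

Real SIEM platforms express the same idea through their own query and rule languages (for example, Splunk searches or QRadar rules) over far richer, normalized event data.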

Infosec Train’s Offensive Cyber Security Engineer (OCSE) Training Program

Infosec Train’s Offensive Cyber Security Engineer (OCSE) training is a well-designed program for offensive cybersecurity professionals. The course is designed to equip candidates with advanced ethical hacking and penetration testing skills. The OCSE program starts with intermediate cybersecurity concepts and moves on to advanced penetration testing, system hacking, exploit development, and network security.

Skills you will learn during the OCSE training course
Innovative concepts of ethical hacking and efficient information security management
Writing your own custom code and exploits
A thorough understanding of Windows and Linux environments
A higher-level understanding of corporate infrastructure
Architecting security infrastructure and a framework for secure IT operations
Domains covered in the OCSE training
OCSE training combines the following training programs:
Certified Ethical Hacker (CEH) Training Program
MITRE ATT&CK Training Course
Advanced Penetration Testing (APT) Training
ISO 27001 Fundamentals Training
Security Fundamentals: This section covers the modules of CEH as well as CEH practical. This section will teach the fundamentals of cybersecurity and familiarize the candidate with the attack methods used by hackers and other offensive security professionals. Candidates will gain an in-depth knowledge of:
Reconnaissance and Footprinting
Scanning networks, vulnerability analysis, and enumeration
System hacking, malware threats, sniffing, and social engineering attacks
Session hijacking, SQL injection, and Denial of Service (DoS) attacks
Hacking web applications, wireless networks and mobile platforms, IoT and web servers
Evading IDS, firewalls, and honeypots
Cryptography and the fundamentals of cloud computing
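As an illustration of the network-scanning fundamentals listed above, the following Python sketch performs the simplest form of TCP port check, the “connect scan” that tools like Nmap automate at scale. It is a minimal teaching example (only probe hosts you are authorized to test), not a replacement for a real scanner.

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """TCP connect probe: True only if the port accepts a connection."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        # connect_ex returns 0 on success and an errno value on failure,
        # so a closed or refused port yields a nonzero result.
        return s.connect_ex((host, port)) == 0

# Example: probe a few common ports on the local machine.
open_ports = [p for p in (22, 80, 443) if is_port_open("127.0.0.1", p)]
```

A full connect scan is noisy and easily logged, which is why the course also covers stealthier techniques and IDS evasion.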
This section also gives you hands-on experience with some of the most popular ethical hacking tools. Here’s a list of tools covered in this section:

Infosec Train’s Advanced Penetration Testing (APT) training: Candidates will learn advanced concepts such as how to exploit network security weaknesses, test intrusion detection, and respond to threats. This training will teach you how to detect and remediate these vulnerabilities. Candidates will gain a deep understanding of:
Planning and defining the scope for a penetration test
Information gathering and vulnerability identification
Diverse attacks and exploits
Active directory penetration
Communication and reporting
Here’s a list of the tools that are covered in this section.

MITRE ATT&CK Red Teaming: Infosec Train’s MITRE ATT&CK course aims to provide knowledge about the tactics and techniques used by adversaries in cyberattacks. Understanding the MITRE ATT&CK framework is beneficial for both offensive and defensive teams. Candidates will learn about:
MITRE ATT&CK framework
MITRE ATT&CK Matrix and MITRE ATT&CK Navigator
Caldera Testing
Atomic Red Team tests for MITRE ATT&CK
Here’s a list of the tools that are covered in this section.

Exploit Development: This section will cover the basics of exploit creation. Candidates will learn how to think outside the box and create their own exploits. There are many situations where an offensive security professional must write a custom exploit to bypass security controls, so developing this skill is essential. Candidates will learn:
Linux fundamentals
Linux stack overflow vulnerabilities
Stack overflow exploitation
Linux exploit mitigations against stack overflow exploitation
Return-oriented programming (ROP)
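Exploit developers typically script their payloads in Python. As a hedged illustration of the stack-overflow topics above, the sketch below builds the classic payload shape: padding to fill a vulnerable buffer, followed by a little-endian address that would overwrite the saved return pointer. The 64-byte offset and the 0xdeadbeef address are made-up values for demonstration, not a working exploit.

```python
import struct

# Hypothetical target layout: a 64-byte stack buffer, then the saved
# return address. Both values below are illustrative placeholders.
BUFFER_OFFSET = 64
FAKE_RET_ADDR = 0xdeadbeef  # not a real gadget or shellcode address

def build_overflow_payload(offset: int = BUFFER_OFFSET,
                           ret: int = FAKE_RET_ADDR) -> bytes:
    padding = b"A" * offset             # saturate the vulnerable buffer
    ret_bytes = struct.pack("<I", ret)  # little-endian 32-bit address
    return padding + ret_bytes

payload = build_overflow_payload()
```

On a real target, mitigations such as stack canaries, ASLR, and non-executable stacks (covered in this section) are exactly what make this naive payload fail, which is what motivates techniques like ROP.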

Information Technology vs. Information Security

Many people believe that Information Security is the same thing as Information Technology, and that their I.T. team therefore handles the entire cybersecurity of their network. This is a common misconception, since Information Technology focuses more on technology than on security. This blog will discuss the distinction between Information Technology and Information Security.
Table of Contents
Information Technology
The Importance of Information Technology
Information Security
Difference between Information Technology and Information Security

Information Technology
Information Technology (I.T.) is the creation, storage, security, sharing, and processing of all types of electronic data using networking, computers, and other infrastructure. In this context, I.T. refers not to technology used for personal or leisure purposes but to technology used in corporate activities, such as telecommunications and computing.
Information Technology can be described as the use of technology on a large scale to solve commercial or organizational problems. Members of I.T. departments work together to solve technical problems of all sizes.
The following are the key responsibilities of an Information Technology department:
I.T. governance: The combination of policies and practices that ensures I.T. systems work well and stay in line with the organization’s requirements.
I.T. operations: The day-to-day activities that an I.T. department performs. This category includes technical support, network maintenance, security testing, and device management.
Hardware and infrastructure: The physical components of I.T. infrastructure. This pillar includes routers, servers, phone systems, and laptop maintenance.
Information Technology’s Importance
Information Technology can increase productivity: well-developed Information Technology allows you to do more in less time.
I.T. provides benefits for businesses such as faster communication and the ability to electronically store and secure important documents.
Technology advancements are fueled both by changes in market conditions and by fundamental shifts in computer hardware.
I.T. offers electronic storage systems.
Information Technology includes code/programming, data transmission, data conversion, data storage and retrieval, and systems analysis, design, and control.
These can be used in conjunction with technology to gather, process, and deliver data.
I.T. infrastructure makes remote working easy, allowing employees to work from home or other locations.
Information Security
Information Security is about more than just preventing unauthorized access. Its primary goal is to protect information from illegal access, use, and disclosure. That information can be stored in physical or electronic form, and it could include personal information such as your social media profile details, mobile phone data, and even biometrics. Information Security spans many disciplines, including cryptography, mobile computing, and cyber forensics.
Information Security has three goals:
Confidentiality: Confidentiality protects information from unauthorized use or disclosure. It guarantees that only authorized persons can access the information, and it is closely related to the privacy of personal data.
Integrity: Integrity ensures that a system and its data are protected from unauthorized modification, so the information remains accurate and trustworthy.
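In practice, integrity is commonly verified by comparing cryptographic hashes: if even one byte of the data changes, its digest changes, so a stored digest detects unauthorized modification. A minimal sketch using Python’s standard hashlib (the sample data is invented for illustration):

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """SHA-256 digest of the data as a 64-character hex string."""
    return hashlib.sha256(data).hexdigest()

# Record a digest of the original data...
original = b"quarterly-report-v1"
stored_digest = sha256_hex(original)

def is_unmodified(data: bytes, digest: str) -> bool:
    """True if the data still matches the previously stored digest."""
    return sha256_hex(data) == digest
```

The same idea underlies file integrity monitoring and digitally signed software downloads.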

Information gathering using Recon-ng Tool

Recon-ng is a full-featured, Python-based web reconnaissance framework. It offers a powerful environment for open-source, web-based reconnaissance, with independent modules, database interaction, convenience functions, interactive help, and command completion. Recon-ng looks similar to the Metasploit Framework, which makes the framework easier to pick up. It is, however, quite different: Recon-ng is not meant to compete with existing frameworks, as it is designed exclusively for web-based open-source reconnaissance. If you want to exploit, use the Metasploit Framework. If you want to social engineer, use the Social-Engineer Toolkit. If you want to conduct reconnaissance, use Recon-ng.

To start Recon-ng under Kali Linux, type the following in the terminal:
recon-ng

To create a workspace, type
workspaces add pen_test

To add a domain about which you wish to gather information, type
add domains <domain>
To verify that the domain has been successfully added, type
show domains

Now check the available modules.
Command:
show modules
A module is a task that recon-ng will perform based on the parameters you give it. The Recon category has the most modules.

Search the domain to find contact information:
use recon/domains-contacts/whois_pocs
show options (this will show the SOURCE option)
run (contacts and email addresses will be displayed)

To search an account for evidence of compromise:
use recon/contacts-credentials/hibp_breach
This module searches the Have I Been Pwned? (HIBP) database to determine whether an email account has been affected by any major breaches in the last few years.
set SOURCE <email address> (insert an email address found in the previous step to check whether it has been compromised)

Identify your organization’s social media presence:
use recon/profiles-profiles/profiler
set SOURCE comptia (here, the source is the domain name without the top-level domain suffix)

You can also use different modules to gather information about your organization:
DNS records for identifying the organization’s mail servers
Searching for subdomains
Command:
Finally, you can generate a report of your findings.
Command:
use reporting/html
show options
set CREATOR <your name>
set CUSTOMER <client name>
set FILENAME /root/desktop/recon_report.html
run
Double-click recon_report.html to open the report.
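Putting the steps above together, a complete session might look like the following sketch. It assumes the console syntax used in this article; the workspace name, target domain, email address, and report details are placeholders:

```
recon-ng                                    # launch from the Kali terminal
workspaces add pen_test                     # create a workspace
add domains example.com                     # add a target domain
show domains                                # verify it was added
use recon/domains-contacts/whois_pocs       # harvest WHOIS points of contact
show options
run
use recon/contacts-credentials/hibp_breach  # check harvested emails against HIBP
set SOURCE user@example.com
run
use reporting/html                          # write the findings to an HTML report
set CREATOR "Your Name"
set CUSTOMER "Client Name"
set FILENAME /root/desktop/recon_report.html
run
```

Note that newer Recon-ng releases have reorganized some of these commands, so check the built-in help if a command is not recognized.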
theHarvester, a tool we discussed in an earlier article, can be used to collect additional information such as email addresses and host information. Stay tuned for more information on cybersecurity.

Informatica Launches Data Lake Management Solution, Enhances AWS Integrations

Informatica is a provider of cloud-based data management solutions for enterprises. The company has expanded its toolset to help organizations manage large amounts of data in Amazon Web Services (AWS). Its new Governed Data Lake Management Solution launched on Monday, coinciding with the start of the AWS re:Invent conference, which is now a three-week virtual event. Informatica’s solution is designed to help organizations manage their AWS data lakes. It includes features for data management, data cataloging, and setting and enforcing privacy rules. The Governed Data Lake Management Solution will allow users to:

  • Intelligent Data Cataloging allows you to discover and organize data assets in your enterprise, automatically curate and augment metadata with business context, and infer relationships with lineage and other information.
  • Cloud-Native Data Integration allows you to quickly and efficiently create data pipelines and migrate data workloads from on-premises to Amazon S3 data lakes.
  • Define and enforce data privacy policies to ensure trusted data delivery across the enterprise and compliance with regulations such as GDPR and CCPA.
  • Automatically identify and fix data quality problems, ensuring data lake users have access to trusted, clean data.

Informatica’s metadata-driven, intelligent cloud data management capabilities allow organizations to realize the promise of cloud data warehouses and data lakes on AWS by automating data delivery, enabling them to accelerate innovation. Informatica also announced AWS-related enhancements to its Informatica Intelligent Cloud Services (IICS) product, which is described as a “next generation iPaaS,” or Integration Platform as a Service. These improvements are designed to help Informatica customers manage their AWS data assets.

  • Ability to perform serverless extract transform and load (ETL).
  • Amazon S3 data can be managed “code-free” using the new Amazon Athena connector.
  • A new feature called “pushdown optimization” that allows for seamless data processing.

“With our longstanding relationship with AWS, we enable our customers across all industries, public and private, to derive greater value out of their data and ultimately drive improved business outcomes,” stated Rik Tamm-Daniels, Informatica’s head of Strategic Ecosystems and Technology, in a prepared statement.

AWS snags former Netflix Cloud Guru for VP role

Amazon Web Services (AWS) announced Monday that Adrian Cockcroft has been named its Vice President of Cloud Architecture. The appointment was announced by CTO Werner Vogels.
Cockcroft’s six-year tenure at Netflix is a highlight of his resume. He served first as the director of its Web Engineering team, then as its Cloud Architect. Cockcroft was responsible for the company’s transition to a fully cloud-native architecture based on AWS public clouds, a seven-year-long process that began in 2008 and ended earlier this year.
Cockcroft was also a Technology Fellow at Battery Ventures, where he advised portfolio companies on technology and cloud migration plans.
He has also held engineering positions with Sun Microsystems, Cambridge Consultants, and eBay.
Cockcroft is also a member of AWS’ Community Heroes program, which recognizes “AWS experts that go above and beyond to share their knowledge through social media, blogs, events, user groups, and workshops.”
Cockcroft stated in a prepared statement that he had worked closely with AWS for seven years and was thrilled to join the cloud computing leader. “The current state of the art in infrastructure, software packages, and services is a combination of AWS and open-source tools, all available to everyone. This democratization of access to technology means everyone can learn and compete to be the best.”
Vogels stated that Cockcroft will work closely with AWS executives and product groups and consult with customers about their cloud architectures, whether they are startups that were born in the cloud, large Web-scale enterprises, or enterprises with an ‘all in’ migration strategy. Adrian will also engage with developers in the Amazon-sponsored and supported open-source communities.

AWS Reduces Pricing for AppStream 2.0 Educational Customers

Amazon Web Services’ desktop application streaming solution can now be used by educational institutions at a steeply discounted price.
AppStream 2.0, a managed service, was launched at last year’s AWS re:Invent conference. It allows customers to stream desktop applications over the cloud to any HTML5-capable browser on a laptop or desktop computer, regardless of whether it runs Windows, Linux, or macOS. According to the service’s FAQ, AppStream 2.0 can stream any app that runs on Windows Server 2012 R2 64-bit.
AWS has positioned AppStream 2.0 for education users, allowing schools to offer students of all ages access to creative and educational applications, such as Adobe Photoshop or specialized STEM software programs.
AWS announced Wednesday that it would reduce the monthly per-user cost of AppStream 2.0 for qualified educational users by 89 percent. The previous monthly per-user fee for AppStream 2.0 was $4.19, a cost driven by Microsoft’s Remote Desktop Services Subscriber Access License (RDS SAL) for Windows Server.
The monthly per-user fee for AppStream 2.0 is now $0.44. AWS announced that the price reduction will make it easier for educators to integrate AppStream 2.0 into their classrooms.
To request the discounted price, education customers may fill out a request form.
AppStream 2.0 customers also pay for the compute resources they use. This includes per-hour instance fees, which range from $0.10 an hour to $8.20 an hour for more graphics-intensive workloads.
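For a rough sense of what these numbers mean in practice, here is a small Python sketch that combines the figures above (the discounted $0.44 monthly per-user fee plus per-hour instance charges). The class size and usage hours are invented for illustration:

```python
# Discounted AppStream 2.0 monthly per-user fee for education (from the article).
USER_FEE = 0.44

def monthly_cost(users: int, hours_per_user: float, hourly_rate: float) -> float:
    """Estimated monthly bill: per-user fee plus streamed instance-hours."""
    return users * (USER_FEE + hours_per_user * hourly_rate)

# Hypothetical classroom: 30 students, 20 streaming hours each,
# on the cheapest ($0.10/hour) instance tier.
cost = monthly_cost(30, 20, 0.10)  # roughly $73 for the month
```

At the old $4.19 per-user fee the same classroom would have cost over $110 more per month in user fees alone, which is the gap the discount closes.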

AWS Reduces the Price of CloudEndure Disaster Recovery

Amazon Web Services (AWS) announced this week a significant price drop of around 80 percent for its CloudEndure disaster recovery service, which it acquired last year.
CloudEndure Disaster Recovery promises fast and low-cost recovery for your organization’s servers and databases, no matter if they are on-premises or cloud-based.
CloudEndure Disaster Recovery replicates your entire system (including the operating system, system configuration, databases, applications, and files) to a low-cost staging area in your AWS account. If there is a disaster, CloudEndure Disaster Recovery can automatically launch thousands of your machines in a fully provisioned state.
This process eliminates the need to purchase additional software licenses or duplicate infrastructure. It also reduces compute costs by up to 95 percent, according to Jeff Barr, AWS evangelist, in a Monday blog post.
Barr reports that AWS is sweetening things even more with an 80 percent price drop, which brings CloudEndure’s cost to “$0.028 an hour, or about $20 per server per month.”
The company is also changing the CloudEndure billing model, moving it from a contract-based to a usage-based model. According to the AWS announcement, customers no longer have to sign a contract obligating them to pay for a certain number of servers. Instead, they can pay for their hourly usage of servers.
The company stated that by moving from contract-based billing to usage-based billing, it aligns with the AWS consumption model that offers customers greater flexibility.

AWS Announces 51st Round of EC2 Price Cuts

Amazon Web Services Inc. (AWS) announced the latest round of price reductions for its Elastic Compute Cloud (EC2) service. According to a blog post by Jeff Barr, Tuesday’s price cuts represent the 51st price reduction for AWS. The following EC2 instances are affected:

  • C4 and M4 running Linux are now 5 percent cheaper in all AWS regions, except South America.
  • R3 running Linux: These instances now have lower On-Demand, Reserved, and Dedicated host prices in all AWS regions. Additionally, On-Demand and Reserved R3 instances running Linux in AWS GovCloud are 5 percent cheaper.

Barr reports that smaller reductions are available for the same instance types running Windows, SLES, and RHEL in the mentioned regions. He notes that Tuesday’s announcement is just one of a series of moves AWS has made to reduce cloud costs. The company released its AWS Price List API in December, which gives users detailed pricing information for their various AWS services; Barr stated that the API will be updated to reflect Tuesday’s price reductions. AWS also recently released the T2.nano instance type, the cheapest EC2 instance type, which is intended for low-traffic workloads.

AWS Reduces EC2 Instance Price by Up to 21 Percent

Amazon Web Services (AWS) announced Wednesday its 61st round of price cuts, which will lower the cost of some EC2 Reserved Instances by up to a fifth.
“Convertible Reserved Instances” will be available immediately for between 5 and 21 percent less. These three-year terms allow customers to switch between instance types throughout their contracts. According to AWS evangelist Jeff Barr, the discount applies to the C4, R4, I3, X1, and T2 instance families.
Here’s an example of how the discount might affect certain instances:
[Image: Price reductions for Convertible Reserved Instances in selected instance families/regions. Source: Jeff Barr/AWS.]
The company will also offer a “no upfront payment” option to customers who rent Reserved Instances on three-year terms. This option, which allows customers to pay for their Reserved Instances in monthly installments rather than in full or with a partial payment, was previously only available to customers with one-year contracts. Customers with three-year C4, M4, and R4 Standard Reserved Instances contracts can now take advantage of this installment plan.
“Our customers use multiple strategies when purchasing and managing their Reserved Instances. Some prefer to make an upfront payment to get a larger discount; others prefer to pay nothing upfront to get a smaller, but still significant, discount; still others are happiest with a partial upfront payment and a discount between the two options. To accommodate this wide range of preferences, we are adding 3-Year No Upfront Standard Reserved Instances for most of the current-generation instance types,” Barr stated.
Additionally, these no-upfront Reserved Instances will be available at a lower cost for both one-year and three-year contracts. Below is a sample of how this discount will impact customers:
[Image: Estimated price cuts for Reserved Instance contracts with no upfront payments in selected instance families/regions. Source: Jeff Barr/AWS.]
AWS is also reducing the price of its general-purpose M4 instance family by up to 7 percent for Linux.