You are subscribed to National Cyber Awareness System Current Activity for
Cybersecurity and Infrastructure Security Agency. This information has recently
been updated, and is now available.
CISA and the Federal Bureau of Investigation (FBI) continue to respond to
the recent supply-chain ransomware attack leveraging a vulnerability in Kaseya
VSA software against multiple managed service providers (MSPs) and their
customers. CISA and FBI strongly urge affected MSPs and their customers to
follow the guidance below.
CISA and FBI recommend affected MSPs:
Contact Kaseya at support@kaseya.com
with the subject “Compromise Detection Tool Request” to obtain and run
Kaseya’s Compromise Detection Tool available to Kaseya VSA customers. The
tool is designed to help MSPs assess the status of their systems and their
customers’ systems.
Enable and enforce multi-factor authentication (MFA) on
every single account that is under the control of the organization, and—to
the maximum extent possible—enable and enforce MFA for customer-facing
services.
Implement allowlisting to limit communication with
remote monitoring and management (RMM) capabilities to known IP address
pairs, and/or
Place administrative interfaces of RMM behind a virtual
private network (VPN) or a firewall on a dedicated administrative network.
CISA and FBI recommend MSP customers affected by this attack take immediate
action to implement the following cybersecurity best practices. Note: these actions
are especially important for MSP customers who do not currently have their RMM
service running due to the Kaseya attack.
CISA and FBI recommend affected MSP customers:
Ensure backups are up to date and stored in an easily
retrievable location that is air-gapped from the organizational network;
Revert to a manual patch management process that
follows vendor remediation guidance, including the installation of new
patches as soon as they become available;
Implement:
Multi-factor authentication; and
Principle of least privilege on key network resources and admin accounts.
Resources:
CISA and FBI provide these resources for the reader’s awareness. CISA
and FBI do not endorse any non-governmental entities nor guarantee the accuracy
of the linked resources.
For indicators of compromise, see Peter Lowe’s GitHub
page REvil
Kaseya CnC Domains. Note: due to the urgency to share this information, CISA and FBI
have not yet validated this content.
Special thanks to @Yuval Naor, @Yaron Fruchtmann, and @Batami Gold, who made
all this possible.
Why should you care?
Cross-source detection: Normalized Authentication analytic rules work across
sources, on-prem and cloud, now detecting attacks such as brute force or
impossible travel across systems including Okta, AWS, and Azure.
Source-agnostic rules: Process event analytics support any source that a
customer may use to bring in the data, including Defender for Endpoint,
Windows Events, and Sysmon. We are ready to add Sysmon for Linux and WEF once
released!
EDR support: Process, Registry, Network, and Authentication events constitute
the core of EDR event telemetry.
Ease of use: The Network Schema introduced last year is now easier to use
with a single-click ARM template deployment.
Join us to learn more about the Azure Sentinel Information Model in two webinars:
The Information Model: Understanding Normalization in Azure Sentinel
Deep Dive into Azure Sentinel Normalizing Parsers and Normalized Content
Why normalization, and what is the Azure Sentinel
Information Model?
Working with various data types and tables together presents a challenge.
You must become familiar with many different data types and schemas, write
and use a unique set of analytics rules, workbooks, and hunting queries for
each, even for those that share commonalities (for example, DNS servers).
Correlation between the different data types necessary for investigation and
hunting is also tricky.
The Azure Sentinel Information Model (ASIM) provides a seamless experience
for handling various sources in uniform, normalized views. ASIM aligns with
the Open-Source Security Events Metadata (OSSEM) common
information model, promoting vendor agnostic, industry-wide normalization.
ASIM:
Allows source-agnostic content and solutions
Simplifies analyst use of the data in Sentinel workspaces
The current implementation is based on query-time normalization using KQL
functions and includes the following:
Normalized schemas cover standard sets of predictable event types that are
easy to work with and build unified capabilities on. A schema defines which
fields should represent an event, a normalized column-naming convention, and
a standard format for the field values.
Parsers map existing data to the normalized schemas. Parsers are implemented
using KQL functions.
Content for each normalized schema includes analytics rules, workbooks,
hunting queries, and additional content. This content works on any normalized
data without the need to create source-specific content.
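To make this concrete, here is a hypothetical query-time parser (illustrative only, not an official ASIM parser) that projects Windows Security process-creation events into ASIM-style normalized column names:

```kusto
// Hypothetical KQL parser: maps Security event 4688 (process creation)
// to ASIM-style normalized columns. Column names are illustrative.
SecurityEvent
| where EventID == 4688
| project
    TimeGenerated,
    EventType = "ProcessCreated",
    DvcHostname = Computer,
    ActorUsername = SubjectUserName,
    TargetProcessName = NewProcessName,
    ActingProcessName = ParentProcessName
```

Saved as a KQL function, a parser like this lets analytics rules and hunting queries target the normalized columns instead of each source's native schema.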
The pandemic has permanently changed how organizations of all sizes work. A
substantial increase in hybrid and remote work has presented new compliance
challenges, and organizations have responded by growing their compliance
functions. A recent study shows an average of 257 daily regulatory alerts
across 190 countries in 2020, and keeping up with regulatory changes continues
to be the top compliance challenge[1].
To help organizations simplify compliance and reduce risk,
we built Microsoft Compliance Manager, generally available since September 2020. Compliance
Manager translates complex regulatory requirements into specific recommended
actions and makes them available through premium assessment templates, covering
over 300 regulations and standards. By leveraging the universal mapping of
actions and controls, premium assessment templates allow customers to comply
with several requirements across multiple regulations or standards with one
action, providing an efficient solution to manage overlapping compliance
requirements. Premium assessment templates along with built-in workflows and
continuous compliance updates allow organizations to constantly assess,
monitor, and improve their compliance posture.
To meet customers where they are in their compliance journey, we are excited
to announce that Compliance Manager premium assessment templates will no longer
require a Microsoft 365 E5 or Office 365 E5 license as a prerequisite. This
update enables all enterprise customers to assess compliance with the
regulations most relevant to them and meet their unique compliance needs.
Starting July 1st, 2021, all Enterprise customers, both commercial
and government, can purchase premium assessment templates as long as they have
any Microsoft 365 or Office 365 subscription. Customers who have already
purchased a premium assessment template or are using the default templates
included with their subscription will not experience any disruption or change.
Customers with Microsoft 365 E1/E3 or Office 365 E1/E3 subscriptions will now
be able to see the list of 300+ premium assessment templates in their tenants.
The capability to create a new template, customize an existing template, or add
customized actions to a given template will continue to require a Microsoft 365
E5 or Office 365 E5 subscription.
We look forward to hearing your feedback.
Get Started
Navigate to the Microsoft 365 compliance center or sign up for a
Microsoft 365 E5 Compliance trial to get started with Compliance Manager premium
assessments today! Compliance Manager premium assessment SKUs can be
purchased in Microsoft
admin center.
The CERT Coordination Center (CERT/CC) has released a VulNote for a critical remote
code execution vulnerability in the Windows Print spooler service, noting:
“while Microsoft has released an update
for CVE-2021-1675, it is important to realize that this update does not
address the public exploits that also identify as CVE-2021-1675.” An attacker
can exploit this vulnerability—nicknamed PrintNightmare—to take control of an
affected system.
CISA encourages administrators to disable the Windows Print spooler service
in Domain Controllers and systems that do not print. Additionally,
administrators should employ the following best practice from Microsoft’s how-to
guides, published January 11, 2021: “Due to the possibility for exposure,
domain controllers and Active Directory admin systems need to have the Print
spooler service disabled. The recommended way to do this is using a Group
Policy Object.”
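For individual machines, the service can also be stopped and disabled directly with PowerShell (a sketch of one option; Microsoft's guidance recommends a Group Policy Object for domain-wide enforcement):

```powershell
# Stop the Print Spooler and prevent it from starting at boot.
# Run from an elevated PowerShell session; verify nothing on the
# host requires printing before disabling.
Stop-Service -Name Spooler -Force
Set-Service -Name Spooler -StartupType Disabled
```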
CISA has released a new module in its
Cyber Security Evaluation Tool (CSET): the Ransomware Readiness Assessment
(RRA). CSET is a desktop software tool that guides network defenders through
a step-by-step process to evaluate their cybersecurity practices on their
networks. CSET—applicable to both information technology (IT) and industrial
control system (ICS) networks—enables users to perform a comprehensive
evaluation of their cybersecurity posture using many recognized government
and industry standards and recommendations.
The RRA is a self-assessment based on a
tiered set of practices to help organizations better assess how well they are
equipped to defend and recover from a ransomware incident. CISA has tailored
the RRA to varying levels of ransomware threat readiness to make it useful to
all organizations regardless of their current cybersecurity maturity. The
RRA:
Helps
organizations evaluate their cybersecurity posture, with respect to
ransomware, against recognized standards and best practice
recommendations in a systematic, disciplined, and repeatable manner.
Guides
asset owners and operators through a systematic process to evaluate
their operational technology (OT) and information technology (IT)
network security practices against the ransomware threat.
Provides
an analysis dashboard with graphs and tables that present the assessment
results in both summary and detailed form.
CISA strongly encourages all
organizations to take the CSET Ransomware Readiness Assessment, available at https://github.com/cisagov/cset/.
Microsoft is pleased to announce the publication of the Security Stack
Mappings for Azure project in partnership with the Center for Threat-Informed
Defense.
In May we announced the support for Linux across our threat and
vulnerability management capabilities in Microsoft Defender for Endpoint.
Today, we are excited to announce that threat and vulnerability management for
Linux is now generally available across Red
Hat, Ubuntu, CentOS, SUSE, and Oracle, with support for Debian coming
soon. In addition to Linux, the threat and vulnerability management
capabilities already support macOS and Windows, with support for Android and
iOS coming later this summer to further expand our support of third party
platforms.
Vulnerability Management plays a crucial role in monitoring an
organization’s overall security posture. That’s why we continue to expand our
cross-platform support to equip security teams with real-time insights into
risk with continuous vulnerability discovery, intelligent prioritization, and
the ability to seamlessly remediate vulnerabilities for all their platforms.
With the general availability of support for Linux, organizations can now
review vulnerabilities within installed apps across the Linux OS and issue
remediation tasks for affected devices.
Image 1: Software inventory page in the vulnerability management console,
showing various Linux platforms.
Image 2: Software inventory page in the vulnerability management portal,
showing glibc across various Linux systems.
Support for the various Linux platforms in threat and vulnerability management
closely follows what is available across our Endpoint Detection and Response
(EDR) capabilities. This alignment ensures a consistent experience for
Microsoft Defender for Endpoint customers, as we continue to expand our
cross-platform support.
More information and feedback
The threat and vulnerability management capabilities are part of Microsoft Defender for Endpoint and enable
organizations to effectively identify, assess, and remediate endpoint
weaknesses to reduce organizational risk.
Check out our documentation for a complete overview of supported
operating systems and platforms.
Last week, on Monday June 14th, 2021, a new
version of the Windows Security Events data connector reached public preview. This is
the first data connector created leveraging the new
generally available Azure Monitor Agent (AMA) and Data Collection Rules (DCR) features from the Azure
Monitor ecosystem. As with any other new feature in Azure Sentinel, I
wanted to expedite the testing process and empower others in the InfoSec
community through a lab environment to learn more about it.
In this post, I will talk about the
new features of the new data connector and how to automate the
deployment of an Azure Sentinel instance with the
connector enabled, the creation and association of DCRs
and installation of the AMA on a Windows workstation. This is an
extension of a blog post I wrote last year (2020), where I covered
the collection of Windows security events via the Log Analytics Agent (Legacy).
Recommended Reading
I highly recommend reading the following blog posts
to learn more about the announcement of the new Azure Monitor
features and the Windows Security Events data connector:
Azure Sentinel2Go is an open-source project maintained
and developed by the Open Threat Research community to automate the deployment of an Azure
Sentinel research lab and a data ingestion pipeline to consume
pre-recorded datasets. Every environment I release through this
initiative is an environment I use and test while performing research as part
of my role in the MSTIC R&D team. Therefore, I am constantly trying to
improve the deployment templates as I cover more scenarios. Feedback
is greatly appreciated.
A New Version of the Windows Security Events Connector?
According to Microsoft docs, the Windows Security Events connector lets you stream
security events from any Windows server (physical or virtual, on-premises or in
any cloud) connected to your Azure Sentinel workspace. After last week,
there are now two versions of this connector:
Security events (legacy version): based on the Log Analytics Agent (usually
known as the Microsoft Monitoring Agent (MMA) or Operations Management Suite
(OMS) agent).
Windows Security Events (new version): based on the new Azure Monitor Agent
(AMA).
In your Azure Sentinel data connector’s view, you
can now see both connectors:
A New Version? What is New?
Data Connector Deployment
Besides using the Log Analytics Agent to
collect and ship events, the old connector uses the Data Sources resource from the Log Analytics Workspace resource to set the collection tier
of Windows security events.
The new connector, on the other hand, uses a combination of Data Collection Rules (DCR) and Data Collection
Rule Associations (DCRA).
DCRs define what data to collect and where it should be
sent. Here is where we can set it to send data to the log analytics
workspace backing our Azure Sentinel instance.
In order to apply a DCR to a virtual machine, one
needs to create an association between the machine and the rule. A
virtual machine may have an association with multiple DCRs, and a DCR
may have multiple virtual machines associated with it.
For more detailed information about
setting up the Windows Security Events connector with both Log Analytics Agent
and Azure Monitor Agents manually,
take a look at this document.
Data Collection Filtering Capabilities
The old connector is not flexible enough to
choose what specific events to collect. For example, these are the
only options to collect data from Windows machines with the old connector:
All events –
All Windows security and AppLocker events.
Common –
A standard set of events for auditing purposes. The Common event set
may contain some types of events that aren’t so common. This is because
the main point of the Common set is to reduce the volume of events to a
more manageable level, while still maintaining full audit trail capability.
Minimal –
A small set of events that might indicate potential threats. This set does
not contain a full audit trail. It covers only events that might indicate
a successful breach, and other important events that have very low rates
of occurrence.
None –
No security or AppLocker events. (This setting is used to disable the
connector.)
According to Microsoft docs, these are
the pre-defined security event collection groups
depending on the tier set:
On the other hand, the new connector allows custom data collection via XPath queries. These XPath queries
are defined during the creation of the data collection rule and
are written in the form of LogName!XPathQuery. Here
are a few examples:
Collect only Security events with Event ID = 4624
Security!*[System[(EventID=4624)]]
Collect only Security events with Event ID = 4624 or
Security Events with Event ID = 4688
Security!*[System[(EventID=4624 or EventID=4688)]]
Collect only Security events with Event ID = 4688 and
with a process name of consent.exe.
Security!*[System[(EventID=4688)]] and *[EventData[Data[@Name='NewProcessName']
='C:\Windows\System32\consent.exe']]
You can select the custom
option to select which events to stream:
Important!
Based on the new connector docs, make sure to query only Windows Security and AppLocker
logs. Events from other Windows logs, or from security logs from other
environments, may not adhere to the Windows Security Events schema and won’t be
parsed properly, in which case they won’t be ingested to your workspace.
Also, the Azure Monitor agent supports XPath
queries for XPath version 1.0 only. I recommend reading the XPath 1.0 limitations documentation before writing XPath queries.
XPath?
XPath stands for XML (Extensible
Markup Language) Path language, and it is used to
explore and model XML documents as a tree of nodes. Nodes can be
represented as elements, attributes, and text.
In the image below, we can see a few node examples
in the XML representation of a Windows security event:
XPath Queries?
XPath queries are used to search for patterns in XML
documents and leverage path expressions and predicates
to find a node or filter specific nodes that contain a specific
value. Wildcards such as ‘*’
and ‘@’
are used to select nodes and predicates are always embedded in square brackets “[]”.
Matching any element node with ‘*’
Using our previous Windows Security event
XML example, we can process Windows Security events using the
wildcard ‘*’ at
the `Element` node level.
The example below walks
through two ‘Element’ nodes
to get to the ‘Text’
node of value ‘4688’.
You can test this basic ‘XPath’ query via
PowerShell.
Open a PowerShell console as ‘Administrator’.
Use the Get-WinEvent command to pass the XPath query.
Use the ‘Logname’ parameter to define what event
channel to run the query against.
Use the ‘FilterXPath’ parameter to set the XPath query.
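Putting those steps together, a minimal example (the Event ID is illustrative) looks like this:

```powershell
# From an elevated PowerShell console: query the Security channel
# for process-creation events (Event ID 4688) using an XPath filter.
Get-WinEvent -LogName Security -FilterXPath '*[System[(EventID=4688)]]' -MaxEvents 5
```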
As shown before, ‘Element’ nodes can contain ‘Attributes’ and we can
use the wildcard ‘@’
to search for ‘Text’
nodes at the ‘Attribute’
node level. The example below extends the
previous one and adds a filter to search for a specific ‘Attribute’ node that
contains the following text: ‘C:\Windows\System32\cmd.exe’.
Once again, you can test the XPath query via
PowerShell as Administrator.
$XPathQuery = "*[System[EventID=4688]] and *[EventData[Data[@Name='ParentProcessName']='C:\Windows\System32\cmd.exe']]"
Get-WinEvent -LogName Security -FilterXPath $XPathQuery
Every time you add a filter through the Event
Viewer UI, you can also get to the XPath query representation of the
filter. The XPath query is part of a QueryList node which
allows you to define and run multiple queries at once.
We can take our previous example where we searched
for a specific attribute and run it through the Event Viewer Filter
XML UI.
<QueryList>
  <Query Id="0" Path="Security">
    <Select Path="Security">*[System[(EventID=4688)]] and *[EventData[Data[@Name='ParentProcessName']='C:\Windows\System32\cmd.exe']]</Select>
  </Query>
</QueryList>
Now that we have covered some of the main changes
and features of the new version of the Windows Security Events data connector,
it is time to show you how to create a lab environment for you to test your own
XPath queries for research purposes and before pushing
them to production.
Deploy Lab Environment
Identify the right Azure resources
to deploy.
Create deployment template.
Run deployment template.
Identify the Right Azure Resources to Deploy
As mentioned earlier in this post, the old connector uses the Data Sources resource from the Log Analytics Workspace resource to set the collection tier of Windows
security events.
This is the Azure Resource Manager (ARM) template I
use in Azure-Sentinel2Go to set it up:
{
  "name": "WORKSTATION5/microsoft.insights/WindowsDCR",
  "type": "Microsoft.Compute/virtualMachines/providers/dataCollectionRuleAssociations",
  "apiVersion": "2019-11-01-preview",
  "location": "eastus",
  "properties": {
    "description": "Association of data collection rule. Deleting this association will break the data collection for this virtual machine.",
    "dataCollectionRuleId": "DATACOLLECTIONRULEID"
  }
}
What about the XPath Queries?
As shown in the previous section, the XPath query
is part of the “dataSources”
section of the data collection rule resource. It is defined under the ‘windowsEventLogs’ data
source type.
Open a terminal where you can run Azure CLI from (e.g., PowerShell).
Log in to your Azure tenant locally:
az login
Create a resource group (optional):
az group create -n AzSentinelDemo -l eastus
Deploy the ARM template locally:
az deployment group create -f ./Win10-DCR-AzureResource.json -g MYRESOURCEGROUP --parameters adminUsername=MYUSER adminPassword=MYUSERPASSWORD allowedIPAddresses=x.x.x.x
Wait 5-10 mins and your
environment should be ready.
Whether you use the UI or the CLI, you can monitor
your deployment by going to Resource Group > Deployments:
Verify Lab Resources
Once your environment is deployed successfully, I
recommend verifying every resource that was deployed.
Azure Sentinel New Data Connector
You will see the Windows Security Events (Preview) data
connector enabled with a custom Data
Collection Rules (DCR):
If you edit the custom DCR, you will see the XPath
query and the resource that it got associated with. The image below shows the
association of the DCR with a machine named workstation5.
You can also see that the data collection is set to custom and, for this
example, we only set the event stream to collect events with Event ID 4624.
Windows Workstation
I recommend RDPing to the Windows workstation by
using its public IP address. Go to your resource group and select the Azure VM.
You should see the public IP address on the right side of the screen.
This would generate authentication events which will be captured by
the custom DCR associated with the endpoint.
Check Azure Sentinel Logs
Go back to your Azure Sentinel, and you should
start seeing some events on the Overview page:
Go to Logs
and run the following KQL query:
SecurityEvent | summarize count() by EventID
As you can see in the image below, only events with
Event ID 4624 were collected by the Azure Monitor Agent.
You might be asking yourself, “Who would want
to collect only events with Event ID 4624 from a Windows endpoint?”
Believe it or not, there are network environments that, due to bandwidth
constraints, can only collect certain events. Therefore, this custom
filtering capability is amazing and very useful to cover more use cases and
even save storage!
Any Good XPath Query Repositories in the InfoSec Community?
Now that we know the internals of the new connector
and how to deploy a simple lab environment, we can test multiple XPath queries
depending on your organization and research use cases and bandwidth
constraints. There are a few projects that you can use.
Palantir WEF Subscriptions
One of many repositories out there that
contain XPath queries is the ‘windows-event-forwarding’ project from Palantir. The XPath queries are inside the Windows
Event Forwarding (WEF) subscriptions. We could take all the subscriptions
and parse them programmatically to extract all the XPath
queries, saving them in a format that can be used as part of the
automated deployment.
You can follow the steps in this document, available in Azure Sentinel2Go, to
extract XPath queries from the Palantir project.
From a community perspective, another great
resource you can use to extract XPath queries from is the Open
Source Security Events Metadata (OSSEM) Detection Model (DM) project, a community-driven effort to help
researchers model attack behaviors from a data perspective and
share relationships identified in security events across several operating
systems.
One of the advantages of this project over others is
that all its data relationships are in YAML format, which makes
them easy to translate to other formats such as XML. We can use
the Event IDs defined in each data
relationship documented in OSSEM DM and create XML files with XPath queries
in them.
We can process all the YAML files and export the
data as XML files. One thing that I like about this
OSSEM DM use case is that we can group the XML files by ATT&CK data source. This can help organizations
organize their data collection in a way that can be mapped to detections or
other ATT&CK-based frameworks internally.
We can use the QueryList format to document all ‘scheduled jobs relationships‘
XPath queries in one XML file.
I like to document my XPath
queries first in this format because it expedites the validation
process of the XPath queries locally on a Windows endpoint. You can use that
XML file in a PowerShell command to query Windows Security events and make
sure there are no syntax issues:
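For example, assuming the QueryList was saved to a local file (the file name below is hypothetical), you can run it with Get-WinEvent's -FilterXml parameter; syntax errors in any query surface immediately as exceptions:

```powershell
# Validate all XPath queries in a QueryList XML file by executing
# them against the local event logs (elevated session recommended).
[xml]$QueryList = Get-Content -Raw -Path .\scheduled_jobs_relationships.xml
Get-WinEvent -FilterXml $QueryList -MaxEvents 10
```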
Finally, once the XPath queries have been
validated, we could simply extract them from the XML files and put them in a
format that could be used in ARM templates to create DCRs. Do you
remember the dataSources property
of the DCR Azure resource we talked about earlier? What if we could get
the values of the windowsEventLogs data
source directly from a file instead of hardcoding them in an ARM template? The
example below is how it was previously being hardcoded.
We could use the XML files created after processing
OSSEM DM relationships mapped to ATT&CK data sources and creating
the following document. We can pass the URL of the document as a parameter in
an ARM template to deploy our lab environment:
The OSSEM team is contributing and
maintaining the JSON file from the previous section in
the Azure Sentinel2Go repository. However, if you want to
go through the whole process on your own, Jose Rodriguez (@Cyb3rpandah) was kind enough to write every single
step to get to that output file in the following blog post:
Ok, But How Do I Pass the JSON File to Our Initial ARM Template?
In our initial ARM template, we had the XPath query as an
ARM template variable as shown in the image below.
We could also have it as a template parameter.
However, it is not
flexible enough to define multiple DCRs or even update the whole DCR Data
Source object (Think about future coverage beyond Windows
logs).
Data Collection Rules – CREATE API
For more complex use cases, I would use the DCR Create API. This can be executed via a PowerShell script,
which can also be used inside of an ARM template via deployment scripts. Keep in mind that the deployment script resource requires
an identity to execute the script. This managed identity, of type user-assigned, can be created at
deployment time and used to create the DCRs programmatically.
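A minimal sketch of that approach with Az PowerShell is below; the subscription, resource group, workspace resource ID, and stream name are placeholders you would replace with your own values:

```powershell
# Sketch: create a DCR programmatically via the Azure Monitor REST API.
# All identifiers below are placeholders; requires the Az.Accounts module
# and an authenticated session (Connect-AzAccount).
$dcrBody = @{
    location   = "eastus"
    properties = @{
        dataSources = @{
            windowsEventLogs = @(
                @{
                    name         = "SecurityLogonEvents"
                    streams      = @("Microsoft-SecurityEvent")
                    xPathQueries = @("Security!*[System[(EventID=4624)]]")
                }
            )
        }
        destinations = @{
            logAnalytics = @(
                @{ name = "SentinelWorkspace"; workspaceResourceId = "<WORKSPACE_RESOURCE_ID>" }
            )
        }
        dataFlows = @(
            @{ streams = @("Microsoft-SecurityEvent"); destinations = @("SentinelWorkspace") }
        )
    }
} | ConvertTo-Json -Depth 10

Invoke-AzRestMethod -Method PUT -Payload $dcrBody `
    -Path "/subscriptions/<SUB_ID>/resourceGroups/<RG>/providers/Microsoft.Insights/dataCollectionRules/WindowsDCR?api-version=2019-11-01-preview"
```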
PowerShell Script
If you have an Azure Sentinel instance without the
data connector enabled, you can use the following PowerShell script to
create DCRs in it. This is good for testing and it also works in ARM templates.
Keep in mind that you need a
file defining the structure of the windowsEventLogs data source object
used in the creation of DCRs. We created that in the previous section, remember? Here is where we can use the OSSEM Detection
Model XPath queries file 😉
One thing to remember is that a single DCR can contain at most 10
Windows event log data source items; that limit applies to the items
themselves, not to the XPath queries inside each one. If you exceed it, you
will get the following error message:
ERROR
VERBOSE: @{Headers=System.Object[]; Version=1.1; StatusCode=400; Method=PUT; Content={"error":{"code":"InvalidPayload","message":"Data collection rule is invalid","details":[{"code":"InvalidProperty","message":"'Data Sources. Windows Event Logs' item count should be 10 or less. Specified list has 11 items.","target":"Properties.DataSources.WindowsEventLogs"}]}}}
Also, if you have duplicate XPath queries in one DCR,
you would get the following message:
ERROR VERBOSE: @{Headers=System.Object[]; Version=1.1; StatusCode=400; Method=PUT; Content={"error":{"code":"InvalidPayload","message":"Data collection rule is invalid","details":[{"code":"InvalidDataSource","message":"'X Path Queries' items must be unique (case-insensitively).
Now that you know how to use a PowerShell script to
create DCRs directly to your Azure Sentinel instance, we can use it inside of
an ARM template and make it point to the JSON file that contains all the XPath
queries in the right format contributed by the OSSEM DM project.
This is the template I use to put it all together:
You still need to associate the DCR with a virtual
machine. However, we can keep doing that within the template by leveraging the DCRA Azure resource in a linked template inside of the main
template. Just in case you were wondering, this is how I call the linked template from
the main template:
Deploy it the same way we deployed the initial one. If you want the easy button, simply
browse to the URL below and click on the blue button highlighted in the image
below:
That’s it! You now know two ways to deploy and
test the new data connector and Data
Collection Rules features with XPath queries capabilities. I hope this
was useful. Those were all my notes while testing and developing templates
to create a lab environment so that you could also expedite the testing process!
Researchers at Positive Technologies have published a proof-of-concept exploit for CVE-2020-3580. There are reports of researchers pursuing bug bounties using this exploit.
On October 21, 2020, Cisco released a security advisory and patches to address multiple cross-site scripting (XSS) vulnerabilities in its Adaptive Security Appliance (ASA) and Firepower Threat Defense (FTD) software web services. In April 2021, Cisco updated the advisory to account for an incomplete fix of CVE-2020-3581.
Shortly after, Mikhail Klyuchnikov, a researcher at Positive Technologies also tweeted that other researchers are chasing bug bounties for this vulnerability. Tenable has also received a report that attackers are exploiting CVE-2020-3580 in the wild.
Analysis
All four vulnerabilities exist because Cisco ASA and FTD software web services do not sufficiently validate user-supplied inputs. To exploit any of these vulnerabilities, an attacker would need to convince “a user of the interface” to click on a specially crafted link. Successful exploitation would allow the attacker to execute arbitrary code within the interface and access sensitive, browser-based information.
These vulnerabilities affect only specific AnyConnect and WebVPN configurations:
Proof of concept
As mentioned earlier, there is a public PoC published by Positive Technologies on Twitter, which has gained significant attention.
Vendor response
Cisco has not issued any additional information or updates since the PoC was published.
Throughout the last several months there have been
many new features, updates, and happenings in the world of Information
Protection at Microsoft. As we continue to build out more of this story, we
wanted to use this opportunity to connect with customers, partners, and more on
some of these updates to keep you informed and provide a single pane of glass
on everything we have been working on for the last several months. In addition,
we hope to give you some insight into the next big things being built within
MIP overall.
Office apps (Word, Excel, PowerPoint, Outlook) will now
respect the Admin policy setting to require users to apply a
label to documents and emails on Windows, Mac, iOS,
and Android (for the Office 365 subscription version of the
apps).
Co-authoring and AutoSave on Microsoft Information
Protection-encrypted documents
Client-based automatic and recommended labeling on Mac
Mandatory labeling requiring users to apply a label to
their email and documents
Availability of audit label activities in Activity
Explorer
Native support for variables and per-app content marking
You can leverage co-authoring using:
Production or test tenant
Microsoft 365 apps with the following versions:
Windows – Current Channel 16.0.14026.20270+ (2105)
Mac: 16.50.21061301+
If AIP Unified Labeling Client
Version is in use, verify that in addition to the updated Microsoft
365 app, you use version 2.10.46.0 of the Unified Labeling
client.
Please note: co-authoring for native/built-in
labeling will be added in the upcoming Current Channel within
2 weeks.
Azure Information Protection client audit logs are now
available in Activity Explorer for existing AIP Analytics customers, and
this functionality is in public preview.
Microsoft announces the general availability of the
Microsoft Data Loss Prevention Alerts Dashboard. This latest
addition to Microsoft’s data loss prevention solution
provides customers with the ability to holistically investigate DLP policy
violations across:
The DLP on-premises scanner crawls on-premises data at rest
in file shares and SharePoint document libraries and folders for sensitive
items that, if leaked, would pose a risk to your organization or a
risk of compliance policy violation.
This gives you the visibility and control you need to
ensure that sensitive items are used and protected properly, and to help
prevent risky behavior that might compromise them.
You need to leverage the Scanner binaries from AIP UL
Client Version 2.10.43.0