Information extraction from Madison city crime incident reports using Deep Learning

Introduction

Crime analysis is an essential part of efficient law enforcement for any city. It involves:

  • Collecting data in a form that can be analyzed.
  • Identifying spatial/non-spatial patterns and trends in the data.
  • Making informed decisions based on the analysis.

In order to start the analysis, the first and foremost requirement is analyzable data. A huge volume of data is present in the witness and police narratives of crime incidents. A few examples of such information are:

  • Place of crime
  • Nature of crime
  • Date and time of crime
  • Suspect
  • Witness

Extracting such information from incident reports requires tedious work. Crime analysts have to sift through piles of police reports to gather and organize this information.

With recent advancements in Natural Language Processing and deep learning, it is possible to devise an automated workflow to extract information from such unstructured text documents. In this notebook we will extract information from crime incident reports obtained from the Madison police department [1] using arcgis.learn's EntityRecognizer class.

Prerequisites

  • Data preparation and model training workflows using arcgis.learn are based on the spaCy & Hugging Face Transformers libraries. A user can choose an appropriate backbone / library to train their model.
  • Refer to the section "Install deep learning dependencies of arcgis.learn module" on this page for detailed documentation on installation of the dependencies.
  • Labelled data: In order for EntityRecognizer to learn, it needs to see examples that have been labelled for all the custom categories that the model is expected to extract. Labelled data for this sample notebook is located at data/EntityRecognizer/labelled_crime_reports.json (a minimal example record is sketched after this list).
  • To learn how to use Doccano[2] for labelling text, please see the guide on Labeling text using Doccano.
  • Test documents to extract named entities are in a zipped file at data/EntityRecognizer/reports.zip.
  • To learn more on how EntityRecognizer works, please see the guide on Named Entity Extraction Workflow with arcgis.learn.
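
The labelled data follows a Doccano-style JSON format, where each record carries the report text together with a list of [start, end, label] character spans. The snippet below is only a minimal sketch of one such record, written as a Python dict with hypothetical text, offsets, and label names; the real reports live in the file mentioned above.

# A minimal sketch of a single Doccano-style labelled record.
# The text, character offsets, and label names below are illustrative only.
sample_record = {
    "text": "A robbery was reported at 100 State St. on 01/19/2016.",
    "labels": [
        [2, 9, "Crime"],            # "robbery"
        [26, 39, "Address"],        # "100 State St."
        [43, 53, "Reported_date"],  # "01/19/2016"
    ],
}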

Necessary Imports

import os
import re
import datetime
import zipfile
import unicodedata
from itertools import repeat
from pathlib import Path

import pandas as pd
from arcgis.gis import GIS
from arcgis.learn import prepare_textdata
from arcgis.learn.text import EntityRecognizer
from arcgis.geocoding import batch_geocode
gis = GIS('home')

Data preparation

Data preparation involves splitting the data into training and validation sets, creating the necessary data structures for loading data into the model, and so on. The prepare_textdata() function can directly read the training samples in the labelled JSON format described above and automates the entire process.

training_data = gis.content.get('b2a1f479202244e798800fe43e0c3803')
training_data
information-extraction-from-madison-city-crime-incident-reports-using-deep-learning
Image Collection by api_data_owner
Last Modified: August 26, 2020
0 comments, 64 views
filepath = training_data.download(file_name=training_data.name)
with zipfile.ZipFile(filepath, 'r') as zip_ref:
    zip_ref.extractall(Path(filepath).parent)
json_path = Path(os.path.join(os.path.splitext(filepath)[0], 'labelled_crime_reports.json'))
data = prepare_textdata(path= json_path, task="entity_recognition", dataset_type='ner_json', class_mapping={'address_tag':'Address'})

The show_batch() method can be used to visualize the training samples, along with labels.

data.show_batch()
 | text | Address | Crime | Crime_datetime | Reported_date | Reported_time | Reporting_officer | Weapon
0 | A McDonald's employee suffered a knee injury a... | [Odana Rd. restaurant] | [strong-armed robbery, grabbed money] | [Monday night] | [01/19/2016] | [9:44 AM] | [PIO Joel Despain] |
1 | A 13-year-old boy, who pointed a handgun at a ... | [1500 block of Troy] | [disorderly conduct while armed] | [last night] | [10/12/2016] | [10:11 AM] | [PIO Joel Despain] | [handgun, pellet gun, BB or pellet gun]
2 | One man has been arrested and another is being... | [intersection of E. Washington Ave. and N. Sto... | [shooting, firing the gun] | [Sunday evening] | [01/04/2016] | [10:45 AM] | | [BB gun, BB gun, BB gun]
3 | Several deli employees and a diner - who happe... | [Stalzy's Deli, 2701 Atwood Ave.] | [burglary, stole money, stolen money] | | [09/24/2018] | [9:59 AM] | [PIO Joel Despain] |
4 | A Madison man was arrested Saturday inside Eas... | [East Towne Mall] | [disturbance] | | [05/09/2016] | [9:52 AM] | [PIO Joel Despain] | [handgun, BB gun]
5 | A knife-wielding man, who threatened a couple ... | [State St., downtown] | [racial slurs and vulgarities, stab, yelling a... | [Sunday afternoon] | [11/12/2018] | [10:02 AM] | [PIO Joel Despain] | [knife, knife]
6 | A MPD officer activated his squad car lights h... | [E. Gorham St.] | [crash, intoxicated, drunken driving, driving ... | | [12/21/2018] | [11:29 AM] | [PIO Joel Despain] |
7 | A suspected drug dealer attempted to destroy 5... | [Monday, Sherman Ave. apartment,] | [destroy 50 grams of fentanyl laced heroin] | | [12/04/2018] | [11:45 AM] | [PIO Joel Despain] | [handgun]

EntityRecognizer model

The EntityRecognizer model in arcgis.learn can be used with spaCy's EntityRecognizer backbone or with Hugging Face Transformers backbones.

Run the command below to see what backbones are supported for the entity recognition task.

print(EntityRecognizer.supported_backbones)
['spacy', 'BERT', 'RoBERTa', 'DistilBERT', 'ALBERT', 'CamemBERT', 'MobileBERT', 'XLNet', 'XLM', 'XLM-RoBERTa', 'FlauBERT', 'ELECTRA', 'Longformer']

Call the model's available_backbone_models() method with the backbone name to get the available models for that backbone. The call to the available_backbone_models() method lists only a few of the available models for each backbone. Visit this link to get a complete list of models for each of the transformer backbones. To learn more about choosing an appropriate transformer model for your dataset, visit this link.
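
For example, the call below would list a few of the pretrained models available for the BERT backbone (the exact output depends on the installed version of the transformers library):

print(EntityRecognizer.available_backbone_models("BERT"))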

Note - Only a single model is available for training the EntityRecognizer model with the spaCy backbone.

print(EntityRecognizer.available_backbone_models("spacy"))
('spacy',)

First, we will create the model using the EntityRecognizer() constructor, passing it the data object.

ner = EntityRecognizer(data, backbone="spacy")
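
The rest of this notebook trains the spaCy backbone, but a transformer backbone can be plugged in the same way. The line below is only a hedged sketch: bert-base-cased is assumed to be one of the models listed by available_backbone_models("BERT"), and training time and memory use will differ from the spaCy run shown here.

# Alternative backbone (a sketch; not used in the rest of this notebook).
ner_bert = EntityRecognizer(data, backbone="bert-base-cased")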

Finding optimum learning rate

The learning rate [3] is a tuning parameter that determines the step size at each iteration while moving toward a minimum of a loss function; it represents the speed at which a machine learning model "learns". arcgis.learn includes a learning rate finder, accessible through the model's lr_find() method, which can automatically select an optimum learning rate without requiring repeated experiments.

lr = ner.lr_find()
[Learning rate finder plot: loss plotted against candidate learning rates]

Model training

Training the model is an iterative process. We can train the model using its fit() method for as long as the F1 score (maximum possible value = 1) continues to improve with each training pass, also known as an epoch. This is indicative of the model getting better at predicting the correct labels.

ner.fit(epochs=30, lr=lr)
epoch | losses | val_loss | precision_score | recall_score | f1_score | time
0 | 75.57 | 11.45 | 0.0 | 0.0 | 0.0 | 00:00:04
1 | 18.67 | 10.35 | 0.9 | 0.09 | 0.16 | 00:00:04
2 | 18.28 | 10.19 | 0.9 | 0.37 | 0.52 | 00:00:04
3 | 15.49 | 23.08 | 0.5 | 0.25 | 0.34 | 00:00:04
4 | 20.66 | 17.26 | 0.37 | 0.11 | 0.17 | 00:00:03
5 | 18.31 | 41.41 | 0.56 | 0.36 | 0.44 | 00:00:04
6 | 14.72 | 30.0 | 0.46 | 0.25 | 0.33 | 00:00:04
7 | 26.54 | 31.75 | 0.06 | 0.01 | 0.02 | 00:00:04
8 | 24.01 | 19.76 | 0.0 | 0.0 | 0.0 | 00:00:04
9 | 23.53 | 10.63 | 0.7 | 0.44 | 0.54 | 00:00:04
10 | 15.94 | 12.13 | 0.64 | 0.43 | 0.51 | 00:00:04
11 | 16.08 | 9.38 | 0.72 | 0.49 | 0.58 | 00:00:04
12 | 13.99 | 15.88 | 0.65 | 0.43 | 0.51 | 00:00:04
13 | 13.59 | 8.89 | 0.62 | 0.51 | 0.56 | 00:00:04
14 | 11.67 | 5.88 | 0.61 | 0.55 | 0.58 | 00:00:04
15 | 12.73 | 8.47 | 0.6 | 0.49 | 0.53 | 00:00:04
16 | 14.72 | 20.34 | 0.69 | 0.51 | 0.59 | 00:00:04
17 | 42.79 | 26.58 | 0.28 | 0.07 | 0.11 | 00:00:05
18 | 24.08 | 6.53 | 0.75 | 0.62 | 0.68 | 00:00:04
19 | 12.18 | 5.47 | 0.75 | 0.62 | 0.68 | 00:00:04
20 | 18.09 | 18.48 | 0.75 | 0.67 | 0.71 | 00:00:05
21 | 13.41 | 6.17 | 0.82 | 0.82 | 0.82 | 00:00:05
22 | 11.68 | 5.91 | 0.81 | 0.8 | 0.8 | 00:00:04
23 | 11.07 | 3.77 | 0.88 | 0.88 | 0.88 | 00:00:04
24 | 11.49 | 4.8 | 0.9 | 0.83 | 0.86 | 00:00:04
25 | 11.1 | 3.45 | 0.94 | 0.93 | 0.94 | 00:00:04
26 | 8.87 | 1.92 | 0.95 | 0.94 | 0.95 | 00:00:05
27 | 10.12 | 2.69 | 0.96 | 0.96 | 0.96 | 00:00:04
28 | 9.75 | 1.4 | 0.98 | 0.98 | 0.98 | 00:00:04
29 | 8.97 | 2.9 | 0.97 | 0.97 | 0.97 | 00:00:04

Evaluate model performance

Important metrics to look at while measuring the performance of the EntityRecognizer model are Precision, Recall & F1-measures [4].
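
The F1 score is the harmonic mean of precision and recall, so it is high only when both are high. A quick sanity check in plain Python (the numbers are illustrative):

# F1 is the harmonic mean of precision and recall (illustrative values).
precision, recall = 0.97, 0.97
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 2))  # 0.97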

ner.precision_score()
0.97
ner.recall_score()
0.97
ner.f1_score()
0.97

To find the precision, recall & F1 scores per label/class, we will call the model's metrics_per_label() method.

ner.metrics_per_label()
 | Precision_score | Recall_score | F1_score
Address | 1.00 | 1.00 | 1.00
Reporting_officer | 1.00 | 1.00 | 1.00
Reported_date | 1.00 | 0.93 | 0.97
Crime | 0.96 | 1.00 | 0.98
Crime_datetime | 1.00 | 1.00 | 1.00
Reported_time | 1.00 | 1.00 | 1.00
Weapon | 0.78 | 0.78 | 0.78

Validate results

Now that we have a trained model, let's look at how it performs.

ner.show_results()
100.00% [8/8 00:00<00:00]
 | TEXT | Filename | Address | Crime | Crime_datetime | Reported_date | Reported_time | Reporting_officer | Weapon
0 | Madison Police responded at 22:10 to the 500 b... | Example_0 | 500 block of South Park Street | armed robbery,rob | 22:10 | 12/26/2017 | 5:39 AM | Sgt. Paul Jacobsen |
1 | Officers responded to an alarm at Dick's Sport... | Example_1 | Dick's Sporting Goods, 237 West Towne Mall | | | | 3:37 AM | Lt. Timothy Radke | 13 airsoft and pellet guns, which appeared to ...
2 | The MPD arrested an 18-year-old man on a tenta... | Example_2 | Memorial High School | disorderly conduct | after 5:30 p.m. yesterday afternoon | 05/05/2017 | 1:55 PM | PIO Joel Despain |
3 | A convenience store clerk was robbed at gunpoi... | Example_3 | 7-Eleven, 2703 W. Beltline Highway | robbed at gunpoint | | 12/14/2017 | 9:28 AM | PIO Joel Despain | weapon
4 | A convenience store clerk was robbed at gunpoi... | Example_3 | south on Todd Dr. | robbed at gunpoint | | 12/14/2017 | 9:28 AM | PIO Joel Despain | weapon
5 | Madison police officers were dispatched to the... | Example_4 | East Towne Mall | overdosed on heroin,injecting heroin,possessio... | | 02/26/2018 | 7:40 AM | Lt. Jason Ostrenga | Syringes,
6 | A Sun Prairie woman and her nine-year-old gran... | Example_5 | E. Washington Ave. | crash,drunken driver,hit-and-run,fifth offense... | | 05/17/2017 | 10:38 AM | PIO Joel Despain |
7 | Madison police officers were dispatched to the... | Example_6 | East Towne Mall | overdosed on heroin,injecting heroin,possessio... | | 02/26/2018 | 7:40 AM | Lt. Jason Ostrenga | Syringes,
8 | Victim reporting that he was pistol whipped in... | Example_7 | 3400 block of N Sherman Ave | | | 09/18/2017 | 9:30 PM | Sgt. Rosemarie Mansavage |

Save and load trained models

Once you are satisfied with the model, you can save it using the save() method. This creates an Esri Model Definition (EMD) file that can be used for inferencing on new data. Saved models can also be loaded back using the load() method, which takes the path to the EMD file as a required argument.

ner.save('crime_model')
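
A saved model can be loaded back later for inferencing. The sketch below assumes the EMD file ended up in a crime_model folder; adjust the path to wherever save() actually wrote the model on your machine.

# Load the trained model back from its Esri Model Definition file.
# The path below is an assumption -- point it at the .emd created by save().
saved_emd = Path('crime_model') / 'crime_model.emd'
ner.load(str(saved_emd))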

Model Inference

Now we can use the trained model to extract entities from new text documents using the extract_entities() method. This method expects either the path of the folder where the new text documents are located, or a list of text documents.

reports = os.path.join(os.path.splitext(filepath)[0], 'reports')
results = ner.extract_entities(reports)  # extract_entities() also accepts a list of text documents
100.00% [1501/1501 00:12<00:00]
results.head()
 | TEXT | Filename | Address | Crime | Crime_datetime | Reported_date | Reported_time | Reporting_officer | Weapon
0 | Officers were dispatched to a robbery of the A... | 0.txt | Associated Bank in the 1500 block of W Broadway | robbery,demanded money | | 08/09/2018 | 6:17 PM | Sgt. Jennifer Kane |
1 | The MPD was called to Pink at West Towne Mall ... | 1.txt | Pink at West Towne Mall | thefts at | Tuesday night | 08/18/2016 | 10:37 AM | PIO Joel Despain |
2 | The MPD is seeking help locating a unique $1,5... | 10.txt | Union St. | stolen,thief cut,stolen | | 08/17/2016 | 11:09 AM | PIO Joel Despain |
3 | A Radcliffe Drive resident said three men - at... | 100.txt | Radcliffe Drive | armed robbery | early this morning | 08/07/2018 | 11:17 AM | PIO Joel Despain | handguns
4 | Madison Police officers were near the intersec... | 1001.txt | intersection of Francis Street and State Street | | | 08/10/2018 | 4:20 AM | Lt. Daniel Nale | gunshot
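
Since extract_entities() also accepts a list of text documents, a single new report can be checked in memory without writing it to disk; a short sketch with a made-up report:

# Run inference on an in-memory report instead of a folder of files.
sample_report = "Officers responded to a burglary at 100 State St. on 01/19/2016."
sample_results = ner.extract_entities([sample_report])
sample_results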

Publishing the results as a feature layer

The code below geocodes the extracted addresses and publishes the results as a feature layer.

# This function generates x,y coordinates based on the locations extracted by the model.

def geocode_locations(processed_df, city, region, address_col):
    # append city and region to the extracted address before geocoding
    add_miner = processed_df[address_col].apply(lambda x: f'{x}, {city}, {region}')
    chunk_size = 200
    chunks = len(processed_df[address_col]) // chunk_size + 1
    batch = list()
    for i in range(chunks):
        batch.extend(batch_geocode(list(add_miner.iloc[chunk_size*i:chunk_size*(i+1)])))
    batch_geo_codes = []
    for i, item in enumerate(batch):
        if isinstance(item, dict):
            # keep only confident matches that resolve to an address within the city
            if (item['score'] > 90 and
                    item['address'] != f'{city}, {region}' and
                    item['attributes']['City'] == f'{city}'):
                batch_geo_codes.append(item['location'])
            else:
                batch_geo_codes.append('')
        else:
            batch_geo_codes.append('')
    processed_df['geo_codes'] = batch_geo_codes
    return processed_df

# This function converts the dataframe to a spatially enabled dataframe.

def prepare_sdf(processed_df):
    processed_df['geo_codes_x'] = 'x'
    processed_df['geo_codes_y'] = 'y'
    for i, geo_code in processed_df['geo_codes'].items():
        if geo_code == '':
            processed_df.drop(i, inplace=True)  # dropping rows with empty location
        else:
            processed_df.loc[i, 'geo_codes_x'] = geo_code.get('x')
            processed_df.loc[i, 'geo_codes_y'] = geo_code.get('y')

    sdf = processed_df.reset_index(drop=True)
    sdf['geo_x_y'] = sdf['geo_codes_x'].astype('str') + ',' + sdf['geo_codes_y'].astype('str')
    sdf = pd.DataFrame.spatial.from_df(sdf, address_column='geo_x_y')  # adding geometry to the dataframe
    sdf.drop(['geo_codes_x', 'geo_codes_y', 'geo_x_y', 'geo_codes'], axis=1, inplace=True)  # dropping redundant columns
    return sdf

# This function publishes the spatial dataframe as a feature layer.

def publish_to_feature(df, gis, layer_title: str, tags: str, city: str,
                       region: str, address_col: str):
    processed_df = geocode_locations(df, city, region, address_col)
    sdf = prepare_sdf(processed_df)
    try:
        layer = sdf.spatial.to_featurelayer(layer_title, gis, tags)
    except Exception:
        # retry once if the first publish attempt fails
        layer = sdf.spatial.to_featurelayer(layer_title, gis, tags)

    return layer
# This will take a few minutes to run
madison_crime_layer = publish_to_feature(results, gis, layer_title='Madison_Crime' + str(datetime.datetime.now().microsecond), 
                                         tags='nlp,madison,crime', city='Madison', 
                                         region='WI', address_col='Address')
madison_crime_layer
Madison_Crime
Feature Layer Collection by arcgis_python
Last Modified: February 24, 2020
0 comments, 0 views

Visualize crime incidents on a map

result_map = gis.map('Madison, Wisconsin')
result_map.basemap = 'topographic'
result_map
result_map.add_layer(madison_crime_layer)

Create a hot spot map of crime densities

ArcGIS has a set of tools to help us identify, quantify, and visualize spatial patterns in our data by locating areas of statistically significant clustering.

The find_hot_spots tool allows us to visualize areas having such clusters.

from arcgis.features.analyze_patterns import find_hot_spots
crime_hotspots_madison = find_hot_spots(madison_crime_layer, 
                                        context={"extent":
                                                 {"xmin":-10091700.007046243,"ymin":5225939.095608932,
                                                  "xmax":-9731528.729766665,"ymax":5422840.88047145,
                                                  "spatialReference":{"wkid":102100,"latestWkid":3857}}},
                                        output_name="crime_hotspots_madison" + str(datetime.datetime.now().microsecond))
hotspot_map = gis.map('Madison, Wisconsin')
hotspot_map.basemap = 'terrain'
hotspot_map
hotspot_map.add_layer(crime_hotspots_madison)
hotspot_map.legend = True

Conclusion

This sample demonstrates how EntityRecognizer() from arcgis.learn can be used for information extraction from crime incident reports, which is an essential requirement for crime analysis. We then saw how this information can be geocoded and visualized on a map for further analysis.

References

[1]: Police Incident Reports (City of Madison)

[2]: Doccano: text annotation tool for humans

[3]: Learning rate

[4]: Precision, recall and F1-measures
