AI is hard! Transforming tensors, cleaning data, building complicated networks: these are all specialized skills that can take months or even years to learn. But the times are a-changin'. Businesses understand that adopting state-of-the-art AI is no longer a choice.
Of course, those burdened with actually implementing that AI are the software devs. High-level talk is great, but someone needs to get it done. The following tutorial outlines how, using the Akkio API, you can train and run predictions against an advanced model in fewer than 40 lines of simple Python.
To start, we'll need to install and import the Akkio Python library, which wraps our API requests.
!pip install akkio
import akkio
After logging in to Akkio, your API key is accessible on the team settings page.
akkio.api_key = '12345678-abcde-pi3-1415926535' # your api key goes here
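If you prefer not to hard-code the key, you can read it from an environment variable instead. This is just a minimal sketch; the variable name AKKIO_API_KEY is an arbitrary choice, not something the library requires:

import os
akkio.api_key = os.environ['AKKIO_API_KEY']  # set with e.g. `export AKKIO_API_KEY=...` before running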
Next, import pandas and load a data frame with your data. In this example, we will use a restaurant review dataset, in which each review's text is labeled Positive or Negative.
import pandas as pd
import time

df = pd.read_csv("Restaurant_Reviews.csv")
df
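Before uploading, it is worth a quick sanity check on the frame. The sketch below assumes the CSV has the two columns used later in the schema, 'Review Text' and 'Review'; adjust the names to match your file:

# drop rows with missing text or labels and peek at the label balance
df = df.dropna(subset=['Review Text', 'Review'])
print(df['Review'].value_counts())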
In order to train a model, we first need to transfer the dataset into Akkio. This can be done by creating an empty dataset object, like so:
new_dataset = akkio.create_dataset('Restaurant_Reviews')
'''
{'dataset_id': 'HVINbyLG1j85XYjcduYu',
 'dataset_name': 'Restaurant_Reviews',
 'status': 'success'}
'''
We then add a schema, followed by the rows. The API expects the schema as a list of dictionaries, each containing a field name and type: [{'name': 'field name 1', 'type': 'integer'}, {...}, ...]
(Valid types include: integer, float, text, category, date, id, unknown)
fields = [
    {'name': 'Review Text', 'type': 'text'},
    {'name': 'Review', 'type': 'category'},
]
akkio.set_dataset_fields(new_dataset['dataset_id'], fields)
The API expects rows in the following format: [{'field name 1': 'value 1', 'field name 2': 0}, {...}, ...]
Since the dataset can be quite large, we upload it in chunks of 500 rows:
chunk_size = 500
for i in range(0, len(df), chunk_size):
    rows = df[i:(i+chunk_size)].to_dict('records')
    akkio.add_rows_to_dataset(new_dataset['dataset_id'], rows)
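Uploads of large datasets can occasionally fail partway through (a dropped connection, a rate limit). A hedged retry variant of the same loop might look like this; the retry count and the generic exception handling are assumptions, not part of the Akkio client:

for i in range(0, len(df), chunk_size):
    rows = df[i:(i+chunk_size)].to_dict('records')
    for attempt in range(3):  # retry each chunk up to 3 times (arbitrary choice)
        try:
            akkio.add_rows_to_dataset(new_dataset['dataset_id'], rows)
            break
        except Exception:  # the client's specific exception types aren't documented here
            time.sleep(2 ** attempt)  # simple exponential backoff before retrying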
Then we can create a model using the create_model method, with the 'Review' column as our target.
new_model = akkio.create_model(new_dataset['dataset_id'], ['Review'], [], {'duration': 3})
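The model trains for the requested duration before it shows up in your account. Since we imported time earlier, one crude way to wait is simply to sleep for about that long; this sketch assumes the duration argument is in minutes, which you should confirm against the Akkio docs:

time.sleep(3 * 60)  # roughly the training duration requested above (assumed to be minutes)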
We can access all trained models using the API:
models = akkio.get_models()['models']
api_models = []
for model in models:
    if "(model)" in model['name']:
        api_models.append(model)
api_models
# [{'id': 'lh50m2ZepVB8eYuHPTsW', 'name': '(model) Restaurant_Reviews'}]
And choose our model from the list.
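If your account has several API models, selecting by name is a bit safer than indexing by position. A small sketch, assuming the '(model) Restaurant_Reviews' naming seen in the listing above:

model = next(m for m in api_models if 'Restaurant_Reviews' in m['name'])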
Finally, we can make predictions with the make_prediction method, which encodes the input, calls the trained model, and returns a new prediction.
model = api_models[0]
prediction = akkio.make_prediction(model['id'], [{"Review Text": "Akkio"}], explain=False)
print(prediction)
# {'status': 'success', 'predictions': [{'Review': 'Positive',
#   'Probability Review is Negative': 0.14296989142894745,
#   'Probability Review is Positive': 0.8570300936698914}]}
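Because the rows argument is a list, the same call should handle several inputs at once; the review strings below are made-up examples:

batch = [
    {"Review Text": "The service was slow and the food was cold."},
    {"Review Text": "Amazing pasta, friendly staff!"},
]
print(akkio.make_prediction(model['id'], batch, explain=False))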
As you can see, Akkio is an easy way to add Machine Learning to your Python project.
This is a guest article contributed by the Akkio team.