
How to Use TI-ONE Notebook for the 2020 Tencent Advertising Algorithm Competition

This tutorial walks through creating a TI-ONE Notebook instance, importing competition data, training a TensorFlow model, and uploading results, providing a complete end‑to‑end workflow for participants of the 2020 Tencent Advertising Algorithm Competition.

Tencent Advertising Technology

This tutorial introduces the workflow for participating in the 2020 Tencent Advertising Algorithm Competition on the TI-ONE Notebook platform. All data paths shown are examples, not official competition data.

1. Create an instance

Log in to the TI-ONE console, click the Notebook menu to view the instance list, then click New Instance and fill in the required fields:

Region: set automatically; cannot be changed.

Notebook name: the name of the instance.

Resource selection: choose CPU/GPU resources (charges apply while the instance is running).

Volume size: disk size in GB (minimum 10 GB, a multiple of 10, up to 16384 GB).

Root permission: grant admin rights so the instance can access all files.

VPC: an optional custom VPC network.

Configuration price: displayed based on the selected resources.
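The volume-size rule above (at least 10 GB, a multiple of 10, at most 16384 GB) can be sanity-checked before filling in the form. A minimal sketch; the function name is ours and is not part of any TI-ONE SDK:

```python
def is_valid_volume_size(size_gb: int) -> bool:
    """Check the TI-ONE volume-size constraints stated above:
    >= 10 GB, a multiple of 10, and at most 16384 GB."""
    return 10 <= size_gb <= 16384 and size_gb % 10 == 0

for size in (10, 15, 100, 16380):
    print(size, is_valid_volume_size(size))
```

Note that under these constraints the largest acceptable value is 16380 GB, since 16384 itself is not a multiple of 10.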

After clicking Create, refresh the list; when the status changes from “Creating” to “Running”, click Open to enter the notebook.

Inside the instance, set the kernel environment (e.g., conda_tensorflow_py3).

2. Data import

Obtain the competition data COS path (example shown) and download it inside the notebook:

!pip install wget
import wget
import zipfile

# Download the archive from the COS path (example URL, not official data).
filename = wget.download("https://tesla-ap-guangzhou-1256322946.cos.ap-guangzhou.myqcloud.com/cephfs/tesla_common/deeplearning/dataset/contest/demo.zip")
print(filename)

# Extract every file in the archive into the current directory.
with zipfile.ZipFile(filename, "r") as zFile:
    for fileM in zFile.namelist():
        zFile.extract(fileM, "./")
        print(fileM)
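If a dataset ships as a .tar.gz rather than a .zip, the same pattern works with the standard-library tarfile module. A sketch; the helper name is ours, and the archive path is whatever file you downloaded:

```python
import tarfile

def extract_tgz(archive_path: str, dest: str = "./") -> list:
    """Extract a .tar.gz archive into dest and return the member names."""
    with tarfile.open(archive_path, "r:gz") as tf:
        names = tf.getnames()
        tf.extractall(dest)
    return names
```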

3. Model training

Write your own model code; an example using TensorFlow to train a DNN classifier on the Iris dataset is provided:

import pandas as pd
import tensorflow as tf

CSV_COLUMN_NAMES = ['SepalLength', 'SepalWidth', 'PetalLength', 'PetalWidth', 'Species']
SPECIES = ['Setosa', 'Versicolor', 'Virginica']

def load_train_data(train_path, y_name='Species'):
    """Returns the iris dataset as (train_x, train_y)"""
    train = pd.read_csv(train_path, names=CSV_COLUMN_NAMES, header=0)
    train_x, train_y = train, train.pop(y_name)
    return (train_x, train_y)

def load_test_data(test_path):
    """Returns the iris dataset as test_x"""
    test = pd.read_csv(test_path, names=CSV_COLUMN_NAMES[:-1], header=0)
    return test

def train_input_fn(features, labels, batch_size):
    """An input function for training"""
    dataset = tf.data.Dataset.from_tensor_slices((dict(features), labels))
    dataset = dataset.shuffle(1000).repeat().batch(batch_size)
    return dataset

def eval_input_fn(features, labels, batch_size):
    """An input function for evaluation or prediction"""
    inputs = (dict(features), labels) if labels is not None else dict(features)
    dataset = tf.data.Dataset.from_tensor_slices(inputs)
    assert batch_size is not None, "batch_size must not be None"
    dataset = dataset.batch(batch_size)
    return dataset


def main():
    batch_size = 100
    train_steps = 1000
    (train_x, train_y) = load_train_data("iris_training.csv")
    my_feature_columns = []
    for key in train_x.keys():
        my_feature_columns.append(tf.feature_column.numeric_column(key=key))
    classifier = tf.estimator.DNNClassifier(
        feature_columns=my_feature_columns,
        hidden_units=[10, 10],
        n_classes=3)
    classifier.train(input_fn=lambda: train_input_fn(train_x, train_y, batch_size), steps=train_steps)
    test_x = load_test_data("iris_test.csv")
    predictions = classifier.predict(input_fn=lambda: eval_input_fn(test_x, labels=None, batch_size=batch_size))
    result = []
    template = 'Prediction is "{}" ({:.1f}%)'
    for pred_dict in predictions:
        class_id = pred_dict['class_ids'][0]
        probability = pred_dict['probabilities'][class_id]
        print(template.format(SPECIES[class_id], 100 * probability))
        result.append(class_id)
    result_df = pd.DataFrame(data=result)
    result_df.to_csv("result_file", index=False)
    print("result file is saved")

if __name__ == "__main__":
    main()
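Before moving on to the upload step, it is worth sanity-checking the generated result file. A minimal sketch; the expected format here (one class id per row, with the header row that pandas writes) follows the example above, not any official submission spec:

```python
import csv

def check_result_file(path: str, n_classes: int = 3) -> bool:
    """Verify that each data row of the result file holds a single
    class id in [0, n_classes). Assumes the one-column CSV with a
    header row produced by result_df.to_csv above."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    # Skip the header row; ignore any blank rows.
    ids = [int(r[0]) for r in rows[1:] if r]
    return all(0 <= i < n_classes for i in ids)
```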

4. Result upload

Upload the generated result file to COS using the TI SDK:

from ti import session
ti_session = session.Session()
inputs = ti_session.upload_data(path="result_file", bucket="demo-project-ap-guangzhou-1259675134", key_prefix="contest")

After uploading, you can view the file in the specified COS bucket or download it via the COS console.
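One lightweight way to confirm the upload arrived intact is to compare the local file's MD5 with the object's ETag shown in the COS console (for simple, non-multipart uploads the ETag is the MD5 of the content; multipart uploads use a different scheme). A sketch of the local half of that check:

```python
import hashlib

def file_md5(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the hex MD5 digest of a file, streaming in chunks
    so large result files do not need to fit in memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```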

With these steps, you have completed the end‑to‑end process of using TI‑ONE Notebook to train a model and submit results for the competition.

Written by Tencent Advertising Technology

Official hub of Tencent Advertising Technology, sharing the team's latest cutting-edge achievements and advertising technology applications.