Create Import Job for an existing table

You can create an import job to import data into a specified table.

REQUEST URI

https://<ZohoAnalytics_Server_URI>/restapi/v2/bulk/workspaces/<workspace-id>/views/<view-id>/data

POST

OAuth Scope: ZohoAnalytics.data.create

ACTION SPECIFIC PARAMETERS

 

Parameter Name
Description

FILE*

File

The file to be imported.

  • Format should be multipart/form-data. (This is the default format used by HTML forms that contain file-type fields for uploading files.)
  • Maximum allowed file size is 100 MB.

QUERY PARAMETERS

 

Parameter Name
Description

CONFIG*

JSONObject

Config parameter specifications are available in the section below.

FIELDS FOR CONFIG JSON

Key
Description

importType*

String


Controls the type of import.

  • append - Appends the data to the table.
  • truncateadd - Deletes all existing rows in the table and adds the imported data as new rows.
  • updateadd - Updates a row if the mentioned column values are matched, else a new row will be added.
fileType*

String


The format of the file to be imported.

  • csv
  • json
autoIdentify*

Boolean


Specifies whether the CSV format should be identified automatically.

onError

String


Controls the action to be taken in case there is an error during import.

  • abort - In case of any error, abort the whole import.
  • skiprow - In case of any error, skip the specific row(s) with the problem and continue importing the rest.
  • setcolumnempty - In case of any error, set the value of the error column for the row to empty and continue importing.
matchingColumns
(mandatory only when the importType is updateadd)

JSONArray


The values in the specified columns are compared to check whether the rows being imported match existing rows in the table.
The existing rows that match are updated with values from the imported data. The remaining rows are appended to the table as new rows.

Sample: ["column1","column2"]
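For instance, an updateadd import that matches on two columns could be configured as sketched below. The column names "OrderID" and "Region" are hypothetical; substitute the matching columns of your own table.

```python
import json

# Sketch of a CONFIG object for an "updateadd" import.
# "OrderID" and "Region" are hypothetical column names.
config = {
    "importType": "updateadd",
    "fileType": "csv",
    "autoIdentify": "true",
    "matchingColumns": ["OrderID", "Region"],
}

# The CONFIG query parameter carries this object as a JSON string.
print(json.dumps(config))
```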

selectedColumns

JSONArray


Controls the columns that need to be imported.

Sample: ["column1","column2"]

skipTop

Integer


Number of rows to be skipped from the top of the CSV file being imported.

thousandSeparator

Integer


Specifies the thousand separator used in the data.

  • 0 - COMMA
  • 1 - DOT
  • 2 - SPACE
  • 3 - SINGLE QUOTE
decimalSeparator

Integer


Specifies the decimal separator used in the data.

  • 0 - DOT
  • 1 - COMMA
dateFormat

String


Specify this in case any date field is being imported and its format cannot be auto-recognized by Zoho Analytics.

Sample: dd-MMM-YYYY
Refer to this link for more details on how to construct a custom date format.

columnDateFormat

JSONObject


Specify this in case multiple date fields with different formats are being imported.
Column name as key and date format as value.

Sample: {"columnName1":"dd-MMM-YYYY","columnName2":"MM/dd/yyyy"}

callbackUrl

String


A valid URL that should allow the HTTP POST method.
The Bulk Import Job's details are posted to this URL on successful completion or failure of the job.

matchNulls

Boolean


Controls whether rows with NULL values in matching columns should still be matched with existing rows during the import process using the UPDATEADD import type. 

By default, if a matching column contains a NULL value, the row is treated as a new entry and is not matched with any existing row. Setting matchNulls to true ensures that rows with NULL values in matching columns will be matched with existing rows that have the same non-NULL values for the other matching columns.

columnDataTypes

JSONArray


To specify the data type of the new columns while importing data.

Supported data types:

  • PLAIN
  • MULTI_LINE
  • EMAIL
  • NUMBER
  • POSITIVE_NUMBER
  • DECIMAL_NUMBER
  • CURRENCY
  • PERCENT
  • DATE
  • BOOLEAN
  • URL
  • AUTO_NUMBER
  • GEO

Sample: "columnDataTypes":[{"columnName":"Column 1","dataType":"NUMBER"},{"columnName":"Column 2","dataType":"PLAIN"},{"columnName":"Continent","dataType":"GEO","geoRole":"0"}]

CSV SPECIFIC ATTRIBUTES

Note: These attributes are mandatory if autoIdentify is set to false.

Key
Description

commentChar

Char

If the character mentioned is found at the beginning of a row, that CSV row will be skipped.

Sample: #
delimiter

Integer
The delimiter character used for separating the fields in a row in the CSV.

  • 0 - COMMA
  • 1 - TAB
  • 2 - SEMICOLON
  • 3 - SPACE
quoted

Integer
The Text Qualifier.

  • 0 - NONE
  • 1 - SINGLE QUOTE
  • 2 - DOUBLE QUOTE
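When autoIdentify is false, the CSV attributes above must be supplied in CONFIG explicitly. A sketch of such a configuration (the numeric codes follow the tables above):

```python
import json

# Sketch: CONFIG for a CSV import with autoIdentify disabled,
# so the CSV-specific attributes are spelled out explicitly.
config = {
    "importType": "append",
    "fileType": "csv",
    "autoIdentify": "false",
    "commentChar": "#",   # skip rows beginning with '#'
    "delimiter": 0,       # 0 - COMMA
    "quoted": 2,          # 2 - DOUBLE QUOTE
}

print(json.dumps(config))
```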

JSON SPECIFIC ATTRIBUTES

Note: These attributes are mandatory if autoIdentify is set to false.

Key
Description

retainColumnNames

Boolean
Controls how the column names are to be constructed from the JSON file.
Default value - false.

  • true - the final key attribute alone will be considered as the column name.
  • false - the column name will be constructed by appending all the parent attributes separated by a dot (.). This results in column names that capture the full JSON tree hierarchy, e.g., employee.Name, employee.Department.
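The effect of retainColumnNames on a nested JSON record can be illustrated with a small flattening sketch. This is not Zoho's importer, only a mimic of the documented naming rules:

```python
# Illustration of the column-name rules described above; this is
# not Zoho's importer, just a sketch of the documented behaviour.
def column_names(record, prefix="", retain_column_names=False):
    names = []
    for key, value in record.items():
        if isinstance(value, dict):
            names += column_names(value, prefix + key + ".", retain_column_names)
        else:
            # retainColumnNames=true keeps only the final key;
            # false prepends the parent attributes joined by dots.
            names.append(key if retain_column_names else prefix + key)
    return names

record = {"employee": {"Name": "Ann", "Department": "Sales"}}
print(column_names(record))                            # ['employee.Name', 'employee.Department']
print(column_names(record, retain_column_names=True))  # ['Name', 'Department']
```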

POSSIBLE ERROR CODES

7103, 7111, 7138, 7232, 7248, 7249, 8046, 8119, 8504, 8506, 8513, 8516

Sample Request

curl https://analyticsapi.zoho.com/restapi/v2/bulk/workspaces/<workspace-id>/views/<view-id>/data?CONFIG=<encoded_json_value> \
-F 'FILE=@/home/local/import.csv' \
-X 'POST' \
-H 'ZANALYTICS-ORGID: <org-id>' \
-H 'Authorization: Zoho-oauthtoken <access_token>'
using System;
using System.Collections.Generic;
using ZohoAnalytics;
using System.Text.Json;

namespace ZohoAnalyticsTest
{
    class Program
    {
        long orgId = 55522777;
        long workspaceId = 35130000001055707;
        long viewId = 35730000007354002;

        public void ImportBulkData(IAnalyticsClient ac)
        {
            string importType = "append";
            string fileType = "json";
            bool autoIdentify = true;
            string filePath = "D:\\sales_v2.json";
            IBulkAPI data = ac.GetBulkInstance(orgId, workspaceId);
            long jobId = data.ImportBulkData(viewId, importType, fileType, autoIdentify, filePath, null);
            Console.WriteLine(jobId);
        }

        static void Main(string[] args)
        {
            string clientId = "1000.xxxxxxx";
            string clientSecret = "xxxxxxx";
            string refreshToken = "1000.xxxxxxx.xxxxxxx";

            try
            {
                IAnalyticsClient ac = new AnalyticsClient(clientId, clientSecret, refreshToken);
                Program obj = new Program();
                obj.ImportBulkData(ac);
            }
            catch (ServerException ex)
            {
                Console.WriteLine("Server exception - " + ex.GetErrorMessage());
            }
            catch (Exception ex)
            {
                Console.WriteLine("Other exception - " + ex.Message);
            }
        }
    }
}
package main

import (
    "fmt"
    ZAnalytics "zoho/pkg/analyticsclient"
)

var(
    clientId = "1000.xxxxxxx"
    clientSecret = "xxxxxxx"
    refreshToken = "1000.xxxxxxx.xxxxxxx"

    orgId = "55522777"
    workspaceId = "35130000001055707"
    viewId = "35730000007354002"
)

func ImportBulkData(ac ZAnalytics.Client) {
    config := map[string]interface{}{}
    importtype := "truncateadd"
    filetype := "csv"
    autoidentify := "true"
    filepath := "/home/local/admin/Files/Sales.csv"
    bulk := ZAnalytics.GetBulkInstance(&ac, orgId, workspaceId)
    result, err := bulk.ImportBulkData(viewId, importtype, filetype, autoidentify, filepath, config)

    if(err != nil){
        fmt.Println("Error - " + err.ErrorMessage)
    }else{
        fmt.Println(result)
    }
}

func main() {

    ac := ZAnalytics.GetAnalyticsClient(clientId, clientSecret, refreshToken)
    ImportBulkData(ac)

}
import com.zoho.analytics.client.*;
import org.json.*;

public class Test {

    private long orgId = 55522777l;
    private long workspaceId = 35130000001055707l;
    private long viewId = 35730000007354002l;

    public static void main(String args[]){

        String clientId = "1000.xxxxxxx";
        String clientSecret = "xxxxxxx";
        String refreshToken = "1000.xxxxxxx.xxxxxxx";

        Test tObj = new Test();
        AnalyticsClient ac = new AnalyticsClient(clientId, clientSecret, refreshToken);

        try {
            tObj.importBulkData(ac);
        }
        catch (ServerException ex) {
            System.out.println("Server exception - ErrorCode : " + ex.getErrorCode() + ", ErrorMessage : "  + ex.getErrorMessage());
        }
        catch (ParseException ex) {
            System.out.println("Parser exception - ErrorMessage : "  + ex.getResponseMessage());
        }
        catch (Exception ex) {
            System.out.println("Other exception - ");
            ex.printStackTrace();
        }
    }

    public void importBulkData(AnalyticsClient ac) throws Exception {
        String importType = "TRUNCATEADD";
        String fileType = "csv";
        boolean autoIdentify = true;
        String filePath = "/home/local/admin/Files/Sales.csv";
        JSONObject config = new JSONObject();
        BulkAPI data = ac.getBulkInstance(orgId, workspaceId);
        long jobId = data.importBulkData(viewId, importType, fileType, autoIdentify, filePath, config);
        System.out.println(jobId);
    }
}
<?php

    require 'AnalyticsClient.php';

    class Test
    {
        public $ac = NULL;
        public $client_id = "1000.xxxxxxx";
        public $client_secret = "xxxxxxx";
        public $refresh_token = "1000.xxxxxxx.xxxxxxx";

        public $org_id = "55522777";
        public $workspace_id = "35130000001055707";
        public $view_id = "35730000007354002";

        function __construct() {
            $this->ac =  new AnalyticsClient($this->client_id, $this->client_secret, $this->refresh_token);
        }

        function importBulkData() {
            $import_type = "truncateadd";
            $file_type = "csv";
            $auto_identify = "true";
            $file_path = "/home/local/admin/Files/Sales.csv";
            $bulk = $this->ac->getBulkInstance($this->org_id, $this->workspace_id);
            $response = $bulk->importBulkData($this->view_id, $import_type, $file_type, $auto_identify, $file_path);
            print_r($response);
        }
    }

    $test_obj = new Test();

    try {
        $test_obj->importBulkData();
    }
    catch(ServerException $se) {
        echo "Server exception : " . $se->getErrorMessage() . "\n";
    }
    catch(IOException $ioe) {
        echo "IO exception : " . $ioe->getErrorMessage() . "\n";
    }
    catch(ParseException $pe) {
        echo "Parser exception : " . $pe->getErrorMessage() . "\n";
    }
    catch(Exception $e) {
        echo "Exception : " . $e->getErrorMessage() . "\n";
    }
?>
from __future__ import with_statement
from AnalyticsClient import AnalyticsClient
import sys
import json

class Config:

    CLIENTID = "1000.xxxxxxx"
    CLIENTSECRET = "xxxxxxx"
    REFRESHTOKEN = "1000.xxxxxxx.xxxxxxx"

    ORGID = "55522777"
    WORKSPACEID = "35130000001055707"
    VIEWID = "35730000007354002"

class sample:

    ac = AnalyticsClient(Config.CLIENTID, Config.CLIENTSECRET, Config.REFRESHTOKEN)

    def import_bulk_data(self, ac):
        import_type = "truncateadd"
        file_type = "csv"
        auto_identify = "true"
        file_path = "/home/local/admin/Files/Sales.csv"
        bulk = ac.get_bulk_instance(Config.ORGID, Config.WORKSPACEID)
        result = bulk.import_bulk_data(Config.VIEWID, import_type, file_type, auto_identify, file_path)        
        print(result)

try:
    obj = sample()
    obj.import_bulk_data(obj.ac)

except Exception as e:
    print(str(e))
var analyticsClient = require('./AnalyticsClient');

var clientId = '1000.xxxxxxx';
var clientSecret = 'xxxxxxx';
var refreshtoken = '1000.xxxxxxx.xxxxxxx';
var orgId = '55522777';
var workspaceId = '35130000001055707';
var viewId = '35730000007354002';

var ac = new analyticsClient(clientId, clientSecret, refreshtoken);

var importType = 'append';
var fileType = 'csv';
var autoIdentify = true;
var filePath = '/home/local/admin/Files/Sales.csv';
var bulk = ac.getBulkInstance(orgId, workspaceId);
bulk.importBulkData(viewId, importType, fileType, autoIdentify, filePath).then((response) => {
    console.log(response);

}).catch((error) => {
    console.log('errorCode : '+error.errorCode);
    console.log('errorMessage : '+error.errorMessage);
});
orgId = "55522777";
workspaceId = "35130000001055707";
viewId = "35730000007354002";

headersMap = Map();
headersMap.put("ZANALYTICS-ORGID",orgId);
fileResponse = invokeurl
[
  url :""
  type :GET
];
file = fileResponse.toFile("Sample.csv");
file.setParamName("FILE");
config = Map();
config.put("importType","append");
config.put("fileType","csv");
config.put("autoIdentify","true");
parameters = "CONFIG=" + zoho.encryption.urlEncode(config.toString());
response = invokeurl
[
  url :"https://analyticsapi.zoho.com/restapi/v2/bulk/workspaces/" + workspaceId + "/views/" + viewId + "/data" + "?" + parameters
  type :POST
  headers:headersMap
  files:file
  connection:"analytics_oauth_connection"
];
info response;

Download client libraries:  C# | GO | JAVA | PHP | PYTHON | NodeJS

Sample value for CONFIG parameter:

{
"importType":"append",
"fileType":"csv",
"autoIdentify":"true"
}
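If you are not using a client library, the CONFIG object must be URL-encoded into the query string yourself. A sketch of building the request URL with the Python standard library (the IDs are the placeholder values from the samples above):

```python
import json
from urllib.parse import quote

# Placeholder IDs taken from the samples above; substitute your own.
workspace_id = "35130000001055707"
view_id = "35730000007354002"

config = {"importType": "append", "fileType": "csv", "autoIdentify": "true"}
encoded = quote(json.dumps(config))

url = (
    "https://analyticsapi.zoho.com/restapi/v2/bulk/workspaces/"
    f"{workspace_id}/views/{view_id}/data?CONFIG={encoded}"
)
print(url)
# The file itself is then sent as a multipart/form-data POST to this URL,
# with the ZANALYTICS-ORGID and Authorization headers shown in the curl sample.
```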

Sample Response

HTTP/1.1 200 OK
Content-Type:application/json;charset=UTF-8

{
    "status": "success",
    "summary": "Create bulk import job",
    "data": {
        "jobId": "1757024000003153087"
    }
}