Hi,
I'm currently testing Toad Data Point and TIC to see if this solution can replace a really old BI platform we have (Hyperion 8.3).
Our users need to create automated queries and reports that will be sent after the daily data warehouse load is complete. Is there a way to add a custom schedule frequency and manually trigger all schedules that use that frequency?
Another example would be to trigger all automated fiscal period reports once a new fiscal period is loaded and complete.
Thanks
You can create queries, reports, and other activities within TDP (Toad Data Point) Automation, then publish that automation to the TIC server and run it on a schedule. Currently we only support time-based schedules:
- Daily
- Weekly
- Monthly
- Custom
I am not sure whether that meets your requirement.
You mentioned an external event triggering the schedule; may I ask what kind of external event that is? Some kind of HTTP request?
These options are static. I'd like to be able to set up a dynamic schedule frequency.
For instance, the daily process that loads data into our data warehouse is generally complete by 8:00 AM, but when issues occur it sometimes finishes at 11:00 AM or even later. Users want to receive their reports as soon as possible, but only after this process is complete, to ensure all the data is up to date.
With Hyperion, report schedules are attached to a custom event with a name like "Daily load complete". This event is triggered at the end of our daily load process.
Another example is the fiscal period. Our business has a fiscal calendar, and many user reports and queries are based on that calendar. I'd like to add more dynamic schedule frequencies based on that fiscal calendar (i.e., trigger all fiscal reports once a fiscal period is loaded and closed).
I don't know if this could be achieved with TIC. With Hyperion, we only have to update a datetime field in Hyperion's internal database.
OK, we know more now. I am afraid we don't support dynamic schedule frequencies at the moment. We do provide SQL and RESTful API interfaces, though; if you have your own scheduler that can run a SQL script directly or call the RESTful API, that may be an alternative. One of our customers does it that way.
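For example, the final step of your daily load job could call our run-script REST API as soon as the load succeeds, so reports go out only when the data is fresh. A rough sketch in Python (the host, port, account, folder, and script names here are just placeholders; the API itself is described further below):
# Sketch: final step of the nightly ETL, run only after the load succeeds.
# All names, credentials, and the host are placeholders for illustration.
import requests

def on_daily_load_complete():
    # Ask Intelligence Central to run the named automation scripts now.
    resp = requests.post(
        "https://ticserver:8066/tic/v1/script/run",
        json={"scripts": [{"folder": "/daily_reports",
                           "script_names": ["sales_report", "inventory_report"]}]},
        auth=("root", "123"),
        verify=False,  # self-signed certificate, like curl's -k
    )
    resp.raise_for_status()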
I am still very interested in the "Daily load complete" case. You said that custom event is triggered by the daily load process. May I ask what happens once the daily load process finishes? Does it run a SQL script, send an HTTP request, run a batch script, or something else?
For the fiscal period example, I think it's a good idea. We could support it with some enhancement, but that needs approval from our PM first. I will raise the requirement with our PM.
Can you send me a link to your API documentation? And do you have any documentation about the internal database of Intelligence Central?
"May I know what will happen once daily load process finishes? Some sql script will run, send some HTTP request, run some batch script or what's others?"
Nothing special happens other than sending the reports. Once the daily data warehouse ETL is complete, we just trigger the custom events we configured on the BI platform.
So the best would be:
- Having a page on TIC to define custom events
- When clicking on an automation script on TIC, under schedule, selecting the custom event
- From our ETL, trigger the custom event when needed (by any means: TIC database update, HTTP request, API call, etc.).
- Then, TIC would run all automation scripts attached to the custom event just triggered.
We have built many custom events. Examples: Daily load complete, End of month after load, End of period after load, End of week after load, End of fiscal quarter after load, etc.
So, depending on the target time frame of their reports (daily, weekly, monthly, etc.), we have a custom event for it. For example, users don't need to know when the fiscal quarter ends; they just choose the custom event "End of fiscal quarter after load" and it takes care of delivering the report at the right time.
OK, now I understand. I will think about how to implement it.
We do not have documentation for the internal database. It is not recommended to manipulate the database directly.
The REST API to run automation scripts looks like this:
// run script api, the folder parameter should start with '/'
URI: https://[host]:[port]/tic/v1/script/run
Parameter example:
{"scripts":[
{"folder": "/normal_folder/newfolder",
"script_names": ["SCRIPT_LOG"]
}
]}
Response example:
[
    {
        "message": null,
        "object_name": "/normal_folder/newfolder/Script_log",
        "object_type": "script",
        "status": "RUNNING"
    }
]
// get script status api, the folder parameter should start with '/'. Multiple script names can be concatenated with ',' as the value of script_names.
URI: https://[host]:[port]/tic/v1/script/status?folder=%2fnormal_folder%2fnewfolder&script_names=SCRIPT_LOG
Response example:
[
    {
        "message": null,
        "object_name": "/normal_folder/newfolder/Script_log",
        "object_type": "script",
        "status": "RUNNING"
    }
]
And the batch command to call the REST API: in Windows cmd, the curl call looks like the line below. Note that the double quotes inside the JSON body need to be escaped:
curl -X POST -i -u root:123 -H "Content-Type:application/json" -k -d "{\"scripts\":[{\"folder\":\"/normal_folder/newfolder\",\"script_names\":[\"script_log\"]}]}" https://localhost:8066/tic/v1/script/run
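If your scheduler can run Python instead of cmd, the same call plus a status check could look like the sketch below. This is a minimal sketch assuming only the two endpoints above; the host, account, folder, and script name are the same placeholders as in the examples.
# Sketch: trigger a TIC automation script, then poll until it finishes.
# Uses the run and status endpoints shown above; all values are placeholders.
import time
import requests

BASE = "https://localhost:8066/tic/v1/script"
AUTH = ("root", "123")               # placeholder account from the curl example
FOLDER = "/normal_folder/newfolder"
SCRIPT = "SCRIPT_LOG"

# Kick off the script (equivalent to the curl POST above).
payload = {"scripts": [{"folder": FOLDER, "script_names": [SCRIPT]}]}
resp = requests.post(BASE + "/run", json=payload, auth=AUTH, verify=False)
resp.raise_for_status()

# Poll the status API until the script is no longer RUNNING.
# requests URL-encodes the folder's slashes, matching the %2f in the URI above.
while True:
    statuses = requests.get(
        BASE + "/status",
        params={"folder": FOLDER, "script_names": SCRIPT},
        auth=AUTH,
        verify=False,
    ).json()
    if all(s["status"] != "RUNNING" for s in statuses):
        print(statuses)
        break
    time.sleep(30)  # check again in 30 seconds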