Goal: AI/API real-life NZT-48
1. Analyze live market data
2. Machine learn winning strategies
a. Bollinger Bands strategy (for long-term positions; a quick sketch follows this list)
b. Supertrend strategy (for day trading)
c. Other trading strategies (TBD)
3. Compile and spit out calls and puts live
a. Eventually automate this task
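To make 2a concrete, here is a minimal sketch of a Bollinger Bands long signal, assuming closing prices arrive as a pandas Series; the 20-period window, 2-standard-deviation bands, and the simple mean-reversion entry/exit rule are common defaults, not choices made anywhere in this plan.

```python
# Minimal Bollinger Bands sketch (assumptions: pandas Series of closes,
# 20-period window, 2 standard deviations; none of these are fixed by the plan).
import pandas as pd

def bollinger_signals(close: pd.Series, window: int = 20, num_std: float = 2.0) -> pd.DataFrame:
    """Return the bands plus a naive long entry/exit signal."""
    middle = close.rolling(window).mean()
    std = close.rolling(window).std()
    upper = middle + num_std * std
    lower = middle - num_std * std
    signal = pd.Series(0, index=close.index)
    signal[close < lower] = 1    # candidate long entry: close below the lower band
    signal[close > middle] = -1  # candidate exit: close back above the middle band
    return pd.DataFrame({"middle": middle, "upper": upper, "lower": lower, "signal": signal})
```

The Supertrend variant for day trades would follow the same shape: a function from price history to a signal column.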
Variables
1. Price
2. Volume
3. Gains on the day
4. Breaking news
5. Political releases
Accurate variables needed per stock
1. 50% up on the day
2. Volume of at least 750k
3. Minimum share price of $0.10
4. Maximum share price of $15 (a minimal filter sketch follows this list)
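A minimal sketch of that per-stock screen, assuming each candidate comes in as a plain dict; the key names (price, volume, pct_gain_today) are placeholders rather than field names from any real feed.

```python
# Per-stock screen sketch; dict keys are assumed placeholders.
def passes_screen(stock: dict) -> bool:
    """True if a stock meets all four screening criteria above."""
    return (
        stock["pct_gain_today"] >= 50.0      # at least 50% up on the day
        and stock["volume"] >= 750_000       # volume of at least 750k
        and 0.10 <= stock["price"] <= 15.0   # share price between $0.10 and $15
    )

# Example: passes_screen({"price": 2.40, "volume": 1_200_000, "pct_gain_today": 63.0}) -> True
```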
Useful stuff
NODES
1. Scraper node
2. LONG term data library node (vígið)
3. Short term data library node (hið sérstaka)
4. Custom data input API, used for inputting our strategy and what the machine needs to
look for (dómari); an example request is sketched after this list
a. This is essentially how we tell the machine to make decisions. This will have a
variant that is smaller and used for short term data processing so we can do day
trades.
5. Output node to tell us our calls and puts and what stock
6. EVENTUAL attachment to E*TRADE to automate everything
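As an illustration of how a strategy rule might be handed to dómari once node 4 exists, here is a hypothetical request; the endpoint URL, port, and payload fields are all assumptions, since nothing about dómari's interface is defined yet.

```python
# Hypothetical strategy-rule submission to the dómari input API.
# The URL, port, and field names below are assumptions for illustration only.
import requests

rule = {
    "name": "day-trade screen",
    "min_pct_gain_today": 50.0,
    "min_volume": 750_000,
    "min_price": 0.10,
    "max_price": 15.0,
}

resp = requests.post("http://localhost:5000/domari/rules", json=rule, timeout=10)
resp.raise_for_status()
print(resp.json())
```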
DATA STRAT
1. The first dataset (connected to vígið) will be the biggest of them all; it will house
ALL the data from the last ~30 years and will be used to feed machine learning and
algorithmic practice.
a. The data collected will be
i. 1-year to max-year charts (and every interval in between where possible, in
terms of 2-year, 3-year, etc.)
ii. 1 day to 1 week charts (and every interval)
iii. 1 week to 1 month (every interval)
iv. 1 month to 12 month (every interval)
v. 1 hour to 24 hour (every interval)
vi. 1 minute to 1 hour (every interval)
vii. 1 sec to 1 minute (every interval)
b. The data will be (per interval, per stock); a sample record is sketched after this list
i. Open price
ii. Close price
iii. High price
iv. Low price
2. The stocks will be hand-curated to cut down on time, as some stocks never took off and
were always pink-sheet. The stocks recorded will range from $0.01 to x dollars per share. All
the stocks, however, will be detailed in their entirety and then analyzed for understanding
and pattern determination.
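For reference, one per-interval, per-stock record from 1a/1b might look like the sketch below when written to a handoff .json file; the field names, the ticker, and the one-record-per-line layout are assumptions, not a fixed schema.

```python
# Sample per-interval, per-stock record; field names and layout are assumptions.
import json

record = {
    "symbol": "XYZ",                      # hypothetical ticker
    "interval": "1d",                     # e.g. "1s", "1m", "1h", "1d", "1w", "1mo", "1y"
    "timestamp": "2024-01-02T00:00:00Z",  # start of the interval
    "open": 1.02,
    "high": 1.15,
    "low": 0.98,
    "close": 1.10,
}

with open("vigid_batch.json", "a") as f:
    f.write(json.dumps(record) + "\n")    # one JSON record per line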
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Blocks
1. Create APIs
a. Install Flask
b. Create the simple API structure
c. Run
d. Test
e. Further Develop
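A sketch of steps 1a-1d, assuming the "simple API structure" means one POST endpoint plus a health check; the route names and port are placeholders.

```python
# Minimal Flask API sketch; route names and port are placeholders.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/data", methods=["POST"])
def receive_data():
    """Accept a JSON payload (e.g. from the scraper) and acknowledge it."""
    payload = request.get_json(force=True)
    count = len(payload) if isinstance(payload, list) else 1
    # TODO: hand the payload off to the library node (vígið) in a later block
    return jsonify({"status": "received", "records": count}), 201

@app.route("/health", methods=["GET"])
def health():
    """Quick check for the Run and Test steps."""
    return jsonify({"status": "ok"})

if __name__ == "__main__":
    app.run(port=5000, debug=True)
```

Run it with python app.py, test it with curl or the requests library, then develop further.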
2. Create data scraper
a. (scrape tradingview.com)
b. Install dependencies
c. Simple scraper write and test
d. Dynamic content handler
i. Selenium
ii. Use TradingView variables (i.e. 50% gains on the day, under $15 per
share, etc.) and attach that
e. Set up the scraper to capture ALL numerical data, not paragraph data
i. When it comes to breaking news, it will scrape headlines and the specific
number sets we need
f. Ensure we are not violating any rules.
g. Use Selenium for automation purposes
i. Accepts Python-based code
h. Transfer results to a .json file
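A sketch of steps 2c-2h together, assuming Selenium with Chrome; the URL and CSS selectors are placeholders (TradingView's real markup differs), and per step 2f the site's terms of service need checking before any scraping runs.

```python
# Selenium scraping sketch; URL and selectors are placeholders, and
# TradingView's terms of service must be checked before running this (step f).
import json
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://www.tradingview.com/markets/stocks-usa/market-movers-gainers/")
    rows = driver.find_elements(By.CSS_SELECTOR, "table tr")     # placeholder selector
    results = []
    for row in rows:
        cells = [c.text for c in row.find_elements(By.TAG_NAME, "td")]
        if cells:
            results.append(cells)   # keep raw cell text; parse numbers downstream
finally:
    driver.quit()

with open("scrape_output.json", "w") as f:
    json.dump(results, f, indent=2)   # step h: hand off as a .json file
```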
3. Set up Library API (vígið, hið sérstaka, and dómari)
a. Once data is received by the scraper, require it to be sent to the library API
b. Transfer the data received to a simple .json file
c. Install the requests library and import it in the main Python script
d. CREATE the API itself
e. Use Cassandra or another NoSQL database
f. Set up the data schema
i. Snowflake schema for vígið
ii. Star schema for vígið sending to hið sérstaka and dómari
g. Get a port to attach to the main daily processing interface unit
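A sketch of steps 3a-3d: pushing the scraper's .json output to the library API (vígið) with the requests library; the endpoint URL is a placeholder, and whether vígið ends up backed by Cassandra or another NoSQL store (step 3e) does not change this handoff.

```python
# Handoff sketch: send scraped .json data to the library API (vígið).
# The endpoint URL below is a placeholder, not a defined interface.
import json
import requests

with open("scrape_output.json") as f:
    records = json.load(f)

resp = requests.post("http://localhost:5000/data", json=records, timeout=30)
resp.raise_for_status()
print("Library API response:", resp.json())
```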