Ingest MongoDB Items into Elasticsearch

I was working on a personal project where I needed a way to save URLs / bookmarks of websites that I come across and would like to reference at a later time.

I eventually decided to develop a web app using Python Flask and MongoDB, which works great for me personally. (The full code and a demo can be found on GitHub.)

The Problem:

After some time I had about 240 entries, all saved in MongoDB. I then decided to add search functionality to the app, and felt Elasticsearch would be a great fit for it.
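
For a sense of what that buys you: once the bookmarks are indexed, a full-text search is a single query. A hypothetical example against the description field, reusing the es_client and index name defined in the script further down:

results = es_client.search(index='myindex', body={'query': {'match': {'description': 'python flask'}}})
for hit in results['hits']['hits']:
    print(hit['_source']['link'])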

The Solution:

There were a couple of great options out there for ingesting data from MongoDB into Elasticsearch, but since I am at a stage where I am strengthening my Python skills, I decided to write a Python script that connects to MongoDB and writes each item into Elasticsearch.

Python Script:

import time
from pymongo import MongoClient
from elasticsearch import Elasticsearch

# Connect to MongoDB and Elasticsearch, then select the database
mongodb_client = MongoClient('mongodb://10.0.1.11:27017')
es_client = Elasticsearch(['http://10.0.1.12:9200'])

mdb = mongodb_client['mydb']

# Recreate the index from scratch: drop it if it already exists, then
# create it fresh (the ignore flags suppress "not found" / "already
# exists" errors)
drop_index = es_client.indices.delete(index='myindex', ignore=[400, 404])
create_index = es_client.indices.create(index='myindex', ignore=400)

# Fetch every document from the MongoDB collection
data = mdb.mycollection.find()

for x in data:
    # Build the Elasticsearch document from the MongoDB fields
    doc = {
        'date': x['date'],
        'type': x['type'],
        'category': x['category'],
        'description': x['description'],
        'link': x['link']
    }

    # Note: doc_type is deprecated from Elasticsearch 7 onwards
    res = es_client.index(index="myindex", doc_type="docs", body=doc)
    time.sleep(0.2)  # brief pause between writes to avoid flooding the node

print("Done")

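One alternative worth mentioning: the elasticsearch Python library ships a bulk helper that indexes documents in batches, avoiding one HTTP round trip (and one sleep) per item. A minimal sketch under the same assumptions as the script above (on Elasticsearch 6.x each action may also need a '_type' key):

from elasticsearch.helpers import bulk

def generate_actions():
    # Yield one bulk action per MongoDB document; drop the ObjectId,
    # which is not JSON-serialisable
    for x in mdb.mycollection.find():
        x.pop('_id', None)
        yield {'_index': 'myindex', '_source': x}

success, errors = bulk(es_client, generate_actions())
print("Indexed %d documents" % success)
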
Any feedback on this would be appreciated. What do you think? Any other strategies?