Advanced Search#
Advanced search is Relevance AI's most flexible search endpoint. It combines vector search with exact text search, plus the ability to boost results depending on your needs. The following walks through a few toy examples showing how to quickly add complexity to your search!
!pip install -q -U RelevanceAI-dev[notebook]
## Let's use OpenAI's popular CLIP model to encode text and images into the same vector space: https://github.com/openai/CLIP
%%capture
!conda install --yes -c pytorch pytorch=1.7.1 torchvision cudatoolkit=11.0
!pip install ftfy regex tqdm
!pip install git+https://github.com/openai/CLIP.git
You can sign up/log in and find your credentials at https://cloud.tryrelevance.com/sdk/api. Once you have signed up, click on the value under "Authorization token" and paste it here.
import pandas as pd
from relevanceai import Client
client = Client()
Activation Token: ··········
Inserting data#
We use a sample ecommerce dataset with the vectors product_image_clip_vector_ and product_title_clip_vector_ already encoded for us.
from relevanceai.utils.datasets import get_ecommerce_dataset_encoded
docs = get_ecommerce_dataset_encoded()
ds = client.Dataset("advanced_search_guide")
# ds.delete()
ds.upsert_documents(docs)
✅ All documents inserted/edited successfully.
ds.schema
{'insert_date_': 'date',
'price': 'numeric',
'product_image': 'text',
'product_image_clip_vector_': {'vector': 512},
'product_link': 'text',
'product_price': 'text',
'product_title': 'text',
'product_title_clip_vector_': {'vector': 512},
'query': 'text',
'source': 'text'}
vector_fields = ds.list_vector_fields()
vector_fields
['product_image_clip_vector_', 'product_title_clip_vector_']
Simple Text Search#
results = ds.advanced_search(
query="nike", fields_to_search=["product_title"], select_fields=["product_title"]
)
pd.DataFrame(results["results"])
| | product_title | _id | _relevance |
|---|---|---|---|
| 0 | Nike Women's Summerlite Golf Glove | b37b2aea-800e-4662-8977-198f744d52bb | 7.590130 |
| 1 | Nike Dura Feel Women's Golf Glove | e725c79c-c2d2-4c6d-b77a-ed029f33813b | 7.148285 |
| 2 | Nike Junior's Range Jr Golf Shoes | 0e7a5a3d-5d17-42c4-b607-7bf9bb2625a4 | 7.148285 |
| 3 | Nike Sport Lite Women's Golf Bag | 3660e25b-8359-49b9-88c7-fca2dfd9053f | 7.148285 |
| 4 | Nike Women's Tech Xtreme Golf Glove | 8b28e438-0726-4b58-98c7-7597a43d2433 | 7.148285 |
| 5 | Nike Women's SQ Dymo Fairway Wood | adab23fd-ded8-4068-b6a2-999bfe20e5e7 | 7.148285 |
| 6 | Nike Ladies Lunar Duet Sport Golf Shoes | b655198b-4356-4ba9-b88e-1e1d6608f43e | 6.755055 |
| 7 | Nike Junior's Range Red/ White Golf Shoes | d27e70f3-2884-4490-9742-133166795d0f | 6.755055 |
| 8 | Nike Women's Lunar Duet Classic Golf Shoes | e1f3faf0-72fa-4559-9604-694699426cc2 | 6.755055 |
| 9 | Nike Air Men's Range WP Golf Shoes | e8d2552f-3ca5-4d15-9ca7-86855025b183 | 6.755055 |
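The text search above scores documents by how well the query terms match the searched field. A toy term-count scorer to illustrate the idea (real engines use TF-IDF/BM25-style weighting; the endpoint's actual scorer isn't specified in this guide):

```python
def toy_text_score(query, text):
    # counts how many query terms appear in the text (case-insensitive);
    # real scorers also weight by term rarity and field length
    terms = query.lower().split()
    tokens = text.lower().split()
    return sum(tokens.count(t) for t in terms)

print(toy_text_score("nike", "Nike Women's Summerlite Golf Glove"))  # 1
print(toy_text_score("nike", "Gear Head Mouse"))                     # 0
```

Every "Nike" product matches the query term once, which is why the text-only results above are all Nike items with similar scores.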
Simple Vector Search#
Let's prepare some functions to help us encode our data!
import torch
import clip
import requests
from PIL import Image
device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)
# First - let's encode the image based on CLIP
def encode_image(image):
# Let us download the image and then preprocess it
image = (
preprocess(Image.open(requests.get(image, stream=True).raw))
.unsqueeze(0)
.to(device)
)
# We then feed our processed image through the neural net to get a vector
with torch.no_grad():
image_features = model.encode_image(image)
# Lastly we convert it to a list so that we can send it through the SDK
return image_features.tolist()[0]
# Next - let's encode text based on CLIP
def encode_text(text):
# let us get text and then tokenize it
text = clip.tokenize([text]).to(device)
# We then feed our processed text through the neural net to get a vector
with torch.no_grad():
text_features = model.encode_text(text)
return text_features.tolist()[0]
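Once queries and documents live in the same vector space, search reduces to comparing vectors. A minimal sketch of cosine similarity, a common metric for this (an assumption here; the endpoint's exact scoring function isn't specified in this guide):

```python
import math

def cosine_similarity(a, b):
    # dot product divided by the product of the magnitudes
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# vectors pointing the same way score 1.0; orthogonal vectors score 0.0
print(cosine_similarity([1.0, 0.0], [2.0, 0.0]))  # 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0
```

This is why vector search can surface contextually related items (athletic shoes, sports gear) even when the literal query string is absent.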
# Encoding the query
query_vector = encode_text("nike")
results = ds.advanced_search(
vector_search_query=[
{"vector": query_vector, "field": "product_title_clip_vector_"}
],
select_fields=["product_title"],
)
pd.DataFrame(results["results"])
| | product_title | _id | _relevance |
|---|---|---|---|
| 0 | PS4 - Playstation 4 Console | a24c46df-0a1b-49a5-80f4-5ad61bcc6370 | 0.748447 |
| 1 | Nike Men's 'Air Visi Pro IV' Synthetic Athleti... | 0435795a-899f-4cdf-89be-a0f3f189d69e | 0.747137 |
| 2 | Nike Men's 'Air Max Pillar' Synthetic Athletic... | 57ca8324-3e8a-4926-9333-b10599edb17b | 0.733907 |
| 3 | Brica Drink Pod | bbb623f6-485b-44b3-8739-1998b15ae60d | 0.725095 |
| 4 | Gear Head Mouse | c945fe93-fff3-434b-a91f-18133ab28582 | 0.712708 |
| 5 | Gear Head Mouse | 0f1e86a8-867f-4437-8fb0-2b95a37f0c22 | 0.712708 |
| 6 | PS4 - UFC | 050a9f63-3549-4720-9be7-9daa07f868e8 | 0.702847 |
| 7 | Nike Women's 'Zoom Hyperquickness' Synthetic A... | 5536a97a-2183-4342-bc92-422aebbcbbc9 | 0.697779 |
| 8 | Nike Women's 'Zoom Hyperquickness' Synthetic A... | 00445000-a8ed-4523-b610-f70aa79d47f7 | 0.695003 |
| 9 | Nike Men's 'Jordan SC-3' Leather Athletic Shoe | 281d9edd-4be6-4c69-a846-502053f3d4e7 | 0.694744 |
Combining Text And Vector Search (Hybrid)#
Combining text and vector search allows users to get the best of both exact text search and contextual vector search. This can be done as shown below.
results = ds.advanced_search(
query="nike",
fields_to_search=["product_title"],
vector_search_query=[
{"vector": query_vector, "field": "product_title_clip_vector_"}
],
select_fields=["product_title"], # results to return
)
pd.DataFrame(results["results"])
| | product_title | _id | _relevance |
|---|---|---|---|
| 0 | Nike Women's Summerlite Golf Glove | b37b2aea-800e-4662-8977-198f744d52bb | 8.140370 |
| 1 | Nike Junior's Range Jr Golf Shoes | 0e7a5a3d-5d17-42c4-b607-7bf9bb2625a4 | 7.816567 |
| 2 | Nike Sport Lite Women's Golf Bag | 3660e25b-8359-49b9-88c7-fca2dfd9053f | 7.704053 |
| 3 | Nike Women's SQ Dymo Fairway Wood | adab23fd-ded8-4068-b6a2-999bfe20e5e7 | 7.700504 |
| 4 | Nike Dura Feel Women's Golf Glove | e725c79c-c2d2-4c6d-b77a-ed029f33813b | 7.696908 |
| 5 | Nike Women's Tech Xtreme Golf Glove | 8b28e438-0726-4b58-98c7-7597a43d2433 | 7.643136 |
| 6 | Nike Men's 'Lunarglide 6' Synthetic Athletic Shoe | 8cb26a3e-7de4-4af3-ae40-272450fa9b4d | 7.445704 |
| 7 | Nike Men's 'Lunarglide 6' Synthetic Athletic Shoe | 968a9319-fdd4-45ca-adc6-940cd83a204a | 7.440268 |
| 8 | Nike Women's SQ Dymo STR8-FIT Driver | ff52b64a-0567-4181-8753-763da7044f2f | 7.410513 |
| 9 | Nike Women's 'Lunaracer+ 3' Mesh Athletic Shoe | 0614f0a9-adcb-4c6c-939c-e7869525549c | 7.408814 |
Adjust the weighting of your vector search results#
Adjust the weighting of your vector search results to suit your needs. Simply add a weight parameter to each dictionary inside vector_search_query.
results = ds.advanced_search(
query="nike",
fields_to_search=["product_title"],
vector_search_query=[
{"vector": query_vector, "field": "product_title_clip_vector_", "weight": 0.5}
],
select_fields=["product_title"], # results to return
)
pd.DataFrame(results["results"])
| | product_title | _id | _relevance |
|---|---|---|---|
| 0 | Nike Women's Summerlite Golf Glove | b37b2aea-800e-4662-8977-198f744d52bb | 7.865250 |
| 1 | Nike Junior's Range Jr Golf Shoes | 0e7a5a3d-5d17-42c4-b607-7bf9bb2625a4 | 7.482427 |
| 2 | Nike Sport Lite Women's Golf Bag | 3660e25b-8359-49b9-88c7-fca2dfd9053f | 7.426169 |
| 3 | Nike Women's SQ Dymo Fairway Wood | adab23fd-ded8-4068-b6a2-999bfe20e5e7 | 7.424395 |
| 4 | Nike Dura Feel Women's Golf Glove | e725c79c-c2d2-4c6d-b77a-ed029f33813b | 7.422597 |
| 5 | Nike Women's Tech Xtreme Golf Glove | 8b28e438-0726-4b58-98c7-7597a43d2433 | 7.395711 |
| 6 | Nike Men's 'Lunarglide 6' Synthetic Athletic Shoe | 8cb26a3e-7de4-4af3-ae40-272450fa9b4d | 7.100379 |
| 7 | Nike Men's 'Lunarglide 6' Synthetic Athletic Shoe | 968a9319-fdd4-45ca-adc6-940cd83a204a | 7.097662 |
| 8 | Nike Women's SQ Dymo STR8-FIT Driver | ff52b64a-0567-4181-8753-763da7044f2f | 7.082784 |
| 9 | Nike Women's 'Lunaracer+ 3' Mesh Athletic Shoe | 0614f0a9-adcb-4c6c-939c-e7869525549c | 7.081935 |
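Comparing the three runs above suggests a simple additive model: the hybrid score looks like the text-only score plus a weighted vector contribution. This is an observation about these particular numbers, not a documented formula:

```python
# Observed scores for the top result (the Summerlite glove) across the runs
text_only   = 7.590130   # text search alone
hybrid_full = 8.140370   # hybrid, default vector weight
hybrid_half = 7.865250   # hybrid, vector weight 0.5

# Implied vector contribution under an additive model
vector_contribution = hybrid_full - text_only
print(round(vector_contribution, 6))                    # 0.55024
# Halving the weight halves the contribution, matching the third run
print(round(text_only + 0.5 * vector_contribution, 6))  # 7.86525
```

In other words, the weight parameter scales how much the vector similarity moves the final ranking relative to the text match.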
Multi-Vector Search Across Multiple Fields#
You can easily add more signals to your search by extending your vector search query as below.
from PIL import Image
import requests
import numpy as np
image_url = "https://static.nike.com/a/images/t_PDP_1280_v1/f_auto,q_auto:eco/e6ea66d1-fd36-4436-bcac-72ed14d8308d/wearallday-younger-shoes-5bnMmp.png"
Sample Query Image
from relevanceai import show_json
image_vector = encode_image(image_url)
results = ds.advanced_search(
query="nike",
fields_to_search=["product_title"],
vector_search_query=[
{"vector": query_vector, "field": "product_title_clip_vector_", "weight": 0.2},
{
"vector": image_vector,
"field": "product_image_clip_vector_",
"weight": 0.8,
}, ## weight the query more on the image vector
],
select_fields=[
"product_title",
"product_image",
"query",
"product_price",
], # results to return
    queryConfig={"weight": 0.1},  # adjust the weight of the traditional text search
)
display(
show_json(
results["results"],
text_fields=["product_title", "query", "product_price"],
image_fields=["product_image"],
)
)
# pd.DataFrame(results['results'])
| | product_image | product_title | query | product_price | _id |
|---|---|---|---|---|---|
| 0 | ![]() | Nike Men's 'Lunarglide 6' Synthetic Athletic Shoe | nike womens | $145.99 | 8cb26a3e-7de4-4af3-ae40-272450fa9b4d |
| 1 | ![]() | Nike Men's 'Lunarglide 6' Synthetic Athletic Shoe | nike shoes | $145.99 | 968a9319-fdd4-45ca-adc6-940cd83a204a |
| 2 | ![]() | Nike Junior's Range Jr Golf Shoes | nike shoes | $54.99 | 0e7a5a3d-5d17-42c4-b607-7bf9bb2625a4 |
| 3 | ![]() | Nike Ladies Lunar Duet Sport Golf Shoes | nike womens | $81.99 - $88.07 | 80210247-6f40-45be-8279-8743b327f1dc |
| 4 | ![]() | Nike Mens Lunar Mont Royal Spikeless Golf Shoes | nike shoes | $100.99 | e692a73b-a144-4e44-b4db-657be6db96e2 |
| 5 | ![]() | Nike Mens Lunar Cypress Spikeless Golf Shoes | nike shoes | $100.99 | fb323476-a16d-439c-9380-0bac1e10a06d |
| 6 | ![]() | Nike Ladies Lunar Duet Sport Golf Shoes | nike shoes | $81.99 - $88.07 | b655198b-4356-4ba9-b88e-1e1d6608f43e |
| 7 | ![]() | Nike Women's 'Lunaracer+ 3' Mesh Athletic Shoe | nike shoes | $107.99 | 0614f0a9-adcb-4c6c-939c-e7869525549c |
| 8 | ![]() | Nike Women's 'Lunaracer+ 3' Mesh Athletic Shoe | nike womens | $107.99 | 7baea34f-fb0a-47da-9edd-d920abddccf5 |
| 9 | ![]() | Nike Air Men's Range WP Golf Shoes | nike shoes | $90.99 - $91.04 | e8d2552f-3ca5-4d15-9ca7-86855025b183 |
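With multiple vector fields, each field contributes its similarity scaled by its weight. A toy sketch of a weighted combination using made-up similarity values (an assumption about how the per-field scores compose, not a documented formula; the 0.2/0.8 weights mirror the query above):

```python
def combined_score(scores_and_weights):
    # each entry is (similarity, weight); fields with a higher
    # weight dominate the final ranking
    return sum(weight * sim for sim, weight in scores_and_weights)

# made-up similarities for one product: title vector vs image vector
title_sim, image_sim = 0.70, 0.90
print(round(combined_score([(title_sim, 0.2), (image_sim, 0.8)]), 2))  # 0.86
```

Because the image vector carries 0.8 of the weight, products that look like the query image rank above products whose titles merely mention "nike".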
Chunk Search Guide#
Chunk search allows users to search within a chunk field, enabling more fine-grained matching. A sample chunk search query is shown below.
from relevanceai import mock_documents
documents = mock_documents()
ds = client.Dataset("mock_dataset")
ds.upsert_documents(documents)
✅ All documents inserted/edited successfully.
ds.schema
{'_chunk_': 'chunks',
'_chunk_.label': 'text',
'_chunk_.label_chunkvector_': {'chunkvector': 5},
'insert_date_': 'date',
'sample_1_description': 'text',
'sample_1_label': 'text',
'sample_1_value': 'numeric',
'sample_1_vector_': {'vector': 5},
'sample_2_description': 'text',
'sample_2_label': 'text',
'sample_2_value': 'numeric',
'sample_2_vector_': {'vector': 5},
'sample_3_description': 'text',
'sample_3_label': 'text',
'sample_3_value': 'numeric',
'sample_3_vector_': {'vector': 5}}
# Provide a chunk search
ds.advanced_search(
vector_search_query=[
{
"vector": [1, 1, 1, 1, 1],
"field": "label_chunkvector_",
"weight": 1,
"chunkConfig": {
"chunkField": "_chunk_",
"page": 0,
# the number of chunk results to return
# - stored in `_chunk_results` key
"pageSize": 3
}
},
],
)
{'afterId': [],
'aggregateStats': {},
'aggregates': {},
'aggregations': {},
'results': [{'_chunk_': [{'label': 'label_1',
'label_chunkvector_': [0.9714655321220234,
0.7128316097400133,
0.6781037943929558,
0.6488623491829022,
0.775330428892935]}],
'_chunk_results': {'_chunk_': {'_relevance': 0,
'results': [{'_relevance': 0, 'label': 'label_1'}]}},
'_id': '0fba3159-44ed-3303-ae3e-8763af736d82',
'_relevance': 0,
'insert_date_': '2022-05-13T01:21:24.679Z',
'sample_1_description': 'WRZGB',
'sample_1_label': 'label_1',
'sample_1_value': 95,
'sample_1_vector_': [0.010111141119929168,
0.8100269908459344,
0.8450143601010813,
0.5200637988452348,
0.6807143398905711],
'sample_2_description': '27MA4',
'sample_2_label': 'label_2',
'sample_2_value': 62,
'sample_2_vector_': [0.8158557111159398,
0.7079708018800909,
0.040442267483184136,
0.2550053832057586,
0.6655286701296413],
'sample_3_description': '1NJGR',
'sample_3_label': 'label_0',
'sample_3_value': 16,
'sample_3_vector_': [0.8319698111146892,
0.2970554960820262,
0.7053962091476822,
0.7616721137875679,
0.33539644279489944]},
{'_chunk_': [{'label': 'label_2',
'label_chunkvector_': [0.17573371062486798,
0.557943855238517,
0.697754222989297,
0.9786125118059382,
0.7922094154419312]}],
'_chunk_results': {'_chunk_': {'_relevance': 0,
'results': [{'_relevance': 0, 'label': 'label_2'}]}},
'_id': '51a2eb0f-94c6-3035-89b0-027b2379b3d7',
'_relevance': 0,
'insert_date_': '2022-05-13T01:21:24.679Z',
'sample_1_description': 'MH2FZ',
'sample_1_label': 'label_2',
'sample_1_value': 4,
'sample_1_vector_': [0.13122902838701556,
0.3479630189944891,
0.7020069274564608,
0.28257296541486776,
0.15930197109337352],
'sample_2_description': 'KU20B',
'sample_2_label': 'label_0',
'sample_2_value': 58,
'sample_2_vector_': [0.20753358564393043,
0.7285124067578301,
0.9003748477567735,
0.912483293611922,
0.23245362499843847],
'sample_3_description': '62ZTY',
'sample_3_label': 'label_5',
'sample_3_value': 98,
'sample_3_vector_': [0.6187077877648824,
0.4248041940356846,
0.48710139974254263,
0.769860649556282,
0.5785388950443682]},
{'_chunk_': [{'label': 'label_4',
'label_chunkvector_': [0.7338702708046809,
0.41755372242176314,
0.4912010324442426,
0.0834347624193984,
0.48279406238186817]}],
'_chunk_results': {'_chunk_': {'_relevance': 0,
'results': [{'_relevance': 0, 'label': 'label_4'}]}},
'_id': 'b56dd78b-0c29-3a00-8c6d-387655ca0a2b',
'_relevance': 0,
'insert_date_': '2022-05-13T01:21:24.679Z',
'sample_1_description': 'VPYRN',
'sample_1_label': 'label_3',
'sample_1_value': 13,
'sample_1_vector_': [0.3828314864256408,
0.36459459507507885,
0.8940227989713352,
0.8794642161978363,
0.9682486851016051],
'sample_2_description': 'ZYE1X',
'sample_2_label': 'label_5',
'sample_2_value': 66,
'sample_2_vector_': [0.12136689372267317,
0.462037834296147,
0.5120688870564564,
0.38689918710131,
0.2805130330014971],
'sample_3_description': 'TCC27',
'sample_3_label': 'label_3',
'sample_3_value': 19,
'sample_3_vector_': [0.09914254554709134,
0.920167083569516,
0.11868940231964686,
0.5438045792718624,
0.43635676728310124]},
{'_chunk_': [{'label': 'label_5',
'label_chunkvector_': [0.33436906373438624,
0.5380728845974861,
0.23972813094355927,
0.7919330405084691,
0.2878108785508634]}],
'_chunk_results': {'_chunk_': {'_relevance': 0,
'results': [{'_relevance': 0, 'label': 'label_5'}]}},
'_id': '086151be-e3e0-3c74-ace7-6292246f0fc9',
'_relevance': 0,
'insert_date_': '2022-05-13T01:21:24.679Z',
'sample_1_description': '2COZY',
'sample_1_label': 'label_2',
'sample_1_value': 96,
'sample_1_vector_': [0.7147183445018557,
0.18066520347080173,
0.9740064235203669,
0.6258224799724947,
0.3500929889622264],
'sample_2_description': 'H638P',
'sample_2_label': 'label_4',
'sample_2_value': 16,
'sample_2_vector_': [0.9450798492356538,
0.4462449289257341,
0.004355001860774199,
0.25486874541800486,
0.3482060493985143],
'sample_3_description': 'F2T24',
'sample_3_label': 'label_2',
'sample_3_value': 15,
'sample_3_vector_': [0.14630374114623268,
0.12238406234925325,
0.5542096939075382,
0.0475748252915158,
0.41292937921919615]},
{'_chunk_': [{'label': 'label_2',
'label_chunkvector_': [0.9465667341769131,
0.8306490761371044,
0.06366580368540398,
0.4169022757966413,
0.2879497402145924]}],
'_chunk_results': {'_chunk_': {'_relevance': 0,
'results': [{'_relevance': 0, 'label': 'label_2'}]}},
'_id': '32abd915-60b5-373c-825b-22ba2b7e01bf',
'_relevance': 0,
'insert_date_': '2022-05-13T01:21:24.679Z',
'sample_1_description': '5GFR7',
'sample_1_label': 'label_5',
'sample_1_value': 0,
'sample_1_vector_': [0.03266482919634517,
0.2184525410362036,
0.4272720912279113,
0.735584738472561,
0.16534557670923755],
'sample_2_description': '4F95A',
'sample_2_label': 'label_2',
'sample_2_value': 50,
'sample_2_vector_': [0.5666292182319911,
0.045574402067497854,
0.20808912259919377,
0.41197652736153034,
0.9622611439423331],
'sample_3_description': '50JO8',
'sample_3_label': 'label_4',
'sample_3_value': 91,
'sample_3_vector_': [0.8349167635041148,
0.9909929540761643,
0.36585325598630203,
0.635433668522285,
0.28632200528034224]},
{'_chunk_': [{'label': 'label_0',
'label_chunkvector_': [0.04643479539353512,
0.832710978356411,
0.27875623750147294,
0.4913456773422803,
0.5388430545812762]}],
'_chunk_results': {'_chunk_': {'_relevance': 0,
'results': [{'_relevance': 0, 'label': 'label_0'}]}},
'_id': 'c4ba6213-c9de-31e7-8102-b6078cecfeaf',
'_relevance': 0,
'insert_date_': '2022-05-13T01:21:24.679Z',
'sample_1_description': '2GTHL',
'sample_1_label': 'label_1',
'sample_1_value': 80,
'sample_1_vector_': [0.4934065280893306,
0.599044030362021,
0.23000529514903578,
0.35262850141097246,
0.447190046367118],
'sample_2_description': 'Y7DEF',
'sample_2_label': 'label_0',
'sample_2_value': 60,
'sample_2_vector_': [0.41032731851307735,
0.11788099018533249,
0.6375475627332368,
0.27037361979827434,
0.11434413934349097],
'sample_3_description': 'RN2TG',
'sample_3_label': 'label_5',
'sample_3_value': 40,
'sample_3_vector_': [0.28102035620163046,
0.7421090875142067,
0.09771653703658345,
0.10015420429876987,
0.13744357712958866]},
{'_chunk_': [{'label': 'label_0',
'label_chunkvector_': [0.44900339130825384,
0.8856780512547253,
0.5731744454632794,
0.07634302009769145,
0.126567766301261]}],
'_chunk_results': {'_chunk_': {'_relevance': 0,
'results': [{'_relevance': 0, 'label': 'label_0'}]}},
'_id': '9dc452ec-e60c-35e5-ad50-8abc544d72f5',
'_relevance': 0,
'insert_date_': '2022-05-13T01:21:24.679Z',
'sample_1_description': '6WOTX',
'sample_1_label': 'label_5',
'sample_1_value': 90,
'sample_1_vector_': [0.36754556968019914,
0.7570935190789245,
0.07080217925165144,
0.0377899628386521,
0.010935468014863448],
'sample_2_description': 'C9AQY',
'sample_2_label': 'label_0',
'sample_2_value': 66,
'sample_2_vector_': [0.8841987637795244,
0.5798869557821004,
0.629484594620124,
0.15513971487981038,
0.06784721110008496],
'sample_3_description': 'CB9MB',
'sample_3_label': 'label_1',
'sample_3_value': 19,
'sample_3_vector_': [0.07807951748335318,
0.7070506382865839,
0.7331808226921382,
0.13633307017391627,
0.22967712634144954]},
{'_chunk_': [{'label': 'label_1',
'label_chunkvector_': [0.298123417344888,
0.6109539928925158,
0.594743730194975,
0.2648613560137232,
0.8339071789779628]}],
'_chunk_results': {'_chunk_': {'_relevance': 0,
'results': [{'_relevance': 0, 'label': 'label_1'}]}},
'_id': 'cecb1a97-3540-3b0f-a4b2-b7bed4df0e10',
'_relevance': 0,
'insert_date_': '2022-05-13T01:21:24.679Z',
'sample_1_description': 'QJHMH',
'sample_1_label': 'label_3',
'sample_1_value': 42,
'sample_1_vector_': [0.664231043403755,
0.47220553157818856,
0.08584357353004624,
0.008458751015532395,
0.3591367465817318],
'sample_2_description': '18HJ7',
'sample_2_label': 'label_4',
'sample_2_value': 31,
'sample_2_vector_': [0.46163001848406293,
0.530708764060759,
0.9892401074533322,
0.2565786433160304,
0.36644315611129674],
'sample_3_description': 'Q8HBC',
'sample_3_label': 'label_0',
'sample_3_value': 46,
'sample_3_vector_': [0.15636989000338164,
0.30213016734011644,
0.5854349758809958,
0.6564881895528701,
0.7604572527984234]},
{'_chunk_': [{'label': 'label_3',
'label_chunkvector_': [0.2459123596946453,
0.9324565094950896,
0.27724503128111255,
0.0943163583176111,
0.9062322733100795]}],
'_chunk_results': {'_chunk_': {'_relevance': 0,
'results': [{'_relevance': 0, 'label': 'label_3'}]}},
'_id': '83644e36-9aea-36bf-8921-6763245fe23a',
'_relevance': 0,
'insert_date_': '2022-05-13T01:21:24.679Z',
'sample_1_description': 'Q8ZGC',
'sample_1_label': 'label_0',
'sample_1_value': 25,
'sample_1_vector_': [0.36705678637922134,
0.5030829146042314,
0.27586504612917107,
0.04638466153973042,
0.6038331836372212],
'sample_2_description': 'BKQOW',
'sample_2_label': 'label_3',
'sample_2_value': 69,
'sample_2_vector_': [0.06599734377357402,
0.7291538497710904,
0.5723644440353702,
0.6404097412423622,
0.14369410325126808],
'sample_3_description': '0D1UW',
'sample_3_label': 'label_3',
'sample_3_value': 6,
'sample_3_vector_': [0.421115379610321,
0.3275935294784218,
0.058777940280584584,
0.04186263256123568,
0.6260049143683458]},
{'_chunk_': [{'label': 'label_4',
'label_chunkvector_': [0.08549435178564124,
0.11520069704151803,
0.43403327749130916,
0.01974440345523576,
0.14372394345151063]}],
'_chunk_results': {'_chunk_': {'_relevance': 0,
'results': [{'_relevance': 0, 'label': 'label_4'}]}},
'_id': '3495a820-467a-30d8-895b-50befba38f99',
'_relevance': 0,
'insert_date_': '2022-05-13T01:21:24.679Z',
'sample_1_description': 'MOEEO',
'sample_1_label': 'label_0',
'sample_1_value': 67,
'sample_1_vector_': [0.8513889401103251,
0.7485584349006119,
0.7453551223300326,
0.6314495537016419,
0.25585253601766167],
'sample_2_description': '78D55',
'sample_2_label': 'label_1',
'sample_2_value': 3,
'sample_2_vector_': [0.29248363029303037,
0.3989529263903293,
0.2237003035286077,
0.3232937426927007,
0.535646801886282],
'sample_3_description': '53VLV',
'sample_3_label': 'label_1',
'sample_3_value': 94,
'sample_3_vector_': [0.24476704213234246,
0.582106042132727,
0.8711476351278145,
0.540170037829761,
0.6652872417327402]}],
'resultsSize': 100}
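The matched chunks are nested under each document's _chunk_results key. A small helper to pull out the matched chunk labels, written purely against the response shape shown above (the helper itself is hypothetical, not part of the SDK):

```python
def matched_chunk_labels(response, chunk_field="_chunk_"):
    # walk each document's "_chunk_results" and collect the chunk labels
    labels = []
    for doc in response["results"]:
        chunk_results = doc.get("_chunk_results", {}).get(chunk_field, {})
        for chunk in chunk_results.get("results", []):
            labels.append(chunk["label"])
    return labels

# a trimmed-down response mirroring the output above
sample = {
    "results": [
        {"_chunk_results": {"_chunk_": {"results": [{"label": "label_1"}]}}},
        {"_chunk_results": {"_chunk_": {"results": [{"label": "label_2"}]}}},
    ]
}
print(matched_chunk_labels(sample))  # ['label_1', 'label_2']
```

Note that pageSize in chunkConfig controls how many chunk results appear under each document's _chunk_results key.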