AMMICO/build/html/notebooks/Example multimodal.ipynb


{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"id": "22df2297-0629-45aa-b88c-6c61f1544db6",
"metadata": {},
"source": [
"# Image Multimodal Search"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "9eeeb302-296e-48dc-86c7-254aa02f2b3a",
"metadata": {},
"source": [
"This notebook shows how to carry out a multimodal image search with the [LAVIS](https://github.com/salesforce/LAVIS) library.\n",
"\n",
"The first cell is only run on Google Colab and installs the [ammico](https://github.com/ssciwr/AMMICO) package.\n",
"\n",
"After that, we can import `ammico` and read in the files given a folder path."
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "0b0a6bdf",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:22.826250Z",
"iopub.status.busy": "2023-10-18T21:36:22.825863Z",
"iopub.status.idle": "2023-10-18T21:36:22.834510Z",
"shell.execute_reply": "2023-10-18T21:36:22.833555Z"
}
},
"outputs": [],
"source": [
"# if running on google colab\n",
"# flake8-noqa-cell\n",
"import os\n",
"\n",
"if \"google.colab\" in str(get_ipython()):\n",
" # update python version\n",
" # install setuptools\n",
" # %pip install setuptools==61 -qqq\n",
" # install ammico\n",
" %pip install git+https://github.com/ssciwr/ammico.git -qqq\n",
" # mount google drive for data and API key\n",
" from google.colab import drive\n",
"\n",
" drive.mount(\"/content/drive\")"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "f10ad6c9-b1a0-4043-8c5d-ed660d77be37",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:22.837225Z",
"iopub.status.busy": "2023-10-18T21:36:22.836814Z",
"iopub.status.idle": "2023-10-18T21:36:33.264880Z",
"shell.execute_reply": "2023-10-18T21:36:33.264240Z"
},
"tags": []
},
"outputs": [],
"source": [
"import ammico.utils as mutils\n",
"import ammico.multimodal_search as ms"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "8d3fe589-ff3c-4575-b8f5-650db85596bc",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:33.268332Z",
"iopub.status.busy": "2023-10-18T21:36:33.267516Z",
"iopub.status.idle": "2023-10-18T21:36:33.272410Z",
"shell.execute_reply": "2023-10-18T21:36:33.271730Z"
},
"tags": []
},
"outputs": [],
"source": [
"images = mutils.find_files(\n",
" path=\"data/\",\n",
" limit=10,\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "a08bd3a9-e954-4a0e-ad64-6817abd3a25a",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:33.277211Z",
"iopub.status.busy": "2023-10-18T21:36:33.275878Z",
"iopub.status.idle": "2023-10-18T21:36:33.285220Z",
"shell.execute_reply": "2023-10-18T21:36:33.284706Z"
}
},
"outputs": [
{
"data": {
"text/plain": [
"{'102141_2_eng': {'filename': 'data/102141_2_eng.png'},\n",
" '102730_eng': {'filename': 'data/102730_eng.png'},\n",
" '106349S_por': {'filename': 'data/106349S_por.png'}}"
]
},
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"images"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "adf3db21-1f8b-4d44-bbef-ef0acf4623a0",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:33.288672Z",
"iopub.status.busy": "2023-10-18T21:36:33.288074Z",
"iopub.status.idle": "2023-10-18T21:36:33.291110Z",
"shell.execute_reply": "2023-10-18T21:36:33.290609Z"
},
"tags": []
},
"outputs": [],
"source": [
"mydict = mutils.initialize_dict(images)"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "4c091f95-07cf-42c3-82c8-5f3a3c5929f8",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:33.293605Z",
"iopub.status.busy": "2023-10-18T21:36:33.293270Z",
"iopub.status.idle": "2023-10-18T21:36:33.297253Z",
"shell.execute_reply": "2023-10-18T21:36:33.296767Z"
}
},
"outputs": [
{
"data": {
"text/plain": [
"{'102141_2_eng': {'filename': '102141_2_eng'},\n",
" '102730_eng': {'filename': '102730_eng'},\n",
" '106349S_por': {'filename': '106349S_por'}}"
]
},
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"mydict"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "987540a8-d800-4c70-a76b-7bfabaf123fa",
"metadata": {},
"source": [
"## Indexing and extracting features from images in the selected folder"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "66d6ede4-00bc-4aeb-9a36-e52d7de33fe5",
"metadata": {},
"source": [
"First, you need to select a model. You can choose one of the following models:\n",
"- [blip](https://github.com/salesforce/BLIP)\n",
"- [blip2](https://huggingface.co/docs/transformers/main/model_doc/blip-2) \n",
"- [albef](https://github.com/salesforce/ALBEF) \n",
"- [clip_base](https://github.com/openai/CLIP/blob/main/model-card.md)\n",
"- [clip_vitl14](https://github.com/mlfoundations/open_clip) \n",
"- [clip_vitl14_336](https://github.com/mlfoundations/open_clip)"
]
},
{
"cell_type": "code",
"execution_count": 7,
"id": "7bbca1f0-d4b0-43cd-8e05-ee39d37c328e",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:33.300124Z",
"iopub.status.busy": "2023-10-18T21:36:33.299544Z",
"iopub.status.idle": "2023-10-18T21:36:33.302425Z",
"shell.execute_reply": "2023-10-18T21:36:33.301938Z"
},
"tags": []
},
"outputs": [],
"source": [
"model_type = \"blip\"\n",
"# model_type = \"blip2\"\n",
"# model_type = \"albef\"\n",
"# model_type = \"clip_base\"\n",
"# model_type = \"clip_vitl14\"\n",
"# model_type = \"clip_vitl14_336\""
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "357828c9",
"metadata": {},
"source": [
"To process the loaded images with the selected model, use the code below:"
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "f6f2c9b1-4a91-47cb-86b5-2c9c67e4837b",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:33.305125Z",
"iopub.status.busy": "2023-10-18T21:36:33.304574Z",
"iopub.status.idle": "2023-10-18T21:36:33.307468Z",
"shell.execute_reply": "2023-10-18T21:36:33.306967Z"
}
},
"outputs": [],
"source": [
"my_obj = ms.MultimodalSearch(mydict)"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "16603ded-078e-4362-847b-57ad76829327",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:33.310100Z",
"iopub.status.busy": "2023-10-18T21:36:33.309545Z",
"iopub.status.idle": "2023-10-18T21:36:33.313480Z",
"shell.execute_reply": "2023-10-18T21:36:33.312994Z"
}
},
"outputs": [
{
"data": {
"text/plain": [
"{'102141_2_eng': {'filename': '102141_2_eng'},\n",
" '102730_eng': {'filename': '102730_eng'},\n",
" '106349S_por': {'filename': '106349S_por'}}"
]
},
"execution_count": 9,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"my_obj.subdict"
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "ca095404-57d0-4f5d-aeb0-38c232252b17",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:33.316211Z",
"iopub.status.busy": "2023-10-18T21:36:33.315639Z",
"iopub.status.idle": "2023-10-18T21:36:57.319500Z",
"shell.execute_reply": "2023-10-18T21:36:57.318780Z"
},
"tags": []
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"\r",
"(…)bert-base-uncased/resolve/main/vocab.txt: 0%| | 0.00/232k [00:00<?, ?B/s]"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"\r",
"(…)bert-base-uncased/resolve/main/vocab.txt: 100%|██████████| 232k/232k [00:00<00:00, 4.38MB/s]"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"\r",
"(…)cased/resolve/main/tokenizer_config.json: 0%| | 0.00/28.0 [00:00<?, ?B/s]"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"\r",
"(…)cased/resolve/main/tokenizer_config.json: 100%|██████████| 28.0/28.0 [00:00<00:00, 4.70kB/s]"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"\r",
"(…)rt-base-uncased/resolve/main/config.json: 0%| | 0.00/570 [00:00<?, ?B/s]"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"\r",
"(…)rt-base-uncased/resolve/main/config.json: 100%|██████████| 570/570 [00:00<00:00, 517kB/s]"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"\r",
"100%|██████████| 1.97G/1.97G [00:14<00:00, 145MB/s]"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"\n"
]
},
{
"ename": "FileNotFoundError",
"evalue": "[Errno 2] No such file or directory: '102141_2_eng'",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mFileNotFoundError\u001b[0m Traceback (most recent call last)",
"Cell \u001b[0;32mIn[10], line 8\u001b[0m\n\u001b[1;32m 1\u001b[0m (\n\u001b[1;32m 2\u001b[0m model,\n\u001b[1;32m 3\u001b[0m vis_processors,\n\u001b[1;32m 4\u001b[0m txt_processors,\n\u001b[1;32m 5\u001b[0m image_keys,\n\u001b[1;32m 6\u001b[0m image_names,\n\u001b[1;32m 7\u001b[0m features_image_stacked,\n\u001b[0;32m----> 8\u001b[0m ) \u001b[38;5;241m=\u001b[39m \u001b[43mmy_obj\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mparsing_images\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 9\u001b[0m \u001b[43m \u001b[49m\u001b[43mmodel_type\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\n\u001b[1;32m 10\u001b[0m \u001b[43m \u001b[49m\u001b[43mpath_to_save_tensors\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mdata/\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\n\u001b[1;32m 11\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n",
"File \u001b[0;32m~/work/AMMICO/AMMICO/ammico/multimodal_search.py:371\u001b[0m, in \u001b[0;36mMultimodalSearch.parsing_images\u001b[0;34m(self, model_type, path_to_save_tensors, path_to_load_tensors)\u001b[0m\n\u001b[1;32m 366\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 367\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mSyntaxError\u001b[39;00m(\n\u001b[1;32m 368\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mPlease, use one of the following models: blip2, blip, albef, clip_base, clip_vitl14, clip_vitl14_336\u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 369\u001b[0m )\n\u001b[0;32m--> 371\u001b[0m _, images_tensors \u001b[38;5;241m=\u001b[39m \u001b[43mMultimodalSearch\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mread_and_process_images\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 372\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mimage_names\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mvis_processors\u001b[49m\n\u001b[1;32m 373\u001b[0m \u001b[43m\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 374\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m path_to_load_tensors \u001b[38;5;129;01mis\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m:\n\u001b[1;32m 375\u001b[0m \u001b[38;5;28;01mwith\u001b[39;00m torch\u001b[38;5;241m.\u001b[39mno_grad():\n",
"File \u001b[0;32m~/work/AMMICO/AMMICO/ammico/multimodal_search.py:168\u001b[0m, in \u001b[0;36mMultimodalSearch.read_and_process_images\u001b[0;34m(self, image_paths, vis_processor)\u001b[0m\n\u001b[1;32m 156\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mread_and_process_images\u001b[39m(\u001b[38;5;28mself\u001b[39m, image_paths: \u001b[38;5;28mlist\u001b[39m, vis_processor) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m \u001b[38;5;28mtuple\u001b[39m:\n\u001b[1;32m 157\u001b[0m \u001b[38;5;250m \u001b[39m\u001b[38;5;124;03m\"\"\"\u001b[39;00m\n\u001b[1;32m 158\u001b[0m \u001b[38;5;124;03m Read and process images with vis_processor.\u001b[39;00m\n\u001b[1;32m 159\u001b[0m \n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 166\u001b[0m \u001b[38;5;124;03m images_tensors (torch.Tensor): tensors of images stacked in device.\u001b[39;00m\n\u001b[1;32m 167\u001b[0m \u001b[38;5;124;03m \"\"\"\u001b[39;00m\n\u001b[0;32m--> 168\u001b[0m raw_images \u001b[38;5;241m=\u001b[39m [MultimodalSearch\u001b[38;5;241m.\u001b[39mread_img(\u001b[38;5;28mself\u001b[39m, path) \u001b[38;5;28;01mfor\u001b[39;00m path \u001b[38;5;129;01min\u001b[39;00m image_paths]\n\u001b[1;32m 169\u001b[0m images \u001b[38;5;241m=\u001b[39m [\n\u001b[1;32m 170\u001b[0m vis_processor[\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124meval\u001b[39m\u001b[38;5;124m\"\u001b[39m](r_img)\n\u001b[1;32m 171\u001b[0m \u001b[38;5;241m.\u001b[39munsqueeze(\u001b[38;5;241m0\u001b[39m)\n\u001b[1;32m 172\u001b[0m \u001b[38;5;241m.\u001b[39mto(MultimodalSearch\u001b[38;5;241m.\u001b[39mmultimodal_device)\n\u001b[1;32m 173\u001b[0m \u001b[38;5;28;01mfor\u001b[39;00m r_img \u001b[38;5;129;01min\u001b[39;00m raw_images\n\u001b[1;32m 174\u001b[0m ]\n\u001b[1;32m 175\u001b[0m images_tensors \u001b[38;5;241m=\u001b[39m torch\u001b[38;5;241m.\u001b[39mstack(images)\n",
"File \u001b[0;32m~/work/AMMICO/AMMICO/ammico/multimodal_search.py:168\u001b[0m, in \u001b[0;36m<listcomp>\u001b[0;34m(.0)\u001b[0m\n\u001b[1;32m 156\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mread_and_process_images\u001b[39m(\u001b[38;5;28mself\u001b[39m, image_paths: \u001b[38;5;28mlist\u001b[39m, vis_processor) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m \u001b[38;5;28mtuple\u001b[39m:\n\u001b[1;32m 157\u001b[0m \u001b[38;5;250m \u001b[39m\u001b[38;5;124;03m\"\"\"\u001b[39;00m\n\u001b[1;32m 158\u001b[0m \u001b[38;5;124;03m Read and process images with vis_processor.\u001b[39;00m\n\u001b[1;32m 159\u001b[0m \n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 166\u001b[0m \u001b[38;5;124;03m images_tensors (torch.Tensor): tensors of images stacked in device.\u001b[39;00m\n\u001b[1;32m 167\u001b[0m \u001b[38;5;124;03m \"\"\"\u001b[39;00m\n\u001b[0;32m--> 168\u001b[0m raw_images \u001b[38;5;241m=\u001b[39m [\u001b[43mMultimodalSearch\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mread_img\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mpath\u001b[49m\u001b[43m)\u001b[49m \u001b[38;5;28;01mfor\u001b[39;00m path \u001b[38;5;129;01min\u001b[39;00m image_paths]\n\u001b[1;32m 169\u001b[0m images \u001b[38;5;241m=\u001b[39m [\n\u001b[1;32m 170\u001b[0m vis_processor[\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124meval\u001b[39m\u001b[38;5;124m\"\u001b[39m](r_img)\n\u001b[1;32m 171\u001b[0m \u001b[38;5;241m.\u001b[39munsqueeze(\u001b[38;5;241m0\u001b[39m)\n\u001b[1;32m 172\u001b[0m \u001b[38;5;241m.\u001b[39mto(MultimodalSearch\u001b[38;5;241m.\u001b[39mmultimodal_device)\n\u001b[1;32m 173\u001b[0m \u001b[38;5;28;01mfor\u001b[39;00m r_img \u001b[38;5;129;01min\u001b[39;00m raw_images\n\u001b[1;32m 174\u001b[0m ]\n\u001b[1;32m 175\u001b[0m images_tensors \u001b[38;5;241m=\u001b[39m torch\u001b[38;5;241m.\u001b[39mstack(images)\n",
"File \u001b[0;32m~/work/AMMICO/AMMICO/ammico/multimodal_search.py:153\u001b[0m, in \u001b[0;36mMultimodalSearch.read_img\u001b[0;34m(self, filepath)\u001b[0m\n\u001b[1;32m 143\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mread_img\u001b[39m(\u001b[38;5;28mself\u001b[39m, filepath: \u001b[38;5;28mstr\u001b[39m) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m Image:\n\u001b[1;32m 144\u001b[0m \u001b[38;5;250m \u001b[39m\u001b[38;5;124;03m\"\"\"\u001b[39;00m\n\u001b[1;32m 145\u001b[0m \u001b[38;5;124;03m Load Image from filepath.\u001b[39;00m\n\u001b[1;32m 146\u001b[0m \n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 151\u001b[0m \u001b[38;5;124;03m raw_image (PIL.Image): image.\u001b[39;00m\n\u001b[1;32m 152\u001b[0m \u001b[38;5;124;03m \"\"\"\u001b[39;00m\n\u001b[0;32m--> 153\u001b[0m raw_image \u001b[38;5;241m=\u001b[39m \u001b[43mImage\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mopen\u001b[49m\u001b[43m(\u001b[49m\u001b[43mfilepath\u001b[49m\u001b[43m)\u001b[49m\u001b[38;5;241m.\u001b[39mconvert(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mRGB\u001b[39m\u001b[38;5;124m\"\u001b[39m)\n\u001b[1;32m 154\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m raw_image\n",
"File \u001b[0;32m/opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/site-packages/PIL/Image.py:3243\u001b[0m, in \u001b[0;36mopen\u001b[0;34m(fp, mode, formats)\u001b[0m\n\u001b[1;32m 3240\u001b[0m filename \u001b[38;5;241m=\u001b[39m fp\n\u001b[1;32m 3242\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m filename:\n\u001b[0;32m-> 3243\u001b[0m fp \u001b[38;5;241m=\u001b[39m \u001b[43mbuiltins\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mopen\u001b[49m\u001b[43m(\u001b[49m\u001b[43mfilename\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mrb\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\n\u001b[1;32m 3244\u001b[0m exclusive_fp \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mTrue\u001b[39;00m\n\u001b[1;32m 3246\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n",
"\u001b[0;31mFileNotFoundError\u001b[0m: [Errno 2] No such file or directory: '102141_2_eng'"
]
}
],
"source": [
"(\n",
" model,\n",
" vis_processors,\n",
" txt_processors,\n",
" image_keys,\n",
" image_names,\n",
" features_image_stacked,\n",
") = my_obj.parsing_images(\n",
" model_type,\n",
" path_to_save_tensors=\"data/\",\n",
" )"
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "f236c3b1-c3a6-471a-9fc5-ef831b675286",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:57.322855Z",
"iopub.status.busy": "2023-10-18T21:36:57.322287Z",
"iopub.status.idle": "2023-10-18T21:36:57.352057Z",
"shell.execute_reply": "2023-10-18T21:36:57.351410Z"
}
},
"outputs": [
{
"ename": "NameError",
"evalue": "name 'features_image_stacked' is not defined",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mNameError\u001b[0m Traceback (most recent call last)",
"Cell \u001b[0;32mIn[11], line 1\u001b[0m\n\u001b[0;32m----> 1\u001b[0m \u001b[43mfeatures_image_stacked\u001b[49m\n",
"\u001b[0;31mNameError\u001b[0m: name 'features_image_stacked' is not defined"
]
}
],
"source": [
"features_image_stacked"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "9ff8a894-566b-4c4f-acca-21c50b5b1f52",
"metadata": {},
"source": [
"The images are then processed and stored in a numerical representation, a tensor. These tensors do not change for the same image and the same model, so if you run this analysis once and save the tensors by passing a path with the keyword `path_to_save_tensors`, a file named `<Number_of_images>_<model_name>_saved_features_image.pt` will be placed there.\n",
"\n",
"This will save you a lot of time if you want to analyse the same images with the same model but different queries. To run using the saved tensors, execute the code below, giving the path and name of the tensor file."
]
},
{
"cell_type": "code",
"execution_count": 12,
"id": "56c6d488-f093-4661-835a-5c73a329c874",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:57.355315Z",
"iopub.status.busy": "2023-10-18T21:36:57.354703Z",
"iopub.status.idle": "2023-10-18T21:36:57.358439Z",
"shell.execute_reply": "2023-10-18T21:36:57.357888Z"
},
"tags": []
},
"outputs": [],
"source": [
"# (\n",
"# model,\n",
"# vis_processors,\n",
"# txt_processors,\n",
"# image_keys,\n",
"# image_names,\n",
"# features_image_stacked,\n",
"# ) = my_obj.parsing_images(\n",
"# model_type,\n",
"# path_to_load_tensors=\"/content/drive/MyDrive/misinformation-data/5_clip_base_saved_features_image.pt\",\n",
"# )"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "309923c1-d6f8-4424-8fca-bde5f3a98b38",
"metadata": {},
"source": [
"Here we already processed our image folder with 5 images and the `clip_base` model. So you need just to write the name `5_clip_base_saved_features_image.pt` of the saved file that consists of tensors of all images as keyword argument for `path_to_load_tensors`. "
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "162a52e8-6652-4897-b92e-645cab07aaef",
"metadata": {},
"source": [
"## Formulate your search queries\n",
"\n",
"Next, you need to form search queries. You can search either by image or by text. You can search for a single query, or you can search for several queries at once, the computational time should not be much different. The format of the queries is as follows:"
]
},
{
"cell_type": "code",
"execution_count": 13,
"id": "c4196a52-d01e-42e4-8674-5712f7d6f792",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:57.361401Z",
"iopub.status.busy": "2023-10-18T21:36:57.360979Z",
"iopub.status.idle": "2023-10-18T21:36:57.365485Z",
"shell.execute_reply": "2023-10-18T21:36:57.364755Z"
},
"tags": []
},
"outputs": [],
"source": [
"search_query3 = [\n",
" {\"text_input\": \"politician press conference\"},\n",
" {\"text_input\": \"a world map\"},\n",
" {\"text_input\": \"a dog\"},\n",
"]"
]
},
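{
"attachments": {},
"cell_type": "markdown",
"id": "c3a2b1d0-5e6f-4b3c-8d9e-1f2a3b4c5d6e",
"metadata": {},
"source": [
"You can also search by image; in that case a query is a dictionary with an `\"image\"` key pointing to an image file, as in the other AMMICO examples (the path below is only illustrative):\n",
"\n",
"```python\n",
"search_query_by_image = [\n",
"    {\"image\": \"data/102141_2_eng.png\"},  # hypothetical image path\n",
"]\n",
"```"
]
},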
{
"attachments": {},
"cell_type": "markdown",
"id": "8bcf3127-3dfd-4ff4-b9e7-a043099b1418",
"metadata": {},
"source": [
"You can filter your results in 3 different ways:\n",
"- `filter_number_of_images` limits the number of images found. That is, if the parameter `filter_number_of_images = 10`, then the first 10 images that best match the query will be shown. The other images ranks will be set to `None` and the similarity value to `0`.\n",
"- `filter_val_limit` limits the output of images with a similarity value not bigger than `filter_val_limit`. That is, if the parameter `filter_val_limit = 0.2`, all images with similarity less than 0.2 will be discarded.\n",
"- `filter_rel_error` (percentage) limits the output of images with a similarity value not bigger than `100 * abs(current_simularity_value - best_simularity_value_in_current_search)/best_simularity_value_in_current_search < filter_rel_error`. That is, if we set filter_rel_error = 30, it means that if the top1 image have 0.5 similarity value, we discard all image with similarity less than 0.35."
]
},
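{
"attachments": {},
"cell_type": "markdown",
"id": "b2f1c9d0-4f6e-4a2a-9c3e-0a1b2c3d4e5f",
"metadata": {},
"source": [
"As a minimal sketch of the `filter_rel_error` criterion (the variable and function names below are only illustrative, not part of the AMMICO API), an image is kept if its similarity value deviates from the best value of the current search by less than the given percentage:\n",
"\n",
"```python\n",
"best_val = 0.5  # similarity value of the top-1 image\n",
"filter_rel_error = 30  # percent\n",
"\n",
"def keep_image(val):\n",
"    # relative deviation from the best similarity value, in percent\n",
"    return 100 * abs(val - best_val) / best_val < filter_rel_error\n",
"\n",
"keep_image(0.4)  # kept: 20% below the best value\n",
"keep_image(0.3)  # discarded: 40% below the best value\n",
"```"
]
},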
{
"cell_type": "code",
"execution_count": 14,
"id": "7f7dc52f-7ee9-4590-96b7-e0d9d3b82378",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:57.368359Z",
"iopub.status.busy": "2023-10-18T21:36:57.367837Z",
"iopub.status.idle": "2023-10-18T21:36:57.395892Z",
"shell.execute_reply": "2023-10-18T21:36:57.395249Z"
},
"tags": []
},
"outputs": [
{
"ename": "NameError",
"evalue": "name 'model' is not defined",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mNameError\u001b[0m Traceback (most recent call last)",
"Cell \u001b[0;32mIn[14], line 2\u001b[0m\n\u001b[1;32m 1\u001b[0m similarity, sorted_lists \u001b[38;5;241m=\u001b[39m my_obj\u001b[38;5;241m.\u001b[39mmultimodal_search(\n\u001b[0;32m----> 2\u001b[0m \u001b[43mmodel\u001b[49m,\n\u001b[1;32m 3\u001b[0m vis_processors,\n\u001b[1;32m 4\u001b[0m txt_processors,\n\u001b[1;32m 5\u001b[0m model_type,\n\u001b[1;32m 6\u001b[0m image_keys,\n\u001b[1;32m 7\u001b[0m features_image_stacked,\n\u001b[1;32m 8\u001b[0m search_query3,\n\u001b[1;32m 9\u001b[0m filter_number_of_images\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m20\u001b[39m,\n\u001b[1;32m 10\u001b[0m )\n",
"\u001b[0;31mNameError\u001b[0m: name 'model' is not defined"
]
}
],
"source": [
"similarity, sorted_lists = my_obj.multimodal_search(\n",
" model,\n",
" vis_processors,\n",
" txt_processors,\n",
" model_type,\n",
" image_keys,\n",
" features_image_stacked,\n",
" search_query3,\n",
" filter_number_of_images=20,\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 15,
"id": "65210ca2-b674-44bd-807a-4165e14bad74",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:57.398627Z",
"iopub.status.busy": "2023-10-18T21:36:57.398193Z",
"iopub.status.idle": "2023-10-18T21:36:57.424165Z",
"shell.execute_reply": "2023-10-18T21:36:57.423588Z"
}
},
"outputs": [
{
"ename": "NameError",
"evalue": "name 'similarity' is not defined",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mNameError\u001b[0m Traceback (most recent call last)",
"Cell \u001b[0;32mIn[15], line 1\u001b[0m\n\u001b[0;32m----> 1\u001b[0m \u001b[43msimilarity\u001b[49m\n",
"\u001b[0;31mNameError\u001b[0m: name 'similarity' is not defined"
]
}
],
"source": [
"similarity"
]
},
{
"cell_type": "code",
"execution_count": 16,
"id": "557473df-e2b9-4ef0-9439-3daadf6741ac",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:57.426994Z",
"iopub.status.busy": "2023-10-18T21:36:57.426460Z",
"iopub.status.idle": "2023-10-18T21:36:57.454919Z",
"shell.execute_reply": "2023-10-18T21:36:57.453924Z"
}
},
"outputs": [
{
"ename": "NameError",
"evalue": "name 'sorted_lists' is not defined",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mNameError\u001b[0m Traceback (most recent call last)",
"Cell \u001b[0;32mIn[16], line 1\u001b[0m\n\u001b[0;32m----> 1\u001b[0m \u001b[43msorted_lists\u001b[49m\n",
"\u001b[0;31mNameError\u001b[0m: name 'sorted_lists' is not defined"
]
}
],
"source": [
"sorted_lists"
]
},
{
"cell_type": "code",
"execution_count": 17,
"id": "c93d7e88-594d-4095-b5f2-7bf01210dc61",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:57.457964Z",
"iopub.status.busy": "2023-10-18T21:36:57.457422Z",
"iopub.status.idle": "2023-10-18T21:36:57.463804Z",
"shell.execute_reply": "2023-10-18T21:36:57.463242Z"
}
},
"outputs": [
{
"data": {
"text/plain": [
"{'102141_2_eng': {'filename': '102141_2_eng'},\n",
" '102730_eng': {'filename': '102730_eng'},\n",
" '106349S_por': {'filename': '106349S_por'}}"
]
},
"execution_count": 17,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"mydict"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "e1cf7e46-0c2c-4fb2-b89a-ef585ccb9339",
"metadata": {},
"source": [
"After launching `multimodal_search` function, the results of each query will be added to the source dictionary. "
]
},
{
"cell_type": "code",
"execution_count": 18,
"id": "9ad74b21-6187-4a58-9ed8-fd3e80f5a4ed",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:57.466849Z",
"iopub.status.busy": "2023-10-18T21:36:57.466297Z",
"iopub.status.idle": "2023-10-18T21:36:57.472544Z",
"shell.execute_reply": "2023-10-18T21:36:57.471930Z"
},
"tags": []
},
"outputs": [
{
"data": {
"text/plain": [
"{'filename': '106349S_por'}"
]
},
"execution_count": 18,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"mydict[\"106349S_por\"]"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "cd3ee120-8561-482b-a76a-e8f996783325",
"metadata": {},
"source": [
"A special function was written to present the search results conveniently. "
]
},
{
"cell_type": "code",
"execution_count": 19,
"id": "4324e4fd-e9aa-4933-bb12-074d54e0c510",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:57.475401Z",
"iopub.status.busy": "2023-10-18T21:36:57.474990Z",
"iopub.status.idle": "2023-10-18T21:36:57.559875Z",
"shell.execute_reply": "2023-10-18T21:36:57.559211Z"
},
"tags": []
},
"outputs": [
{
"data": {
"text/plain": [
"'Your search query: politician press conference'"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"data": {
"text/plain": [
"'--------------------------------------------------'"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"data": {
"text/plain": [
"'Results:'"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"ename": "KeyError",
"evalue": "'politician press conference'",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mKeyError\u001b[0m Traceback (most recent call last)",
"Cell \u001b[0;32mIn[19], line 1\u001b[0m\n\u001b[0;32m----> 1\u001b[0m \u001b[43mmy_obj\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mshow_results\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 2\u001b[0m \u001b[43m \u001b[49m\u001b[43msearch_query3\u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;241;43m0\u001b[39;49m\u001b[43m]\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 3\u001b[0m \u001b[43m)\u001b[49m\n",
"File \u001b[0;32m~/work/AMMICO/AMMICO/ammico/multimodal_search.py:970\u001b[0m, in \u001b[0;36mMultimodalSearch.show_results\u001b[0;34m(self, query, itm, image_gradcam_with_itm)\u001b[0m\n\u001b[1;32m 967\u001b[0m current_querry_val \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mlist\u001b[39m(query\u001b[38;5;241m.\u001b[39mvalues())[\u001b[38;5;241m0\u001b[39m]\n\u001b[1;32m 968\u001b[0m current_querry_rank \u001b[38;5;241m=\u001b[39m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mrank \u001b[39m\u001b[38;5;124m\"\u001b[39m \u001b[38;5;241m+\u001b[39m \u001b[38;5;28mlist\u001b[39m(query\u001b[38;5;241m.\u001b[39mvalues())[\u001b[38;5;241m0\u001b[39m]\n\u001b[0;32m--> 970\u001b[0m \u001b[38;5;28;01mfor\u001b[39;00m s \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28;43msorted\u001b[39;49m\u001b[43m(\u001b[49m\n\u001b[1;32m 971\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43msubdict\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mitems\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mkey\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mlambda\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[43mt\u001b[49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mt\u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;241;43m1\u001b[39;49m\u001b[43m]\u001b[49m\u001b[43m[\u001b[49m\u001b[43mcurrent_querry_val\u001b[49m\u001b[43m]\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mreverse\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mTrue\u001b[39;49;00m\n\u001b[1;32m 972\u001b[0m \u001b[43m\u001b[49m\u001b[43m)\u001b[49m:\n\u001b[1;32m 973\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m s[\u001b[38;5;241m1\u001b[39m][current_querry_rank] \u001b[38;5;129;01mis\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m:\n\u001b[1;32m 974\u001b[0m \u001b[38;5;28;01mbreak\u001b[39;00m\n",
"File \u001b[0;32m~/work/AMMICO/AMMICO/ammico/multimodal_search.py:971\u001b[0m, in \u001b[0;36mMultimodalSearch.show_results.<locals>.<lambda>\u001b[0;34m(t)\u001b[0m\n\u001b[1;32m 967\u001b[0m current_querry_val \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mlist\u001b[39m(query\u001b[38;5;241m.\u001b[39mvalues())[\u001b[38;5;241m0\u001b[39m]\n\u001b[1;32m 968\u001b[0m current_querry_rank \u001b[38;5;241m=\u001b[39m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mrank \u001b[39m\u001b[38;5;124m\"\u001b[39m \u001b[38;5;241m+\u001b[39m \u001b[38;5;28mlist\u001b[39m(query\u001b[38;5;241m.\u001b[39mvalues())[\u001b[38;5;241m0\u001b[39m]\n\u001b[1;32m 970\u001b[0m \u001b[38;5;28;01mfor\u001b[39;00m s \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28msorted\u001b[39m(\n\u001b[0;32m--> 971\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39msubdict\u001b[38;5;241m.\u001b[39mitems(), key\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mlambda\u001b[39;00m t: \u001b[43mt\u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;241;43m1\u001b[39;49m\u001b[43m]\u001b[49m\u001b[43m[\u001b[49m\u001b[43mcurrent_querry_val\u001b[49m\u001b[43m]\u001b[49m, reverse\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mTrue\u001b[39;00m\n\u001b[1;32m 972\u001b[0m ):\n\u001b[1;32m 973\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m s[\u001b[38;5;241m1\u001b[39m][current_querry_rank] \u001b[38;5;129;01mis\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m:\n\u001b[1;32m 974\u001b[0m \u001b[38;5;28;01mbreak\u001b[39;00m\n",
"\u001b[0;31mKeyError\u001b[0m: 'politician press conference'"
]
}
],
"source": [
"my_obj.show_results(\n",
" search_query3[0],\n",
")"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "0b750e9f-fe64-4028-9caf-52d7187462f1",
"metadata": {},
"source": [
"## Improve the search results\n",
"\n",
"For even better results, a slightly different approach has been prepared that can improve search results. It is quite resource-intensive, so it is applied after the main algorithm has found the most relevant images. This approach works only with text queries. Among the parameters you can choose 3 models: `\"blip_base\"`, `\"blip_large\"`, `\"blip2_coco\"`. If you get an `Out of Memory` error, try reducing the batch_size value (minimum = 1), which is the number of images being processed simultaneously. With the parameter `need_grad_cam = True/False` you can enable the calculation of the heat map of each image to be processed. Thus the `image_text_match_reordering` function calculates new similarity values and new ranks for each image. The resulting values are added to the general dictionary."
]
},
{
"cell_type": "code",
"execution_count": 20,
"id": "b3af7b39-6d0d-4da3-9b8f-7dfd3f5779be",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:57.563047Z",
"iopub.status.busy": "2023-10-18T21:36:57.562669Z",
"iopub.status.idle": "2023-10-18T21:36:57.566910Z",
"shell.execute_reply": "2023-10-18T21:36:57.566315Z"
},
"tags": []
},
"outputs": [],
"source": [
"itm_model = \"blip_base\"\n",
"# itm_model = \"blip_large\"\n",
"# itm_model = \"blip2_coco\""
]
},
{
"cell_type": "code",
"execution_count": 21,
"id": "caf1f4ae-4b37-4954-800e-7120f0419de5",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:57.569655Z",
"iopub.status.busy": "2023-10-18T21:36:57.569237Z",
"iopub.status.idle": "2023-10-18T21:36:57.602559Z",
"shell.execute_reply": "2023-10-18T21:36:57.601944Z"
},
"tags": []
},
"outputs": [
{
"ename": "NameError",
"evalue": "name 'image_keys' is not defined",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mNameError\u001b[0m Traceback (most recent call last)",
"Cell \u001b[0;32mIn[21], line 4\u001b[0m\n\u001b[1;32m 1\u001b[0m itm_scores, image_gradcam_with_itm \u001b[38;5;241m=\u001b[39m my_obj\u001b[38;5;241m.\u001b[39mimage_text_match_reordering(\n\u001b[1;32m 2\u001b[0m search_query3,\n\u001b[1;32m 3\u001b[0m itm_model,\n\u001b[0;32m----> 4\u001b[0m \u001b[43mimage_keys\u001b[49m,\n\u001b[1;32m 5\u001b[0m sorted_lists,\n\u001b[1;32m 6\u001b[0m batch_size\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m1\u001b[39m,\n\u001b[1;32m 7\u001b[0m need_grad_cam\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mTrue\u001b[39;00m,\n\u001b[1;32m 8\u001b[0m )\n",
"\u001b[0;31mNameError\u001b[0m: name 'image_keys' is not defined"
]
}
],
"source": [
"itm_scores, image_gradcam_with_itm = my_obj.image_text_match_reordering(\n",
" search_query3,\n",
" itm_model,\n",
" image_keys,\n",
" sorted_lists,\n",
" batch_size=1,\n",
" need_grad_cam=True,\n",
")"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "9e98c150-5fab-4251-bce7-0d8fc7b385b9",
"metadata": {},
"source": [
"Then using the same output function you can add the `ITM=True` arguments to output the new image order. You can also add the `image_gradcam_with_itm` argument to output the heat maps of the calculated images. "
]
},
{
"cell_type": "code",
"execution_count": 22,
"id": "6a829b99-5230-463a-8b11-30ffbb67fc3a",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:57.605638Z",
"iopub.status.busy": "2023-10-18T21:36:57.605195Z",
"iopub.status.idle": "2023-10-18T21:36:57.633844Z",
"shell.execute_reply": "2023-10-18T21:36:57.633261Z"
},
"tags": []
},
"outputs": [
{
"ename": "NameError",
"evalue": "name 'image_gradcam_with_itm' is not defined",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mNameError\u001b[0m Traceback (most recent call last)",
"Cell \u001b[0;32mIn[22], line 2\u001b[0m\n\u001b[1;32m 1\u001b[0m my_obj\u001b[38;5;241m.\u001b[39mshow_results(\n\u001b[0;32m----> 2\u001b[0m search_query3[\u001b[38;5;241m0\u001b[39m], itm\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mTrue\u001b[39;00m, image_gradcam_with_itm\u001b[38;5;241m=\u001b[39m\u001b[43mimage_gradcam_with_itm\u001b[49m\n\u001b[1;32m 3\u001b[0m )\n",
"\u001b[0;31mNameError\u001b[0m: name 'image_gradcam_with_itm' is not defined"
]
}
],
"source": [
"my_obj.show_results(\n",
" search_query3[0], itm=True, image_gradcam_with_itm=image_gradcam_with_itm\n",
")"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "d86ab96b-1907-4b7f-a78e-3983b516d781",
"metadata": {
"tags": []
},
"source": [
"## Save search results to csv"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "4bdbc4d4-695d-4751-ab7c-d2d98e2917d7",
"metadata": {
"tags": []
},
"source": [
"Convert the dictionary of dictionarys into a dictionary with lists:"
]
},
{
"cell_type": "code",
"execution_count": 23,
"id": "6c6ddd83-bc87-48f2-a8d6-1bd3f4201ff7",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:57.636852Z",
"iopub.status.busy": "2023-10-18T21:36:57.636497Z",
"iopub.status.idle": "2023-10-18T21:36:57.642086Z",
"shell.execute_reply": "2023-10-18T21:36:57.641517Z"
},
"tags": []
},
"outputs": [],
"source": [
"outdict = mutils.append_data_to_dict(mydict)\n",
"df = mutils.dump_df(outdict)"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "ea2675d5-604c-45e7-86d2-080b1f4559a0",
"metadata": {
"tags": []
},
"source": [
"Check the dataframe:"
]
},
{
"cell_type": "code",
"execution_count": 24,
"id": "e78646d6-80be-4d3e-8123-3360957bcaa8",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:57.644823Z",
"iopub.status.busy": "2023-10-18T21:36:57.644464Z",
"iopub.status.idle": "2023-10-18T21:36:57.657086Z",
"shell.execute_reply": "2023-10-18T21:36:57.655961Z"
},
"tags": []
},
"outputs": [
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>filename</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>0</th>\n",
" <td>102141_2_eng</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1</th>\n",
" <td>102730_eng</td>\n",
" </tr>\n",
" <tr>\n",
" <th>2</th>\n",
" <td>106349S_por</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"</div>"
],
"text/plain": [
" filename\n",
"0 102141_2_eng\n",
"1 102730_eng\n",
"2 106349S_por"
]
},
"execution_count": 24,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"df.head(10)"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "05546d99-afab-4565-8f30-f14e1426abcf",
"metadata": {},
"source": [
"Write the csv file:"
]
},
{
"cell_type": "code",
"execution_count": 25,
"id": "185f7dde-20dc-44d8-9ab0-de41f9b5734d",
"metadata": {
"execution": {
"iopub.execute_input": "2023-10-18T21:36:57.659861Z",
"iopub.status.busy": "2023-10-18T21:36:57.659434Z",
"iopub.status.idle": "2023-10-18T21:36:57.665171Z",
"shell.execute_reply": "2023-10-18T21:36:57.664594Z"
},
"tags": []
},
"outputs": [],
"source": [
"df.to_csv(\"data/data_out.csv\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "b6a79201-7c17-496c-a6a1-b8ecfd3dd1e8",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.18"
}
},
"nbformat": 4,
"nbformat_minor": 5
}