From 6f75ee7f51eeac0f93568a2856637f86a1472fb1 Mon Sep 17 00:00:00 2001
From: iulusoy
Date: Fri, 19 Sep 2025 08:35:52 +0000
Subject: [PATCH] Deploying to gh-pages from @ ssciwr/AMMICO@f0cd69286a939beec81d8ac02028c520fc081ea1 🚀

---
 build/doctrees/environment.pickle             | Bin 90221 -> 90221 bytes
 .../notebooks/DemoNotebook_ammico.doctree     | Bin 187431 -> 187431 bytes
 build/html/notebooks/DemoNotebook_ammico.html |   6 +++---
 3 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/build/doctrees/environment.pickle b/build/doctrees/environment.pickle
index e513cc8afef96dfd4be5ec16fdfdb87b91774e17..93d60140250cb6356c94195d7007de40a9499a68 100644
GIT binary patch (delta omitted)

diff --git a/build/doctrees/notebooks/DemoNotebook_ammico.doctree b/build/doctrees/notebooks/DemoNotebook_ammico.doctree
index 5ce59353702f8a68ee29c984b094055b6cc4d99a..b1007a2d32efc456344c2910da6db63865d08b2c 100644
GIT binary patch (delta omitted)

diff --git a/build/html/notebooks/DemoNotebook_ammico.html b/build/html/notebooks/DemoNotebook_ammico.html

The detector modules

The different detector modules with their options are explained in more detail in this section.

Text detector

Text on the images can be extracted using the TextDetector class (text module). The text is initially extracted using the Google Cloud Vision API and then translated into English with googletrans. The translated text is cleaned of whitespace, line breaks, and numbers using Python string operations and spaCy.
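A minimal usage sketch of the typical call pattern (ammico.find_files, the package-level TextDetector export, and the analyse_image method follow the demo notebook's pattern and should be treated as assumptions here; the credentials setup is described below):

    import ammico

    # Collect the images to analyze into a nested dictionary, one entry per image.
    image_dict = ammico.find_files(path="data/")

    # Extract (Google Cloud Vision), translate, and clean the text for each image.
    for key in image_dict.keys():
        image_dict[key] = ammico.TextDetector(image_dict[key]).analyse_image()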


The user can set whether the text should be further summarized and analyzed for sentiment and named entity recognition by setting the keyword analyse_text to True (the default is False). If set, the transformers pipeline is used for each of these tasks, with the default models as of 03/2023. Other models can be selected by setting the optional keyword model_names to a list of selected models, one for each task: model_names=["sshleifer/distilbart-cnn-12-6", "distilbert-base-uncased-finetuned-sst-2-english", "dbmdz/bert-large-cased-finetuned-conll03-english"] for summary, sentiment, and NER. To pin the models down even further, revision numbers can also be selected by setting the optional keyword revision_numbers to a list of revision numbers, one for each model, for example revision_numbers=["a4f8f3e", "af0f99b", "f2482bf"].
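Continuing the sketch above, the optional keywords from this paragraph could be passed like this (the model names and revision numbers are copied verbatim from the example; the call pattern itself remains an assumption):

    # Analyze the extracted text with pinned models for summary,
    # sentiment, and NER (in that order).
    for key in image_dict.keys():
        image_dict[key] = ammico.TextDetector(
            image_dict[key],
            analyse_text=True,
            model_names=[
                "sshleifer/distilbart-cnn-12-6",
                "distilbert-base-uncased-finetuned-sst-2-english",
                "dbmdz/bert-large-cased-finetuned-conll03-english",
            ],
            revision_numbers=["a4f8f3e", "af0f99b", "f2482bf"],
        ).analyse_image()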

Please note that for the Google Cloud Vision API (the TextDetector class) you need to set an API key in order to process the images. This key is ideally set as an environment variable, for example as sketched below.
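Google Cloud client libraries conventionally read the service account credentials from the GOOGLE_APPLICATION_CREDENTIALS environment variable; a minimal sketch (the key file path is a placeholder):

    import os

    # Placeholder path; point this at your own service account JSON key file.
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account-key.json"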

@@ -661,7 +661,7 @@ image_df.to_csv("/content/drive/MyDrive/misinformation-data/data_out.csv")

Image summary and query

The SummaryDetector can be used to generate image captions (summary) as well as to perform visual question answering (VQA).


This module is based on the LAVIS library. Since the models can be quite large, an initial detector object is created that loads the necessary models into RAM/VRAM and then uses them in the analysis. The user can specify the type of analysis to be performed using the analysis_type keyword: setting it to summary will generate a caption (summary), questions will prepare answers (VQA) to a list of questions as set by the user, and summary_and_questions will do both. Note that the desired analysis type needs to be set in the initialization of the detector object, not when running the analysis for each image; the same holds true for the selected model.
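A hedged sketch of this two-step pattern; apart from the analysis_type keyword, the constructor and method signatures are assumptions, so consult the demo notebook for the authoritative calls:

    import ammico

    image_dict = ammico.find_files(path="data/")

    # Create the detector once so the LAVIS models are loaded into
    # RAM/VRAM a single time; the analysis type is fixed at this point.
    summary_detector = ammico.SummaryDetector(image_dict, analysis_type="summary")

    # Reuse the already-loaded models for every image.
    for key in image_dict.keys():
        image_dict[key] = summary_detector.analyse_image(image_dict[key])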

The implemented models are listed below.

@@ -951,7 +951,7 @@ image_df.to_csv("/content/drive/MyDrive/misinformation-data/data_out.csv")

Detection of faces and facial expression analysis

Faces and facial expressions are detected and analyzed using the EmotionDetector class from the faces module. First, RetinaFace is used to detect whether faces are present on the image; the detected faces are then checked for whether face masks are worn (Face-Mask-Detection). The probabilistic detection of age, gender, race, and emotions is carried out with deepface, but only if the disclosure statement has been accepted (see above).
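A minimal sketch of running the face analysis per image, assuming the same image_dict / analyse_image pattern as above and that the disclosure statement has already been accepted:

    import ammico

    image_dict = ammico.find_files(path="data/")

    # Detect faces and masks, and (if disclosed) age, gender, race, and emotion.
    for key in image_dict.keys():
        image_dict[key] = ammico.EmotionDetector(image_dict[key]).analyse_image()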


Depending on the features found on the image, the face detection module returns different analysis content. If no faces are found on the image, all further steps are skipped and the result "face": "No", "multiple_faces": "No", "no_faces": 0, "wears_mask": ["No"], "age": [None], "gender": [None], "race": [None], "emotion": [None], "emotion (category)": [None] is returned. If one or several faces are found, up to three faces are analyzed; each is first checked for whether it is partially concealed by a face mask. If it is, only age and gender are detected; if it is not, race, emotion, and dominant emotion are also detected. In the latter case, the output could look like this: "face": "Yes", "multiple_faces": "Yes", "no_faces": 2, "wears_mask": ["No", "No"], "age": [27, 28], "gender": ["Man", "Man"], "race": ["asian", None], "emotion": ["angry", "neutral"], "emotion (category)": ["Negative", "Neutral"]. For the two detected faces (their number is given by no_faces), the values are returned as lists, with the first item belonging to the first (largest) face and the second item to the second (smaller) face; for example, "emotion" returns ["angry", "neutral"], meaning the first face expresses anger and the second face has a neutral expression.
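A short sketch of reading the per-face lists out of such a result; the dictionary below simply restates the example output above:

    # Example result for an image with two unmasked faces, as above.
    result = {
        "face": "Yes",
        "multiple_faces": "Yes",
        "no_faces": 2,
        "wears_mask": ["No", "No"],
        "age": [27, 28],
        "gender": ["Man", "Man"],
        "race": ["asian", None],
        "emotion": ["angry", "neutral"],
        "emotion (category)": ["Negative", "Neutral"],
    }

    # The i-th entry of each list belongs to the i-th detected face,
    # ordered from the largest to the smallest face.
    for i in range(result["no_faces"]):
        print(f"face {i + 1}: age={result['age'][i]}, gender={result['gender'][i]}, "
              f"emotion={result['emotion'][i]} ({result['emotion (category)'][i]})")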