From 985204502d17b5b81c1d283aa6c1287315fcd2ca Mon Sep 17 00:00:00 2001
From: iulusoy
Date: Thu, 9 Oct 2025 07:29:16 +0000
Subject: [PATCH] Deploying to gh-pages from @ ssciwr/AMMICO@3f9e855aebddf6eddaa81de1cd883bc5bcf5d3bc 🚀

 build/doctrees/environment.pickle             | Bin 90221 -> 90221 bytes
 .../notebooks/DemoNotebook_ammico.doctree     | Bin 187431 -> 187431 bytes
 build/html/notebooks/DemoNotebook_ammico.html | 6 +++---
 3 files changed, 3 insertions(+), 3 deletions(-)

# The detector modules

The different detector modules with their options are explained in more detail in this section.

## Text detector

Text on the images can be extracted using the TextDetector class (text module). The text is initially extracted using the Google Cloud Vision API and then translated into English with googletrans. The translated text is cleaned of whitespace, linebreaks, and numbers using Python syntax and spaCy.
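The cleaning step can be sketched in plain Python; the regex version below is only an illustration of what "cleaned of whitespace, linebreaks, and numbers" means, not AMMICO's actual implementation (which additionally relies on spaCy):

```python
import re

def clean_text(text: str) -> str:
    """Illustrative cleaning sketch: drop numbers, then collapse
    linebreaks and repeated whitespace into single spaces."""
    text = re.sub(r"\d+", "", text)   # remove numbers
    text = re.sub(r"\s+", " ", text)  # collapse whitespace and linebreaks
    return text.strip()
```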


The user can choose whether the text should additionally be summarized, and analyzed for sentiment and named entity recognition, by setting the keyword `analyse_text` to `True` (the default is `False`). If set, the transformers pipeline is used for each of these tasks, with the default models as of 03/2023. Other models can be selected by setting the optional keyword `model_names` to a list of models, one for each task: `model_names=["sshleifer/distilbart-cnn-12-6", "distilbert-base-uncased-finetuned-sst-2-english", "dbmdz/bert-large-cased-finetuned-conll03-english"]` for summary, sentiment, and NER. To be even more specific, revision numbers can also be selected by setting the optional keyword `revision_numbers` to a list of revision numbers, one for each model, for example `revision_numbers=["a4f8f3e", "af0f99b", "f2482bf"]`.
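Since the two keyword lists line up positionally with the three tasks, pairing them explicitly helps avoid mix-ups. The helper below is hypothetical and not part of AMMICO; it only illustrates how the lists correspond:

```python
def pair_task_models(model_names, revision_numbers=None):
    """Hypothetical helper: pair the positional model_names /
    revision_numbers lists with the three text-analysis tasks."""
    tasks = ["summary", "sentiment", "ner"]
    revisions = revision_numbers or [None] * len(tasks)
    return {task: {"model": model, "revision": rev}
            for task, model, rev in zip(tasks, model_names, revisions)}

config = pair_task_models(
    ["sshleifer/distilbart-cnn-12-6",
     "distilbert-base-uncased-finetuned-sst-2-english",
     "dbmdz/bert-large-cased-finetuned-conll03-english"],
    ["a4f8f3e", "af0f99b", "f2482bf"],
)
```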

Please note that for the Google Cloud Vision API (the TextDetector class) you need to set an API key in order to process the images. This key is ideally set as an environment variable.
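A minimal sketch of setting the key from Python, using the standard `GOOGLE_APPLICATION_CREDENTIALS` variable that the Google Cloud client libraries read; the path is a placeholder for your own service-account key file:

```python
import os

# Point the Google Cloud client libraries at a service-account key file.
# The path is a placeholder; substitute the location of your own key.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/your-service-account-key.json"
```

Setting the variable in the shell before launching Python (e.g. via `export`) works equally well and avoids hard-coding the path in a notebook.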


## Image summary and query

The SummaryDetector can be used to generate image captions (summary) as well as visual question answering (VQA).


This module is based on the LAVIS library. Since the models can be quite large, an initial object is created that loads the necessary models into RAM/VRAM and then uses them in the analysis. The user can specify the type of analysis to be performed using the `analysis_type` keyword. Setting it to `summary` will generate a caption (summary), `questions` will prepare answers (VQA) to a list of questions as set by the user, and `summary_and_questions` will do both. Note that the desired analysis type needs to be set here, in the initialization of the detector object, and not when running the analysis for each image; the same holds true for the selected model.
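The effect of the `analysis_type` keyword can be summarized as a simple dispatch. The helper below is only an illustration of the behavior described above, not AMMICO internals:

```python
def steps_for(analysis_type: str) -> list:
    """Illustrative dispatch (not AMMICO code): which steps the
    summary detector runs for a given analysis_type value."""
    mapping = {
        "summary": ["caption"],
        "questions": ["vqa"],
        "summary_and_questions": ["caption", "vqa"],
    }
    if analysis_type not in mapping:
        raise ValueError(f"unknown analysis_type: {analysis_type!r}")
    return mapping[analysis_type]
```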

The implemented models are listed below.


## Detection of faces and facial expression analysis

Faces and facial expressions are detected and analyzed using the EmotionDetector class from the faces module. First, it is determined whether faces are present in the image using RetinaFace, followed by an analysis of whether face masks are worn (Face-Mask-Detection). The probabilistic detection of age, gender, race, and emotions is carried out with deepface, but only if the disclosure statement has been accepted (see above).
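The branching described above and in the next paragraph can be sketched as follows; the function name and signature are assumptions made for illustration, not AMMICO code:

```python
def features_to_detect(faces_found: bool, wears_mask: bool,
                       accepted_disclosure: bool) -> list:
    """Illustrative sketch of the decision logic: which probabilistic
    features are analyzed for a detected face."""
    if not faces_found:
        return []                       # all further steps are skipped
    if not accepted_disclosure:
        return []                       # deepface analysis requires consent
    if wears_mask:
        return ["age", "gender"]        # partially concealed face
    return ["age", "gender", "race", "emotion"]
```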


Depending on the features found in the image, the face detection module returns different analysis content. If no faces are found, all further steps are skipped and the result `"face": "No", "multiple_faces": "No", "no_faces": 0, "wears_mask": ["No"], "age": [None], "gender": [None], "race": [None], "emotion": [None], "emotion (category)": [None]` is returned. If one or several faces are found, up to three faces are analyzed, and it is checked whether they are partially concealed by a face mask. If they are, only age and gender are detected; if not, race, emotion, and dominant emotion are detected as well. In the latter case, the output could look like this: `"face": "Yes", "multiple_faces": "Yes", "no_faces": 2, "wears_mask": ["No", "No"], "age": [27, 28], "gender": ["Man", "Man"], "race": ["asian", None], "emotion": ["angry", "neutral"], "emotion (category)": ["Negative", "Neutral"]`. Here, for the two detected faces (given by `no_faces`), the per-face values are returned as lists, with the first item for the first (largest) face and the second item for the second (smaller) face; for example, `"emotion"` returns `["angry", "neutral"]`, signifying that the first face expresses anger and the second face has a neutral expression.
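Because the list-valued entries are aligned per face, they can be regrouped into one record per detected face. The helper below is a hypothetical convenience function, not part of AMMICO, applied to the example output from above:

```python
def per_face_records(result: dict) -> list:
    """Hypothetical helper: regroup the list-valued output of the face
    detection module into one dict per detected face (at most three
    faces are analyzed, matching the module's behavior)."""
    keys = ["wears_mask", "age", "gender", "race",
            "emotion", "emotion (category)"]
    n = min(result["no_faces"], 3)
    return [{k: result[k][i] for k in keys} for i in range(n)]

example = {
    "face": "Yes", "multiple_faces": "Yes", "no_faces": 2,
    "wears_mask": ["No", "No"], "age": [27, 28],
    "gender": ["Man", "Man"], "race": ["asian", None],
    "emotion": ["angry", "neutral"],
    "emotion (category)": ["Negative", "Neutral"],
}
faces = per_face_records(example)
```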