09wk-2: Identifying Model Inputs, Practicing Model Usage

Author

최규빈

Published

November 9, 2024

1. Lecture Video

!pip uninstall mp2024pkg -y 
!pip install git+https://github.com/guebin/mp2024pkg.git
Found existing installation: mp2024pkg 1.0
Uninstalling mp2024pkg-1.0:
  Successfully uninstalled mp2024pkg-1.0
Collecting git+https://github.com/guebin/mp2024pkg.git
  Cloning https://github.com/guebin/mp2024pkg.git to /tmp/pip-req-build-178x45er
  Running command git clone --filter=blob:none --quiet https://github.com/guebin/mp2024pkg.git /tmp/pip-req-build-178x45er
  Resolved https://github.com/guebin/mp2024pkg.git to commit 2adb5fc04e589895edd76ec5da87aa37bf0dd70b
  Preparing metadata (setup.py) ... done
Building wheels for collected packages: mp2024pkg
  Building wheel for mp2024pkg (setup.py) ... done
  Created wheel for mp2024pkg: filename=mp2024pkg-1.0-py3-none-any.whl size=4588 sha256=84356f47d6566dcda64e0d93070d9a1d9b6d9c9f962769858a4d823b589fe40b
  Stored in directory: /tmp/pip-ephem-wheel-cache-n0_tsfan/wheels/83/81/b1/63c03ca869bb1a35a588dce54e83ea32c6c21e6af0bbeaa640
Successfully built mp2024pkg
Installing collected packages: mp2024pkg
Successfully installed mp2024pkg-1.0

2. Imports

import transformers
import datasets
import huggingface_hub
import torch
import torchvision
import pytorchvideo.data
import PIL
import tarfile
import mp2024pkg as mp

3. Identifying Model Inputs

- Try one of the following approaches, in order:

  • Method 1: check the signature in model.forward?
  • Method 2: check the usage examples in model.forward?
  • Method 3: consult external resources (official docs, official tutorials, reliable blogs, ChatGPT, etc.)
  • Method 4: read the full source via model.forward?? <— don't do this

- There is no foolproof way to determine exactly how a model's inputs must be formatted.

  • Methods 1–3 depend on the goodwill of others.
  • Method 4 is practically infeasible.
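
Method 1 boils down to reading the signature. Outside of IPython's `?` magic, the standard-library inspect module surfaces the same information programmatically; a minimal sketch with a toy class (ToyClassifier is made up for illustration, not a real transformers class):

```python
import inspect

class ToyClassifier:
    """A stand-in with a transformers-style forward signature."""
    def forward(self, input_ids=None, attention_mask=None, labels=None):
        return {"logits": None}

# inspect.signature lists the accepted parameters, which is
# the core of what `model.forward?` displays
sig = inspect.signature(ToyClassifier.forward)
param_names = list(sig.parameters)
print(param_names)  # ['self', 'input_ids', 'attention_mask', 'labels']
```

The same call works on a real model's bound method, e.g. inspect.signature(model.forward).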

# Example 1 – Text Classification

model1 = transformers.AutoModelForSequenceClassification.from_pretrained(
    "distilbert/distilbert-base-uncased", num_labels=2
)
Some weights of DistilBertForSequenceClassification were not initialized from the model checkpoint at distilbert/distilbert-base-uncased and are newly initialized: ['classifier.bias', 'classifier.weight', 'pre_classifier.bias', 'pre_classifier.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.

Basic model information (config)

model1.config
DistilBertConfig {
  "_attn_implementation_autoset": true,
  "_name_or_path": "distilbert/distilbert-base-uncased",
  "activation": "gelu",
  "architectures": [
    "DistilBertForMaskedLM"
  ],
  "attention_dropout": 0.1,
  "dim": 768,
  "dropout": 0.1,
  "hidden_dim": 3072,
  "initializer_range": 0.02,
  "max_position_embeddings": 512,
  "model_type": "distilbert",
  "n_heads": 12,
  "n_layers": 6,
  "pad_token_id": 0,
  "qa_dropout": 0.1,
  "seq_classif_dropout": 0.2,
  "sinusoidal_pos_embds": false,
  "tie_weights_": true,
  "transformers_version": "4.46.2",
  "vocab_size": 30522
}
  • max_position_embeddings: 512

Identifying the model's inputs

# model1.forward? 
# -- output omitted: too long

Usage example 1 – arguments passed directly, with loss

model1(
    input_ids = torch.tensor([[1,2,3,4], [2,3,4,5]]),
    labels = torch.tensor([0,0])
)
SequenceClassifierOutput(loss=tensor(0.7676, grad_fn=<NllLossBackward0>), logits=tensor([[-0.0565,  0.0915],
        [-0.0557,  0.0837]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)

Usage example 2 – **dictionary unpacking, with loss

model1_input = dict(
    input_ids = torch.tensor([[1,2,3,4], [2,3,4,5]]),
    labels = torch.tensor([0,0])
)
model1(**model1_input)
SequenceClassifierOutput(loss=tensor(0.7676, grad_fn=<NllLossBackward0>), logits=tensor([[-0.0565,  0.0915],
        [-0.0557,  0.0837]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)
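
The `**model1_input` call is plain Python dictionary unpacking, nothing transformers-specific; a minimal sketch with a hypothetical forward function:

```python
def forward(input_ids=None, labels=None):
    # echo back what was received, to show both call styles are identical
    return {"input_ids": input_ids, "labels": labels}

kwargs = {"input_ids": [[1, 2, 3, 4], [2, 3, 4, 5]], "labels": [0, 0]}

# ** unpacks each dict key as a keyword argument
out1 = forward(**kwargs)
out2 = forward(input_ids=[[1, 2, 3, 4], [2, 3, 4, 5]], labels=[0, 0])
assert out1 == out2
```

This is why a batch coming out of a data pipeline as a dict can be fed straight to the model with `model(**batch)`.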

Usage example 3 – arguments passed directly, without loss

model1(
    input_ids = torch.tensor([[1,2,3,4], [2,3,4,5]])
)
SequenceClassifierOutput(loss=None, logits=tensor([[-0.0565,  0.0915],
        [-0.0557,  0.0837]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)

Usage example 4 – **dictionary unpacking, without loss

model1_input = dict(
    input_ids = torch.tensor([[1,2,3,4], [2,3,4,5]])
)
model1(**model1_input)
SequenceClassifierOutput(loss=None, logits=tensor([[-0.0565,  0.0915],
        [-0.0557,  0.0837]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)

Usage example 5 – bare minimum, without loss

model1(torch.tensor([[1,2,3,4], [2,3,4,5]]))
SequenceClassifierOutput(loss=None, logits=tensor([[-0.0565,  0.0915],
        [-0.0557,  0.0837]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)

In usage examples 1–5, model1.forward() can be used in place of model1().
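
The two call styles agree because torch.nn.Module.__call__ dispatches to forward (after running any registered hooks); a minimal sketch with a toy module:

```python
import torch

class Doubler(torch.nn.Module):
    """Trivial module whose forward just doubles its input."""
    def forward(self, x):
        return 2 * x

m = Doubler()
x = torch.tensor([1.0, 2.0])

# __call__ routes through forward, so both give the same result
assert torch.equal(m(x), m.forward(x))
```

In practice `model(...)` is preferred, since calling `forward` directly skips the hook machinery.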

#

# Example 2 – Image Classification

model2 = transformers.AutoModelForImageClassification.from_pretrained(
    "google/vit-base-patch16-224-in21k",
    num_labels=3 # arbitrarily set to 3; no particular reason
)
Some weights of ViTForImageClassification were not initialized from the model checkpoint at google/vit-base-patch16-224-in21k and are newly initialized: ['classifier.bias', 'classifier.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.

Basic model information (config)

model2.config
ViTConfig {
  "_attn_implementation_autoset": true,
  "_name_or_path": "google/vit-base-patch16-224-in21k",
  "architectures": [
    "ViTModel"
  ],
  "attention_probs_dropout_prob": 0.0,
  "encoder_stride": 16,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.0,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0",
    "1": "LABEL_1",
    "2": "LABEL_2"
  },
  "image_size": 224,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "LABEL_0": 0,
    "LABEL_1": 1,
    "LABEL_2": 2
  },
  "layer_norm_eps": 1e-12,
  "model_type": "vit",
  "num_attention_heads": 12,
  "num_channels": 3,
  "num_hidden_layers": 12,
  "patch_size": 16,
  "qkv_bias": true,
  "transformers_version": "4.46.2"
}
  • image_size: 224
# mp.tab(model2)
# -- output omitted: too long 
model2.config.num_channels
3
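
Together, num_channels and image_size pin down the shape the model expects for pixel_values, namely (batch, num_channels, image_size, image_size). A minimal sketch (the helper function is made up for illustration; the config values are the ones read off model2.config above):

```python
# values copied from model2.config above
config = {"num_channels": 3, "image_size": 224}

def expected_pixel_values_shape(batch_size, cfg):
    """Shape a ViT-style classifier expects: (B, C, H, W)."""
    return (batch_size, cfg["num_channels"], cfg["image_size"], cfg["image_size"])

print(expected_pixel_values_shape(2, config))  # (2, 3, 224, 224)
```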

Identifying the model's inputs

# model2.forward?
# -- output omitted: too long
# torch.randn(2,3,64,64)
# -- output omitted: a random tensor of shape (2, 3, 64, 64)

Usage example 1 – arguments passed directly, with loss

torch.random.manual_seed(42)
model2(
    pixel_values = torch.randn(2,3,224,224),
    labels = torch.tensor([0,1])
)
ImageClassifierOutput(loss=tensor(1.1627, grad_fn=<NllLossBackward0>), logits=tensor([[-0.1805,  0.0750,  0.0782],
        [-0.1494,  0.0548,  0.0913]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)

Usage example 2 – **dictionary unpacking, with loss

torch.random.manual_seed(42)
model2_input = dict(
    pixel_values = torch.randn(2,3,224,224),
    labels = torch.tensor([0,1])
)
model2(**model2_input)
ImageClassifierOutput(loss=tensor(1.1627, grad_fn=<NllLossBackward0>), logits=tensor([[-0.1805,  0.0750,  0.0782],
        [-0.1494,  0.0548,  0.0913]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)

Usage example 3 – arguments passed directly, without loss

torch.random.manual_seed(42)
model2(
    pixel_values = torch.randn(2,3,224,224)
)
ImageClassifierOutput(loss=None, logits=tensor([[-0.1805,  0.0750,  0.0782],
        [-0.1494,  0.0548,  0.0913]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)

Usage example 4 – **dictionary unpacking, without loss

torch.random.manual_seed(42)
model2_input = dict(
    pixel_values = torch.randn(2,3,224,224),
)
model2(**model2_input)
ImageClassifierOutput(loss=None, logits=tensor([[-0.1805,  0.0750,  0.0782],
        [-0.1494,  0.0548,  0.0913]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)

Usage example 5 – bare minimum, without loss

torch.random.manual_seed(42)
model2(torch.randn(2,3,224,224))
ImageClassifierOutput(loss=None, logits=tensor([[-0.1805,  0.0750,  0.0782],
        [-0.1494,  0.0548,  0.0913]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)
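
Why 224×224? In the standard ViT design, the image is cut into patch_size×patch_size patches and one token is produced per patch, plus one [CLS] token, so with the config above the sequence length is (224/16)² + 1. A sketch of the arithmetic (config values from model2.config):

```python
image_size, patch_size = 224, 16

# each spatial axis is split into image_size // patch_size patches
patches_per_axis = image_size // patch_size   # 14
num_patches = patches_per_axis ** 2           # 196
sequence_length = num_patches + 1             # +1 for the [CLS] token
print(num_patches, sequence_length)  # 196 197
```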

#

# Example 3 – Video Classification

model3 = transformers.VideoMAEForVideoClassification.from_pretrained(
    "MCG-NJU/videomae-base",
)
Some weights of VideoMAEForVideoClassification were not initialized from the model checkpoint at MCG-NJU/videomae-base and are newly initialized: ['classifier.bias', 'classifier.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.

Basic model information (config)

model3.config
VideoMAEConfig {
  "_attn_implementation_autoset": true,
  "_name_or_path": "MCG-NJU/videomae-base",
  "architectures": [
    "VideoMAEForPreTraining"
  ],
  "attention_probs_dropout_prob": 0.0,
  "decoder_hidden_size": 384,
  "decoder_intermediate_size": 1536,
  "decoder_num_attention_heads": 6,
  "decoder_num_hidden_layers": 4,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.0,
  "hidden_size": 768,
  "image_size": 224,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-12,
  "model_type": "videomae",
  "norm_pix_loss": true,
  "num_attention_heads": 12,
  "num_channels": 3,
  "num_frames": 16,
  "num_hidden_layers": 12,
  "patch_size": 16,
  "qkv_bias": true,
  "torch_dtype": "float32",
  "transformers_version": "4.46.2",
  "tubelet_size": 2,
  "use_mean_pooling": false
}
  • image_size: 224
  • num_frames: 16
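
In the VideoMAE design, frames are first grouped into tubelets of tubelet_size frames and each tubelet is patchified spatially, so (per the config above) the token count works out to (16/2) × (224/16)². A sketch of the arithmetic under that assumption:

```python
num_frames, tubelet_size = 16, 2
image_size, patch_size = 224, 16

temporal_tokens = num_frames // tubelet_size       # 8 tubelets
spatial_tokens = (image_size // patch_size) ** 2   # 196 patches per tubelet
num_tokens = temporal_tokens * spatial_tokens
print(num_tokens)  # 1568
```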

Identifying the model's inputs

# model3.forward?
# -- output omitted: too long

Usage example 1 – arguments passed directly, with loss

torch.random.manual_seed(42)
model3(
    pixel_values = torch.randn(4,16,3,224,224),
    labels = torch.tensor([0,1,0,1])
)
ImageClassifierOutput(loss=tensor(0.7254, grad_fn=<NllLossBackward0>), logits=tensor([[-0.7084, -0.2919],
        [-0.5846, -0.1854],
        [-0.6635, -0.2402],
        [-0.7193, -0.3811]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)

Usage example 2 – **dictionary unpacking, with loss

torch.random.manual_seed(42)
model3_input = dict(
    pixel_values = torch.randn(4,16,3,224,224),
    labels = torch.tensor([0,1,0,1])
)
model3(**model3_input)
ImageClassifierOutput(loss=tensor(0.7254, grad_fn=<NllLossBackward0>), logits=tensor([[-0.7084, -0.2919],
        [-0.5846, -0.1854],
        [-0.6635, -0.2402],
        [-0.7193, -0.3811]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)

Usage example 3 – arguments passed directly, without loss

torch.random.manual_seed(42)
model3(
    pixel_values = torch.randn(4,16,3,224,224)
)
ImageClassifierOutput(loss=None, logits=tensor([[-0.7084, -0.2919],
        [-0.5846, -0.1854],
        [-0.6635, -0.2402],
        [-0.7193, -0.3811]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)

Usage example 4 – **dictionary unpacking, without loss

torch.random.manual_seed(42)
model3_input = dict(
    pixel_values = torch.randn(4,16,3,224,224),
)
model3(**model3_input)
ImageClassifierOutput(loss=None, logits=tensor([[-0.7084, -0.2919],
        [-0.5846, -0.1854],
        [-0.6635, -0.2402],
        [-0.7193, -0.3811]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)

Usage example 5 – bare minimum, without loss

torch.random.manual_seed(42)
model3(torch.randn(4,16,3,224,224))
ImageClassifierOutput(loss=None, logits=tensor([[-0.7084, -0.2919],
        [-0.5846, -0.1854],
        [-0.6635, -0.2402],
        [-0.7193, -0.3811]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)

#

4. Practicing Model Usage

A. Text

model1 = transformers.AutoModelForSequenceClassification.from_pretrained(
    "distilbert/distilbert-base-uncased", num_labels=2
)
tokenizer = transformers.AutoTokenizer.from_pretrained("distilbert/distilbert-base-uncased")
Some weights of DistilBertForSequenceClassification were not initialized from the model checkpoint at distilbert/distilbert-base-uncased and are newly initialized: ['classifier.bias', 'classifier.weight', 'pre_classifier.bias', 'pre_classifier.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.

# Example 1 – imdb

imdb = datasets.load_dataset('imdb')
d = imdb['train'].select(range(3))
d
Dataset({
    features: ['text', 'label'],
    num_rows: 3
})

(Solution 1)

Failure

model1.forward(torch.tensor(tokenizer(d['text'])['input_ids']))
ValueError: expected sequence of length 363 at dim 1 (got 304)
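
The error comes from torch.tensor, not from the model: the three reviews tokenize to different lengths, and a ragged nested list cannot be packed into a rectangular tensor. A minimal reproduction with made-up token ids:

```python
import torch

# rows of unequal length, like unpadded input_ids
ragged = [[1, 2, 3, 4], [1, 2]]

try:
    torch.tensor(ragged)
    raised = False
except ValueError:
    # same "expected sequence of length ..." complaint as above
    raised = True

assert raised
```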

Cause analysis

mp.show_list(
    tokenizer(d['text'])['input_ids']
)
List Overview:
Total items: 3

1. list[0]
   - Type: list
   - Length: 363
   - Value: [101, 1045, 12524, 1045, 2572, 8025, 1011, 3756, 2013, 2026, 2678, 3573, 2138, 1997, 2035, 1996, 6704, 2008, 5129, 2009, 2043, 2009, 2001, 2034, 2207, 1999, 3476, 1012, 1045, 2036, 2657, 2008, 2012, 2034, 2009, 2001, 8243, 2011, 1057, 1012, 1055, 1012, 8205, 2065, 2009, 2412, 2699, 2000, 4607, 2023, 2406, 1010, 3568, 2108, 1037, 5470, 1997, 3152, 2641, 1000, 6801, 1000, 1045, 2428, 2018, 2000, 2156, 2023, 2005, 2870, 1012, 1026, 7987, 1013, 1028, 1026, 7987, 1013, 1028, 1996, 5436, 2003, 8857, 2105, 1037, 2402, 4467, 3689, 3076, 2315, 14229, 2040, 4122, 2000, 4553, 2673, 2016, 2064, 2055, 2166, 1012, 1999, 3327, 2016, 4122, 2000, 3579, 2014, 3086, 2015, 2000, 2437, 2070, 4066, 1997, 4516, 2006, 2054, 1996, 2779, 25430, 14728, 2245, 2055, 3056, 2576, 3314, 2107, 2004, 1996, 5148, 2162, 1998, 2679, 3314, 1999, 1996, 2142, 2163, 1012, 1999, 2090, 4851, 8801, 1998, 6623, 7939, 4697, 3619, 1997, 8947, 2055, 2037, 10740, 2006, 4331, 1010, 2016, 2038, 3348, 2007, 2014, 3689, 3836, 1010, 19846, 1010, 1998, 2496, 2273, 1012, 1026, 7987, 1013, 1028, 1026, 7987, 1013, 1028, 2054, 8563, 2033, 2055, 1045, 2572, 8025, 1011, 3756, 2003, 2008, 2871, 2086, 3283, 1010, 2023, 2001, 2641, 26932, 1012, 2428, 1010, 1996, 3348, 1998, 16371, 25469, 5019, 2024, 2261, 1998, 2521, 2090, 1010, 2130, 2059, 2009, 1005, 1055, 2025, 2915, 2066, 2070, 10036, 2135, 2081, 22555, 2080, 1012, 2096, 2026, 2406, 3549, 2568, 2424, 2009, 16880, 1010, 1999, 4507, 3348, 1998, 16371, 25469, 2024, 1037, 2350, 18785, 1999, 4467, 5988, 1012, 2130, 13749, 7849, 24544, 1010, 15835, 2037, 3437, 2000, 2204, 2214, 2879, 2198, 4811, 1010, 2018, 3348, 5019, 1999, 2010, 3152, 1012, 1026, 7987, 1013, 1028, 1026, 7987, 1013, 1028, 1045, 2079, 4012, 3549, 2094, 1996, 16587, 2005, 1996, 2755, 2008, 2151, 3348, 3491, 1999, 1996, 2143, 2003, 3491, 2005, 6018, 5682, 2738, 2084, 2074, 2000, 5213, 2111, 1998, 2191, 2769, 2000, 2022, 3491, 1999, 26932, 12370, 1999, 2637, 1012, 1045, 2572, 8025, 1011, 3756, 2003, 
1037, 2204, 2143, 2005, 3087, 5782, 2000, 2817, 1996, 6240, 1998, 14629, 1006, 2053, 26136, 3832, 1007, 1997, 4467, 5988, 1012, 2021, 2428, 1010, 2023, 2143, 2987, 1005, 1056, 2031, 2172, 1997, 1037, 5436, 1012, 102]

2. list[1]
   - Type: list
   - Length: 304
   - Value: [101, 1000, 1045, 2572, 8025, 1024, 3756, 1000, 2003, 1037, 15544, 19307, 1998, 3653, 6528, 20771, 19986, 8632, 1012, 2009, 2987, 1005, 1056, 3043, 2054, 2028, 1005, 1055, 2576, 5328, 2024, 2138, 2023, 2143, 2064, 6684, 2022, 2579, 5667, 2006, 2151, 2504, 1012, 2004, 2005, 1996, 4366, 2008, 19124, 3287, 16371, 25469, 2003, 2019, 6882, 13316, 1011, 2459, 1010, 2008, 3475, 1005, 1056, 2995, 1012, 1045, 1005, 2310, 2464, 1054, 1011, 6758, 3152, 2007, 3287, 16371, 25469, 1012, 4379, 1010, 2027, 2069, 3749, 2070, 25085, 5328, 1010, 2021, 2073, 2024, 1996, 1054, 1011, 6758, 3152, 2007, 21226, 24728, 22144, 2015, 1998, 20916, 4691, 6845, 2401, 1029, 7880, 1010, 2138, 2027, 2123, 1005, 1056, 4839, 1012, 1996, 2168, 3632, 2005, 2216, 10231, 7685, 5830, 3065, 1024, 8040, 7317, 5063, 2015, 11820, 1999, 1996, 9478, 2021, 2025, 1037, 17962, 21239, 1999, 4356, 1012, 1998, 2216, 3653, 6528, 20771, 10271, 5691, 2066, 1996, 2829, 16291, 1010, 1999, 2029, 2057, 1005, 2128, 5845, 2000, 1996, 2609, 1997, 6320, 25624, 1005, 1055, 17061, 3779, 1010, 2021, 2025, 1037, 7637, 1997, 5061, 5710, 2006, 9318, 7367, 5737, 19393, 1012, 2077, 6933, 1006, 2030, 20242, 1007, 1000, 3313, 1011, 3115, 1000, 1999, 5609, 1997, 16371, 25469, 1010, 1996, 10597, 27885, 5809, 2063, 2323, 2202, 2046, 4070, 2028, 14477, 6767, 8524, 6321, 5793, 28141, 4489, 2090, 2273, 1998, 2308, 1024, 2045, 2024, 2053, 8991, 18400, 2015, 2006, 4653, 2043, 19910, 3544, 15287, 1010, 1998, 1996, 2168, 3685, 2022, 2056, 2005, 1037, 2158, 1012, 1999, 2755, 1010, 2017, 3227, 2180, 1005, 1056, 2156, 2931, 8991, 18400, 2015, 1999, 2019, 2137, 2143, 1999, 2505, 2460, 1997, 22555, 2030, 13216, 14253, 2050, 1012, 2023, 6884, 3313, 1011, 3115, 2003, 2625, 1037, 3313, 3115, 2084, 2019, 4914, 2135, 2139, 24128, 3754, 2000, 2272, 2000, 3408, 20547, 2007, 1996, 19008, 1997, 2308, 1005, 1055, 4230, 1012, 102]

3. list[2]
   - Type: list
   - Length: 133
   - Value: [101, 2065, 2069, 2000, 4468, 2437, 2023, 2828, 1997, 2143, 1999, 1996, 2925, 1012, 2023, 2143, 2003, 5875, 2004, 2019, 7551, 2021, 4136, 2053, 2522, 11461, 2466, 1012, 1026, 7987, 1013, 1028, 1026, 7987, 1013, 1028, 2028, 2453, 2514, 6819, 5339, 8918, 2005, 3564, 27046, 2009, 2138, 2009, 12817, 2006, 2061, 2116, 2590, 3314, 2021, 2009, 2515, 2061, 2302, 2151, 5860, 11795, 3085, 15793, 1012, 1996, 13972, 3310, 2185, 2007, 2053, 2047, 15251, 1006, 4983, 2028, 3310, 2039, 2007, 2028, 2096, 2028, 1005, 1055, 2568, 17677, 2015, 1010, 2004, 2009, 2097, 26597, 2079, 2076, 2023, 23100, 2143, 1007, 1012, 1026, 7987, 1013, 1028, 1026, 7987, 1013, 1028, 2028, 2453, 2488, 5247, 2028, 1005, 1055, 2051, 4582, 2041, 1037, 3332, 2012, 1037, 3392, 3652, 1012, 1026, 7987, 1013, 1028, 1026, 7987, 1013, 1028, 102]
mp.show_list(
    tokenizer(d['text'], padding=True)['input_ids']
)
List Overview:
Total items: 3

1. list[0]
   - Type: list
   - Length: 363
   - Value: [101, 1045, 12524, 1045, 2572, 8025, 1011, 3756, 2013, 2026, 2678, 3573, 2138, 1997, 2035, 1996, 6704, 2008, 5129, 2009, 2043, 2009, 2001, 2034, 2207, 1999, 3476, 1012, 1045, 2036, 2657, 2008, 2012, 2034, 2009, 2001, 8243, 2011, 1057, 1012, 1055, 1012, 8205, 2065, 2009, 2412, 2699, 2000, 4607, 2023, 2406, 1010, 3568, 2108, 1037, 5470, 1997, 3152, 2641, 1000, 6801, 1000, 1045, 2428, 2018, 2000, 2156, 2023, 2005, 2870, 1012, 1026, 7987, 1013, 1028, 1026, 7987, 1013, 1028, 1996, 5436, 2003, 8857, 2105, 1037, 2402, 4467, 3689, 3076, 2315, 14229, 2040, 4122, 2000, 4553, 2673, 2016, 2064, 2055, 2166, 1012, 1999, 3327, 2016, 4122, 2000, 3579, 2014, 3086, 2015, 2000, 2437, 2070, 4066, 1997, 4516, 2006, 2054, 1996, 2779, 25430, 14728, 2245, 2055, 3056, 2576, 3314, 2107, 2004, 1996, 5148, 2162, 1998, 2679, 3314, 1999, 1996, 2142, 2163, 1012, 1999, 2090, 4851, 8801, 1998, 6623, 7939, 4697, 3619, 1997, 8947, 2055, 2037, 10740, 2006, 4331, 1010, 2016, 2038, 3348, 2007, 2014, 3689, 3836, 1010, 19846, 1010, 1998, 2496, 2273, 1012, 1026, 7987, 1013, 1028, 1026, 7987, 1013, 1028, 2054, 8563, 2033, 2055, 1045, 2572, 8025, 1011, 3756, 2003, 2008, 2871, 2086, 3283, 1010, 2023, 2001, 2641, 26932, 1012, 2428, 1010, 1996, 3348, 1998, 16371, 25469, 5019, 2024, 2261, 1998, 2521, 2090, 1010, 2130, 2059, 2009, 1005, 1055, 2025, 2915, 2066, 2070, 10036, 2135, 2081, 22555, 2080, 1012, 2096, 2026, 2406, 3549, 2568, 2424, 2009, 16880, 1010, 1999, 4507, 3348, 1998, 16371, 25469, 2024, 1037, 2350, 18785, 1999, 4467, 5988, 1012, 2130, 13749, 7849, 24544, 1010, 15835, 2037, 3437, 2000, 2204, 2214, 2879, 2198, 4811, 1010, 2018, 3348, 5019, 1999, 2010, 3152, 1012, 1026, 7987, 1013, 1028, 1026, 7987, 1013, 1028, 1045, 2079, 4012, 3549, 2094, 1996, 16587, 2005, 1996, 2755, 2008, 2151, 3348, 3491, 1999, 1996, 2143, 2003, 3491, 2005, 6018, 5682, 2738, 2084, 2074, 2000, 5213, 2111, 1998, 2191, 2769, 2000, 2022, 3491, 1999, 26932, 12370, 1999, 2637, 1012, 1045, 2572, 8025, 1011, 3756, 2003, 
1037, 2204, 2143, 2005, 3087, 5782, 2000, 2817, 1996, 6240, 1998, 14629, 1006, 2053, 26136, 3832, 1007, 1997, 4467, 5988, 1012, 2021, 2428, 1010, 2023, 2143, 2987, 1005, 1056, 2031, 2172, 1997, 1037, 5436, 1012, 102]

2. list[1]
   - Type: list
   - Length: 363
   - Value: [101, 1000, 1045, 2572, 8025, 1024, 3756, 1000, 2003, 1037, 15544, 19307, 1998, 3653, 6528, 20771, 19986, 8632, 1012, 2009, 2987, 1005, 1056, 3043, 2054, 2028, 1005, 1055, 2576, 5328, 2024, 2138, 2023, 2143, 2064, 6684, 2022, 2579, 5667, 2006, 2151, 2504, 1012, 2004, 2005, 1996, 4366, 2008, 19124, 3287, 16371, 25469, 2003, 2019, 6882, 13316, 1011, 2459, 1010, 2008, 3475, 1005, 1056, 2995, 1012, 1045, 1005, 2310, 2464, 1054, 1011, 6758, 3152, 2007, 3287, 16371, 25469, 1012, 4379, 1010, 2027, 2069, 3749, 2070, 25085, 5328, 1010, 2021, 2073, 2024, 1996, 1054, 1011, 6758, 3152, 2007, 21226, 24728, 22144, 2015, 1998, 20916, 4691, 6845, 2401, 1029, 7880, 1010, 2138, 2027, 2123, 1005, 1056, 4839, 1012, 1996, 2168, 3632, 2005, 2216, 10231, 7685, 5830, 3065, 1024, 8040, 7317, 5063, 2015, 11820, 1999, 1996, 9478, 2021, 2025, 1037, 17962, 21239, 1999, 4356, 1012, 1998, 2216, 3653, 6528, 20771, 10271, 5691, 2066, 1996, 2829, 16291, 1010, 1999, 2029, 2057, 1005, 2128, 5845, 2000, 1996, 2609, 1997, 6320, 25624, 1005, 1055, 17061, 3779, 1010, 2021, 2025, 1037, 7637, 1997, 5061, 5710, 2006, 9318, 7367, 5737, 19393, 1012, 2077, 6933, 1006, 2030, 20242, 1007, 1000, 3313, 1011, 3115, 1000, 1999, 5609, 1997, 16371, 25469, 1010, 1996, 10597, 27885, 5809, 2063, 2323, 2202, 2046, 4070, 2028, 14477, 6767, 8524, 6321, 5793, 28141, 4489, 2090, 2273, 1998, 2308, 1024, 2045, 2024, 2053, 8991, 18400, 2015, 2006, 4653, 2043, 19910, 3544, 15287, 1010, 1998, 1996, 2168, 3685, 2022, 2056, 2005, 1037, 2158, 1012, 1999, 2755, 1010, 2017, 3227, 2180, 1005, 1056, 2156, 2931, 8991, 18400, 2015, 1999, 2019, 2137, 2143, 1999, 2505, 2460, 1997, 22555, 2030, 13216, 14253, 2050, 1012, 2023, 6884, 3313, 1011, 3115, 2003, 2625, 1037, 3313, 3115, 2084, 2019, 4914, 2135, 2139, 24128, 3754, 2000, 2272, 2000, 3408, 20547, 2007, 1996, 19008, 1997, 2308, 1005, 1055, 4230, 1012, 102, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]

3. list[2]
   - Type: list
   - Length: 363
   - Value: [101, 2065, 2069, 2000, 4468, 2437, 2023, 2828, 1997, 2143, 1999, 1996, 2925, 1012, 2023, 2143, 2003, 5875, 2004, 2019, 7551, 2021, 4136, 2053, 2522, 11461, 2466, 1012, 1026, 7987, 1013, 1028, 1026, 7987, 1013, 1028, 2028, 2453, 2514, 6819, 5339, 8918, 2005, 3564, 27046, 2009, 2138, 2009, 12817, 2006, 2061, 2116, 2590, 3314, 2021, 2009, 2515, 2061, 2302, 2151, 5860, 11795, 3085, 15793, 1012, 1996, 13972, 3310, 2185, 2007, 2053, 2047, 15251, 1006, 4983, 2028, 3310, 2039, 2007, 2028, 2096, 2028, 1005, 1055, 2568, 17677, 2015, 1010, 2004, 2009, 2097, 26597, 2079, 2076, 2023, 23100, 2143, 1007, 1012, 1026, 7987, 1013, 1028, 1026, 7987, 1013, 1028, 2028, 2453, 2488, 5247, 2028, 1005, 1055, 2051, 4582, 2041, 1037, 3332, 2012, 1037, 3392, 3652, 1012, 1026, 7987, 1013, 1028, 1026, 7987, 1013, 1028, 102, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
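
What padding=True does is mechanical: append the pad token id (0 for this tokenizer, per pad_token_id in the config above) to each row until every row matches the longest one. A minimal sketch (pad_to_longest is a made-up helper, not the tokenizer's internal code):

```python
def pad_to_longest(rows, pad_id=0):
    """Right-pad each row with pad_id so all rows share the max length."""
    width = max(len(r) for r in rows)
    return [r + [pad_id] * (width - len(r)) for r in rows]

rows = [[101, 7, 8, 102], [101, 9, 102]]
padded = pad_to_longest(rows)
print(padded)  # [[101, 7, 8, 102], [101, 9, 102, 0]]
```

After padding, all rows have equal length, so torch.tensor succeeds.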

Success

model1(torch.tensor(tokenizer(d['text'], padding=True)['input_ids']))
We strongly recommend passing in an `attention_mask` since your input_ids may be padded. See https://huggingface.co/docs/transformers/troubleshooting#incorrect-output-when-padding-tokens-arent-masked.
SequenceClassifierOutput(loss=None, logits=tensor([[-0.0174,  0.1774],
        [-0.0269,  0.2127],
        [-0.0322,  0.2528]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)

(Solution 2)

model1(tokenizer(d['text'], padding=True, return_tensors="pt")['input_ids'])
SequenceClassifierOutput(loss=None, logits=tensor([[-0.0174,  0.1774],
        [-0.0269,  0.2127],
        [-0.0322,  0.2528]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)

#

# Example 2 – emotion

emotion = datasets.load_dataset('emotion')
d = emotion['train'].select(range(3))
d
Dataset({
    features: ['text', 'label'],
    num_rows: 3
})

(Solution)

model1(torch.tensor(tokenizer(d['text'],padding=True)['input_ids']))
SequenceClassifierOutput(loss=None, logits=tensor([[0.0163, 0.2285],
        [0.0100, 0.2066],
        [0.0087, 0.2619]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)

#

# Example 3 – MBTI

# download the mbti_1.csv file 
!wget https://raw.githubusercontent.com/guebin/MP2024/refs/heads/main/posts/mbti_1.csv
--2024-11-15 19:18:39--  https://raw.githubusercontent.com/guebin/MP2024/refs/heads/main/posts/mbti_1.csv
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.111.133, 185.199.108.133, 185.199.109.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.111.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 62856486 (60M) [text/plain]
Saving to: ‘mbti_1.csv.1’

mbti_1.csv.1        100%[===================>]  59.94M   107MB/s    in 0.6s    

2024-11-15 19:18:44 (107 MB/s) - ‘mbti_1.csv.1’ saved [62856486/62856486]
d = datasets.Dataset.from_csv("mbti_1.csv").select(range(3))
d
Generating train split: 8675 examples [00:00, 16512.48 examples/s]
Dataset({
    features: ['type', 'posts'],
    num_rows: 3
})

(Solution 1)

Failure

model1(torch.tensor(tokenizer(d['posts'],padding=True)['input_ids']))
Token indices sequence length is longer than the specified maximum sequence length for this model (2102 > 512). Running this sequence through the model will result in indexing errors
RuntimeError: The size of tensor a (2102) must match the size of tensor b (512) at non-singleton dimension 1
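
This time padding is not enough: the longest post tokenizes to 2102 ids, past the model's max_position_embeddings of 512, so the overlong rows must be truncated (the tokenizer can do this via truncation=True). The clipping itself is simple; a naive sketch (unlike the real tokenizer, it does not re-attach the trailing [SEP] token id after clipping):

```python
def truncate_rows(rows, max_length=512):
    """Naively clip each row of token ids to max_length."""
    return [r[:max_length] for r in rows]

rows = [list(range(2102)), list(range(300))]  # made-up ragged lengths
clipped = truncate_rows(rows)
print([len(r) for r in clipped])  # [512, 300]
```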

Cause analysis

mp.show_list(
    tokenizer(d['posts'],padding=True)['input_ids']
)
List Overview:
Total items: 3

1. list[0]
   - Type: list
   - Length: 2102
   - Value: [101, 1005, 8299, 1024, 1013, 1013, 7479, 1012, 7858, 1012, 4012, 1013, 3422, 1029, 1058, 1027, ..., 102, 0, 0, 0] (2102 ids; middle elided for readability — the text tokens end with [SEP]=102, followed by 0-padding)

2. list[1]
   - Type: list
   - Length: 2102
   - Value: [101, 1005, 1045, 1005, 1049, 4531, 1996, 3768, 1997, 2033, ..., 1012, 1012, 1012, 1005, 102] (2102 ids; middle elided for readability — this is the longest sequence, so it ends with [SEP]=102 and no padding)

3. list[2]
   - Type: list
   - Length: 2102
   - Value: [101, 1005, 2204, 2028, 1035, 1035, 1035, 1035, 1035, 16770, ..., 102, 0, 0, 0] (2102 ids; middle elided for readability — [SEP]=102 then 0-padding up to the longest sequence)
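The three 2102-id lists above exceed the 512-position limit of BERT-style models, which is why the next cell passes `truncation=True` to the tokenizer. As a rough illustration of the effect only (a minimal sketch with a hypothetical `truncate_ids` helper, not the tokenizer's actual algorithm — the real tokenizer truncates before attaching special tokens), the following caps an id list while keeping the `[CLS]`/`[SEP]` markers (ids 101 and 102):

```python
# Hedged sketch: mimic the effect of truncation=True on an overlong id list.
# truncate_ids is an illustrative helper, not a transformers API.
MAX_LEN = 512
CLS, SEP = 101, 102

def truncate_ids(ids, max_len=MAX_LEN):
    """Cut the body so the total length is <= max_len, re-appending [SEP]."""
    if len(ids) <= max_len:
        return ids
    return ids[: max_len - 1] + [SEP]

# 2102 tokens, shaped like the dumps above: [CLS] + body + [SEP]
long_ids = [CLS] + list(range(1000, 1000 + 2100)) + [SEP]
short_ids = truncate_ids(long_ids)
print(len(short_ids))                 # → 512
print(short_ids[0], short_ids[-1])    # → 101 102
```

This matches what the next output shows: every list is capped at length 512 and still ends with [SEP]=102.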
mp.show_list(
    tokenizer(d['posts'],truncation=True)['input_ids']
)
List Overview:
Total items: 3

1. list[0]
   - Type: list
   - Length: 512
   - Value: [101, 1005, 8299, 1024, 1013, 1013, 7479, 1012, 7858, 1012, 4012, 1013, 3422, 1029, 1058, 1027, ..., 4127, 2017, 2215, 102] (512 ids; middle elided for readability — truncation=True cut the sequence at the model's 512-token limit, ending with [SEP]=102)

2. list[1]
   - Type: list
   - Length: 512
   - Value: [101, 1005, 1045, 1005, 1049, 4531, 1996, 3768, 1997, 2033, 1999, 2122, 8466, 2200, 8598, 2075, 1012, 1064, 1064, 1064, 3348, 2064, 2022, 11771, 2065, 2009, 1005, 1055, 1999, 1996, 2168, 2597, 2411, 1012, 2005, 2742, 2033, 1998, 2026, 6513, 2024, 2747, 1999, 2019, 4044, 2073, 2057, 2031, 2000, 5541, 2135, 2224, 11190, 15239, 1998, 8696, 1012, 2045, 3475, 1005, 1056, 2438, 1012, 1012, 1012, 1064, 1064, 1064, 3228, 2047, 3574, 2000, 1005, 2208, 1005, 3399, 1012, 1064, 1064, 1064, 7592, 1008, 4372, 25856, 5861, 1008, 2008, 1005, 1055, 2035, 2009, 3138, 1012, 2084, 2057, 23705, 1998, 2027, 2079, 2087, 1997, 1996, 20661, 2096, 1045, 13399, 2037, 3739, 1998, 2709, 2037, 2616, 2007, 5744, 2773, 13068, 1998, 2062, 5048, 2100, 20237, 1012, 1064, 1064, 1064, 2023, 1009, 3768, 1997, 5703, 1998, 2192, 3239, 12016, 1012, 1064, 1064, 1064, 2613, 26264, 3231, 1045, 3556, 13029, 1012, 4274, 26264, 5852, 2024, 6057, 1012, 1045, 3556, 8574, 2015, 2030, 3020, 1012, 2085, 1010, 2066, 1996, 2280, 10960, 1997, 2023, 11689, 1045, 2097, 5254, 2008, 1045, 2123, 1005, 1056, 2903, 1999, 1996, 26264, 3231, 1012, 2077, 2017, 7221, 4509, 1012, 1012, 1012, 1064, 1064, 1064, 2017, 2113, 2017, 1005, 2128, 2019, 4372, 25856, 2043, 2017, 25887, 2013, 1037, 2609, 2005, 1037, 2095, 1998, 1037, 2431, 1010, 2709, 1010, 1998, 2424, 2111, 2024, 2145, 15591, 2006, 2115, 8466, 1998, 16663, 2115, 4784, 1013, 4301, 1012, 2017, 2113, 2017, 1005, 2128, 2019, 4372, 25856, 2043, 2017, 1012, 1012, 1012, 1064, 1064, 1064, 8299, 1024, 1013, 1013, 10047, 2290, 15136, 2620, 1012, 4871, 3270, 3600, 1012, 2149, 1013, 10047, 2290, 15136, 2620, 1013, 4185, 19317, 1013, 3438, 11387, 2094, 2487, 2546, 2683, 2850, 2575, 2683, 22932, 2050, 2575, 2497, 2581, 2487, 19473, 2575, 1012, 16545, 2290, 1064, 1064, 1064, 8299, 1024, 1013, 1013, 10047, 2290, 1012, 4639, 2094, 16872, 28014, 1012, 4012, 1013, 6282, 2509, 2050, 2692, 2278, 2575, 18827, 22025, 16932, 3540, 2497, 2620, 2549, 2278, 22203, 1064, 1064, 1064, 1045, 
2058, 2228, 2477, 2823, 1012, 1045, 2175, 2011, 1996, 2214, 20052, 9106, 14686, 1012, 3383, 1010, 2043, 1037, 2158, 2038, 2569, 3716, 1998, 2569, 4204, 2066, 2026, 2219, 1010, 2009, 2738, 16171, 2032, 2000, 6148, 1037, 3375, 1012, 1012, 1012, 1064, 1064, 1064, 13789, 12155, 10270, 1012, 10722, 14905, 20974, 1012, 4012, 2061, 2003, 1045, 1024, 1040, 1064, 1064, 1064, 4278, 1010, 2199, 1009, 2695, 1064, 1064, 1064, 2025, 2428, 1025, 1045, 1005, 2310, 2196, 2245, 1997, 1041, 1013, 1045, 2030, 1046, 1013, 1052, 2004, 2613, 4972, 1012, 1045, 3648, 2870, 2006, 2054, 1045, 2224, 1012, 1045, 2224, 11265, 1998, 14841, 2004, 2026, 29532, 1012, 10768, 2005, 6699, 1998, 6524, 9033, 1012, 1045, 2036, 2224, 9152, 2349, 2000, 2033, 3997, 1012, 1012, 1012, 1064, 1064, 1064, 2017, 2113, 2295, 1012, 2008, 2001, 13749, 18595, 3560, 1012, 2044, 3038, 2009, 1045, 2428, 2215, 2000, 3046, 2009, 1998, 2156, 2054, 6433, 2007, 2033, 2652, 1037, 2034, 2711, 13108, 1999, 1996, 2067, 2096, 2057, 3298, 2105, 1012, 1045, 2215, 2000, 2156, 1996, 2298, 2006, 1012, 1012, 1012, 1064, 1064, 1064, 2041, 1997, 2035, 1997, 2068, 1996, 2600, 3259, 2028, 102]

3. list[2]
   - Type: list
   - Length: 512
   - Value: [101, 1005, 2204, 2028, 1035, 1035, 1035, 1035, 1035, 16770, 1024, 1013, 1013, 7479, 1012, 7858, 1012, 4012, 1013, 3422, 1029, 1058, 1027, 1042, 4048, 18259, 28027, 2546, 2290, 2860, 1064, 1064, 1064, 1997, 2607, 1010, 2000, 2029, 1045, 2360, 1045, 2113, 1025, 2008, 1005, 1055, 2026, 13301, 1998, 2026, 8364, 1012, 1064, 1064, 1064, 2515, 2108, 7078, 3893, 2008, 2017, 1998, 2115, 2190, 2767, 2071, 2022, 2019, 6429, 3232, 4175, 1029, 2065, 2061, 1010, 2084, 2748, 1012, 2030, 2009, 1005, 1055, 2062, 1045, 2071, 2022, 29179, 1999, 2293, 1999, 2553, 1045, 28348, 2026, 5346, 1006, 2029, 2012, 1012, 1012, 1012, 1064, 1064, 1064, 2053, 1010, 1045, 2134, 1005, 1056, 1025, 4067, 2017, 2005, 1037, 4957, 999, 1064, 1064, 1064, 2061, 1011, 2170, 14841, 1011, 9033, 7077, 1006, 1998, 2009, 2064, 7872, 2013, 2151, 2783, 8476, 1013, 17418, 1007, 2064, 2022, 9252, 1012, 2009, 1005, 1055, 2066, 2043, 2017, 1005, 2128, 5881, 1999, 2115, 2219, 4301, 1010, 1998, 2115, 2568, 2074, 17677, 2015, 1999, 7925, 1012, 5683, 5621, 6659, 1012, 1012, 1012, 1012, 1064, 1064, 1064, 2031, 2017, 4384, 2129, 14099, 10072, 2064, 2022, 1029, 2035, 2017, 2031, 2000, 2079, 2003, 2298, 2091, 2012, 1996, 5568, 1024, 9877, 1997, 2367, 3269, 2427, 2045, 1012, 1998, 2085, 5674, 2008, 5606, 1997, 2086, 2101, 1006, 2043, 1013, 2065, 5800, 1012, 1012, 1012, 1064, 1064, 1064, 1996, 3044, 2015, 1516, 2196, 2018, 2053, 2028, 2412, 1064, 1064, 1064, 1045, 2411, 2424, 2870, 27963, 5344, 2006, 7720, 13262, 1013, 3536, 1012, 1064, 1064, 1064, 2023, 1019, 2095, 1011, 2214, 6251, 2003, 2019, 11757, 8321, 1998, 3376, 6412, 1012, 1064, 1064, 1064, 1045, 4033, 1005, 1056, 4716, 2023, 4037, 1999, 1996, 2197, 1017, 2086, 1012, 2061, 9444, 9631, 2023, 1006, 1998, 2672, 2130, 17749, 2033, 1010, 2029, 1045, 3811, 4797, 1007, 1024, 7632, 1012, 6352, 2692, 26224, 6352, 2692, 28311, 1064, 1064, 1064, 2043, 2017, 4133, 1999, 2115, 3871, 2127, 2184, 1024, 2382, 7610, 3015, 2774, 1010, 1998, 6170, 2068, 1006, 2362, 2007, 
9877, 1997, 4533, 2015, 1007, 2096, 2652, 2115, 6490, 2858, 1012, 1064, 1064, 1064, 2023, 2003, 1996, 2087, 20014, 2361, 1011, 2003, 2232, 11689, 1045, 1005, 2310, 2412, 2464, 1012, 1064, 1064, 1064, 1045, 2876, 1005, 1056, 2022, 2583, 2000, 2298, 2012, 1996, 4169, 2005, 1996, 2972, 2166, 2065, 1045, 2354, 2008, 1045, 3856, 2009, 2058, 1996, 2529, 2108, 1012, 1064, 1064, 1064, 1045, 2001, 5059, 1037, 4281, 2005, 2026, 7284, 2006, 2029, 1045, 1005, 1049, 2551, 2157, 2085, 1011, 2009, 2323, 2031, 2042, 7733, 1012, 1012, 2021, 1045, 2371, 27885, 14715, 3064, 2000, 2191, 2928, 28194, 5420, 2595, 2683, 2475, 2015, 2695, 11522, 2013, 2009, 1024, 1040, 2065, 2017, 3191, 1996, 2338, 1012, 1012, 1012, 1064, 1064, 1064, 1045, 2318, 2000, 2191, 5888, 2055, 13170, 5146, 1998, 21830, 9610, 7834, 1011, 2182, 2017, 2064, 2156, 2048, 2034, 3441, 1024, 16770, 1024, 1013, 1013, 7479, 1012, 10722, 14905, 20974, 1012, 4012, 1013, 9927, 1013, 1011, 4074, 20348, 29159, 1011, 1064, 1064, 1064, 20014, 3501, 3728, 1045, 2318, 2000, 2695, 2026, 5888, 2055, 2048, 2814, 1011, 13170, 5146, 1998, 21830, 9610, 7834, 1012, 2077, 102]

Success

model1(torch.tensor(tokenizer(d['posts'],truncation=True)['input_ids']))
#model1(tokenizer(d['posts'],truncation=True,return_tensors="pt")['input_ids'])
SequenceClassifierOutput(loss=None, logits=tensor([[-0.0450,  0.1728],
        [-0.0259,  0.1809],
        [-0.0611,  0.1994]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)
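The `torch.tensor(...)` call above only works because every row has the same length: `truncation=True` clips each post to the model's 512-token limit, and since all three posts exceed that limit, all three lists come out exactly 512 long. A minimal sketch with hypothetical token-id lists (the lists here are made up for illustration):

```python
import torch

max_length = 512
# hypothetical token-id lists, each longer than the 512-token limit
token_lists = [list(range(600)), list(range(550)), list(range(530))]

# truncation clips every list to max_length, so all rows end up equal length
truncated = [ids[:max_length] for ids in token_lists]

batch = torch.tensor(truncated)   # works: every row has 512 entries
print(batch.shape)                # torch.Size([3, 512])
```

If any post were shorter than 512 tokens, the lists would have unequal lengths and `torch.tensor` would raise an error; that is the case `padding=True` handles.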

(Solution 2) – Changing the model configuration (the technique used in Quiz 5, where the model's number of frames was changed to 4)

Loading the configuration for distilbert/distilbert-base-uncased

config = transformers.AutoConfig.from_pretrained(
    "distilbert/distilbert-base-uncased"
)
config
DistilBertConfig {
  "_name_or_path": "distilbert/distilbert-base-uncased",
  "activation": "gelu",
  "architectures": [
    "DistilBertForMaskedLM"
  ],
  "attention_dropout": 0.1,
  "dim": 768,
  "dropout": 0.1,
  "hidden_dim": 3072,
  "initializer_range": 0.02,
  "max_position_embeddings": 512,
  "model_type": "distilbert",
  "n_heads": 12,
  "n_layers": 6,
  "pad_token_id": 0,
  "qa_dropout": 0.1,
  "seq_classif_dropout": 0.2,
  "sinusoidal_pos_embds": false,
  "tie_weights_": true,
  "transformers_version": "4.46.2",
  "vocab_size": 30522
}

Changing a configuration value

config.max_position_embeddings = 2200

Creating the model from the modified configuration

model1_large = transformers.AutoModelForSequenceClassification.from_config(
    config=config
)
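Note that, unlike `from_pretrained`, `from_config` builds a freshly (randomly) initialized model with no pretrained weights, which is why its logits below bear no relation to `model1`'s. A sketch that builds the DistilBERT config locally, assuming the `transformers` library with a PyTorch backend is installed:

```python
import transformers

# build the DistilBERT config locally (no download needed)
config = transformers.DistilBertConfig(max_position_embeddings=2200)

# from_config creates a *randomly initialized* model -- no pretrained weights
model = transformers.AutoModelForSequenceClassification.from_config(config)

# the position-embedding table now has 2200 rows instead of the default 512
print(model.distilbert.embeddings.position_embeddings.num_embeddings)  # 2200
```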

Using the model

model1_large(torch.tensor(tokenizer(d['posts'],padding=True)['input_ids']))
SequenceClassifierOutput(loss=None, logits=tensor([[ 0.1023,  0.1124],
        [ 0.1443, -0.0719],
        [ 0.2653, -0.0368]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)
model1_large(**tokenizer(d['posts'],padding=True,return_tensors="pt"))
SequenceClassifierOutput(loss=None, logits=tensor([[0.1730, 0.0042],
        [0.2185, 0.0896],
        [0.3406, 0.0155]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)
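The two calls above give different logits because the tokenizer returns a dict containing both `input_ids` and `attention_mask`; `**` unpacks every key as a keyword argument, so the second call also passes the attention mask, which tells the model to ignore padding tokens. A plain-Python sketch of the unpacking (the dict values and `fake_model` are illustrative only):

```python
# a stand-in for what the tokenizer returns: a dict of parallel fields
encoding = {
    "input_ids": [[101, 2023, 102, 0], [101, 7592, 2088, 102]],
    "attention_mask": [[1, 1, 1, 0], [1, 1, 1, 1]],  # 0 marks padding
}

def fake_model(input_ids, attention_mask=None):
    # when called with **encoding, both fields arrive as keyword arguments
    return {"got_mask": attention_mask is not None}

print(fake_model(encoding["input_ids"]))  # {'got_mask': False}
print(fake_model(**encoding))             # {'got_mask': True}
```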

#

# Example 4 – sms_spam

sms_spam = datasets.load_dataset('sms_spam')['train'].train_test_split(test_size=0.2, seed=42)
sms_spam
DatasetDict({
    train: Dataset({
        features: ['sms', 'label'],
        num_rows: 4459
    })
    test: Dataset({
        features: ['sms', 'label'],
        num_rows: 1115
    })
})
d = sms_spam['train'].select(range(3))
d
Dataset({
    features: ['sms', 'label'],
    num_rows: 3
})

(Solution)

model1(**tokenizer(d['sms'],padding=True,return_tensors="pt"))
SequenceClassifierOutput(loss=None, logits=tensor([[-0.0118,  0.1699],
        [ 0.0273,  0.1815],
        [-0.0208,  0.2202]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)
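The raw logits can be turned into class probabilities with a softmax. Using the logits printed above (since the classification head was not fine-tuned, these probabilities are not meaningful spam scores; this only shows the mechanics):

```python
import torch

logits = torch.tensor([[-0.0118, 0.1699],
                       [ 0.0273, 0.1815],
                       [-0.0208, 0.2202]])

probs = logits.softmax(dim=-1)   # each row now sums to 1
preds = probs.argmax(dim=-1)     # index of the most probable class per row
print(probs)
print(preds)                     # tensor([1, 1, 1])
```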

#

B. Images

model2 = transformers.AutoModelForImageClassification.from_pretrained(
    "google/vit-base-patch16-224-in21k",
    num_labels=3 # arbitrarily set to 3; no particular reason
)
Some weights of ViTForImageClassification were not initialized from the model checkpoint at google/vit-base-patch16-224-in21k and are newly initialized: ['classifier.bias', 'classifier.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.

# Example 1 – food101

d = datasets.load_dataset("food101", split="train[:4]")
d
Dataset({
    features: ['image', 'label'],
    num_rows: 4
})

(Preliminary) Can the functions provided by torchvision.transforms process batches?

to_tensor = torchvision.transforms.ToTensor()
to_tensor(d['image'][0])
tensor([[[0.1216, 0.1137, 0.1098,  ..., 0.0039, 0.0039, 0.0000],
         [0.1255, 0.1216, 0.1176,  ..., 0.0039, 0.0039, 0.0000],
         [0.1294, 0.1255, 0.1255,  ..., 0.0039, 0.0000, 0.0000],
         ...,
         [0.2588, 0.2745, 0.2863,  ..., 0.3765, 0.3882, 0.3922],
         [0.2353, 0.2471, 0.2667,  ..., 0.3373, 0.3373, 0.3373],
         [0.2235, 0.2275, 0.2471,  ..., 0.3333, 0.3176, 0.3059]],

        [[0.1373, 0.1294, 0.1255,  ..., 0.1020, 0.1020, 0.0980],
         [0.1412, 0.1373, 0.1333,  ..., 0.1020, 0.1020, 0.0980],
         [0.1451, 0.1412, 0.1412,  ..., 0.1020, 0.0980, 0.0980],
         ...,
         [0.2471, 0.2627, 0.2745,  ..., 0.3647, 0.3765, 0.3882],
         [0.2235, 0.2353, 0.2549,  ..., 0.3255, 0.3333, 0.3333],
         [0.2118, 0.2157, 0.2353,  ..., 0.3216, 0.3137, 0.3020]],

        [[0.1412, 0.1333, 0.1294,  ..., 0.0902, 0.0902, 0.0863],
         [0.1451, 0.1412, 0.1451,  ..., 0.0902, 0.0902, 0.0863],
         [0.1490, 0.1451, 0.1529,  ..., 0.0902, 0.0863, 0.0863],
         ...,
         [0.1725, 0.1882, 0.2000,  ..., 0.2431, 0.2549, 0.2667],
         [0.1490, 0.1608, 0.1804,  ..., 0.2039, 0.2118, 0.2118],
         [0.1373, 0.1412, 0.1608,  ..., 0.2000, 0.1922, 0.1804]]])
to_tensor(d['image'])
TypeError: pic should be PIL Image or ndarray. Got <class 'list'>

(Solution)

compose = torchvision.transforms.Compose([
    torchvision.transforms.ToTensor(),
    torchvision.transforms.Resize((224,224))
])
torch.stack(list(map(compose,d['image'])),axis=0).shape
torch.Size([4, 3, 224, 224])
model2.forward(
    torch.stack(list(map(compose,d['image'])),axis=0)
)
ImageClassifierOutput(loss=None, logits=tensor([[ 0.0472,  0.0317, -0.1932],
        [ 0.2093,  0.1209, -0.0129],
        [ 0.0555,  0.0957, -0.0811],
        [-0.0682,  0.0363,  0.0317]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)
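model2 expects `pixel_values` of shape `(batch, 3, 224, 224)` because google/vit-base-patch16-224-in21k splits each 224×224 image into non-overlapping 16×16 patches. A quick check of the patch arithmetic:

```python
image_size, patch_size = 224, 16

# ViT tiles the image into non-overlapping 16x16 patches
patches_per_side = image_size // patch_size      # 14 patches along each side
num_patches = patches_per_side ** 2              # 196 patch tokens per image
print(patches_per_side, num_patches)             # 14 196
```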

#

# Example 2 – beans

beans = datasets.load_dataset('beans')
d = beans['train'].select(range(4))
d
Dataset({
    features: ['image_file_path', 'image', 'labels'],
    num_rows: 4
})

(Solution)

model2(torch.stack(list(map(compose,d['image'])),axis=0))
ImageClassifierOutput(loss=None, logits=tensor([[ 0.0545, -0.0993, -0.0386],
        [ 0.0576, -0.1576, -0.0019],
        [ 0.0982, -0.1531, -0.1147],
        [ 0.0570, -0.0575, -0.1455]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)

#

C. Video

model3 = transformers.VideoMAEForVideoClassification.from_pretrained(
    "MCG-NJU/videomae-base",
)
Some weights of VideoMAEForVideoClassification were not initialized from the model checkpoint at MCG-NJU/videomae-base and are newly initialized: ['classifier.bias', 'classifier.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.

# Example 1 – UCF101_subset

file_path = huggingface_hub.hf_hub_download(
    repo_id="sayakpaul/ucf101-subset",
    filename="UCF101_subset.tar.gz",
    repo_type="dataset"
)
# file_path is a string holding the path and filename of the downloaded archive.
with tarfile.open(file_path) as t:
     t.extractall("./data") # here "." means the current directory
mp.tree("./data")
└── UCF101_subset
    ├── test
    │   ├── ApplyEyeMakeup
    │   │   ├── UCF101
    │   │   ├── v_ApplyEyeMakeup_g03_c01.avi
    │   │   └── ...
    │   │   └── v_ApplyEyeMakeup_g23_c06.avi
    │   ├── ApplyLipstick
    │   │   ├── UCF101
    │   │   ├── v_ApplyLipstick_g14_c01.avi
    │   │   └── ...
    │   │   └── v_ApplyLipstick_g16_c04.avi
    │   └── ...
    │   └── BenchPress
    │       ├── UCF101
    │       ├── v_BenchPress_g05_c02.avi
    │       └── ...
    │       └── v_BenchPress_g25_c06.avi
    ├── train
    │   ├── ApplyEyeMakeup
    │   │   ├── UCF101
    │   │   ├── v_ApplyEyeMakeup_g02_c03.avi
    │   │   └── ...
    │   │   └── v_ApplyEyeMakeup_g25_c07.avi
    │   ├── ApplyLipstick
    │   │   ├── UCF101
    │   │   ├── v_ApplyLipstick_g01_c02.avi
    │   │   └── ...
    │   │   └── v_ApplyLipstick_g24_c05.avi
    │   └── ...
    │   └── BenchPress
    │       ├── UCF101
    │       ├── v_BenchPress_g01_c05.avi
    │       └── ...
    │       └── v_BenchPress_g24_c05.avi
    └── val
        ├── ApplyEyeMakeup
        │   ├── UCF101
        │   ├── v_ApplyEyeMakeup_g01_c01.avi
        │   ├── v_ApplyEyeMakeup_g14_c05.avi
        │   └── v_ApplyEyeMakeup_g20_c04.avi
        ├── ApplyLipstick
        │   ├── UCF101
        │   ├── v_ApplyLipstick_g10_c04.avi
        │   ├── v_ApplyLipstick_g20_c04.avi
        │   └── v_ApplyLipstick_g25_c02.avi
        └── ...
        └── BenchPress
            ├── UCF101
            ├── v_BenchPress_g11_c05.avi
            ├── v_BenchPress_g17_c02.avi
            └── v_BenchPress_g17_c06.avi
video_path = "./data/UCF101_subset/test/BenchPress/v_BenchPress_g05_c02.avi"
video = pytorchvideo.data.encoded_video.EncodedVideo.from_path(video_path).get_clip(0, float('inf'))['video']
video.shape
torch.Size([3, 67, 240, 320])
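`EncodedVideo` returns the clip as `(channels, frames, height, width)`, while VideoMAE's default config expects `pixel_values` of shape `(batch, 16, 3, 224, 224)`, i.e. frames first within each clip. A shape-only sketch with a dummy tensor of the size decoded above:

```python
import torch

video = torch.randn(3, 67, 240, 320)   # (C, T, H, W), as decoded above

clip = video.permute(1, 0, 2, 3)       # -> (T, C, H, W)
clip = clip[:16, :, :224, :224]        # first 16 frames, top-left 224x224 crop
batch = clip.unsqueeze(0)              # add the batch dimension

print(batch.shape)                     # torch.Size([1, 16, 3, 224, 224])
```

The three commented lines in the solution below are just alternative ways of producing this same shape.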

(Solution)

model3(
    #video.permute(1,0,2,3)[:16,:,:224,:224].unsqueeze(0)
    #video.permute(1,0,2,3)[:16,:,:224,:224].reshape(1,16,3,224,224)
    torch.stack([video.permute(1,0,2,3)[:16,:,:224,:224]])
)
ImageClassifierOutput(loss=None, logits=tensor([[-0.1375,  0.4270]], grad_fn=<AddmmBackward0>), hidden_states=None, attentions=None)
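Taking the first 16 frames is the simplest choice; a common alternative (a sampling strategy, not something the model requires) is to pick 16 evenly spaced frames across the whole clip:

```python
import torch

num_frames_total, num_frames_needed = 67, 16

# 16 evenly spaced frame indices spanning the 67 available frames
indices = torch.linspace(0, num_frames_total - 1, num_frames_needed).long()
print(indices)

video = torch.randn(3, num_frames_total, 240, 320)   # dummy (C, T, H, W) clip
sampled = video[:, indices]                          # keep only the sampled frames
print(sampled.shape)                                 # torch.Size([3, 16, 240, 320])
```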