May 11 20:50:54.665811 ip-10-0-133-205 systemd[1]: Starting Kubernetes Kubelet...
May 11 20:50:55.106383 ip-10-0-133-205 kubenswrapper[2555]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 11 20:50:55.106383 ip-10-0-133-205 kubenswrapper[2555]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
May 11 20:50:55.106383 ip-10-0-133-205 kubenswrapper[2555]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 11 20:50:55.106383 ip-10-0-133-205 kubenswrapper[2555]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
May 11 20:50:55.106383 ip-10-0-133-205 kubenswrapper[2555]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 11 20:50:55.110035 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.109906 2555 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
May 11 20:50:55.116418 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116379 2555 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
May 11 20:50:55.116418 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116399 2555 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
May 11 20:50:55.116418 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116415 2555 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
May 11 20:50:55.116418 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116418 2555 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
May 11 20:50:55.116418 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116422 2555 feature_gate.go:328] unrecognized feature gate: SignatureStores
May 11 20:50:55.116418 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116425 2555 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
May 11 20:50:55.116418 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116428 2555 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116431 2555 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116434 2555 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116437 2555 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116439 2555 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116442 2555 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116445 2555 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116448 2555 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116450 2555 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116453 2555 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116455 2555 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116458 2555 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116460 2555 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116463 2555 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116465 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116468 2555 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116471 2555 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116474 2555 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116476 2555 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116479 2555 feature_gate.go:328] unrecognized feature gate: NewOLM
May 11 20:50:55.116671 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116481 2555 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116484 2555 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116487 2555 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116496 2555 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116499 2555 feature_gate.go:328] unrecognized feature gate: Example2
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116501 2555 feature_gate.go:328] unrecognized feature gate: PinnedImages
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116504 2555 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116506 2555 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116509 2555 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116512 2555 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116514 2555 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116517 2555 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116520 2555 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116522 2555 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116526 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116528 2555 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116532 2555 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116535 2555 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116538 2555 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116540 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
May 11 20:50:55.117133 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116543 2555 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116545 2555 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116548 2555 feature_gate.go:328] unrecognized feature gate: DualReplica
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116551 2555 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116553 2555 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116556 2555 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116559 2555 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116561 2555 feature_gate.go:328] unrecognized feature gate: GatewayAPI
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116563 2555 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116566 2555 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116568 2555 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116571 2555 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116574 2555 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116576 2555 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116579 2555 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116581 2555 feature_gate.go:328] unrecognized feature gate: Example
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116583 2555 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116586 2555 feature_gate.go:328] unrecognized feature gate: OVNObservability
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116588 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116591 2555 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
May 11 20:50:55.117651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116593 2555 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116596 2555 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116599 2555 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116601 2555 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116604 2555 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116607 2555 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116610 2555 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116613 2555 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116615 2555 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116621 2555 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116625 2555 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116628 2555 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116632 2555 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116635 2555 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116638 2555 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116641 2555 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116643 2555 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116647 2555 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116652 2555 feature_gate.go:328] unrecognized feature gate: InsightsConfig
May 11 20:50:55.118153 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.116655 2555 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117059 2555 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117064 2555 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117067 2555 feature_gate.go:328] unrecognized feature gate: Example2
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117070 2555 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117074 2555 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117076 2555 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117079 2555 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117082 2555 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117085 2555 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117088 2555 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117091 2555 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117094 2555 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117097 2555 feature_gate.go:328] unrecognized feature gate: DualReplica
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117099 2555 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117102 2555 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117105 2555 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117108 2555 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117110 2555 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117113 2555 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
May 11 20:50:55.118651 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117116 2555 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117119 2555 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117121 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117124 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117126 2555 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117129 2555 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117132 2555 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117135 2555 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117137 2555 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117140 2555 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117143 2555 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117146 2555 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117149 2555 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117151 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117154 2555 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117156 2555 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117159 2555 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117162 2555 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117164 2555 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117167 2555 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
May 11 20:50:55.119154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117169 2555 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117172 2555 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117176 2555 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117180 2555 feature_gate.go:328] unrecognized feature gate: PinnedImages
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117183 2555 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117185 2555 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117188 2555 feature_gate.go:328] unrecognized feature gate: NewOLM
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117191 2555 feature_gate.go:328] unrecognized feature gate: OVNObservability
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117193 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117195 2555 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117198 2555 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117201 2555 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117203 2555 feature_gate.go:328] unrecognized feature gate: GatewayAPI
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117205 2555 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117208 2555 feature_gate.go:328] unrecognized feature gate: SignatureStores
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117210 2555 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117213 2555 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117215 2555 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117220 2555 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117222 2555 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
May 11 20:50:55.119691 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117225 2555 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117228 2555 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117231 2555 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117233 2555 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117236 2555 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117239 2555 feature_gate.go:328] unrecognized feature gate: Example
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117241 2555 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117244 2555 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117247 2555 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117249 2555 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117251 2555 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117254 2555 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117257 2555 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117259 2555 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117262 2555 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117264 2555 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117267 2555 feature_gate.go:328] unrecognized feature gate: InsightsConfig
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117270 2555 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117272 2555 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117274 2555 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
May 11 20:50:55.120165 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117277 2555 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117280 2555 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117283 2555 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117286 2555 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117290 2555 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117293 2555 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.117295 2555 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118469 2555 flags.go:64] FLAG: --address="0.0.0.0"
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118479 2555 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118485 2555 flags.go:64] FLAG: --anonymous-auth="true"
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118490 2555 flags.go:64] FLAG: --application-metrics-count-limit="100"
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118495 2555 flags.go:64] FLAG: --authentication-token-webhook="false"
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118498 2555 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118503 2555 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118507 2555 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118510 2555 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118514 2555 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118517 2555 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118521 2555 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118525 2555 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118528 2555 flags.go:64] FLAG: --cgroup-root=""
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118531 2555 flags.go:64] FLAG: --cgroups-per-qos="true"
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118534 2555 flags.go:64] FLAG: --client-ca-file=""
May 11 20:50:55.120646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118543 2555 flags.go:64] FLAG: --cloud-config=""
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118548 2555 flags.go:64] FLAG: --cloud-provider="external"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118551 2555 flags.go:64] FLAG: --cluster-dns="[]"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118556 2555 flags.go:64] FLAG: --cluster-domain=""
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118559 2555 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118562 2555 flags.go:64] FLAG: --config-dir=""
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118565 2555 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118569 2555 flags.go:64] FLAG: --container-log-max-files="5"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118597 2555 flags.go:64] FLAG: --container-log-max-size="10Mi"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118600 2555 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118604 2555 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118607 2555 flags.go:64] FLAG: --containerd-namespace="k8s.io"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118610 2555 flags.go:64] FLAG: --contention-profiling="false"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118613 2555 flags.go:64] FLAG: --cpu-cfs-quota="true"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118616 2555 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118619 2555 flags.go:64] FLAG: --cpu-manager-policy="none"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118622 2555 flags.go:64] FLAG: --cpu-manager-policy-options=""
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118627 2555 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118630 2555 flags.go:64] FLAG: --enable-controller-attach-detach="true"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118633 2555 flags.go:64] FLAG: --enable-debugging-handlers="true"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118636 2555 flags.go:64] FLAG: --enable-load-reader="false"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118640 2555 flags.go:64] FLAG: --enable-server="true"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118643 2555 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118647 2555 flags.go:64] FLAG: --event-burst="100"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118650 2555 flags.go:64] FLAG: --event-qps="50"
May 11 20:50:55.121194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118653 2555 flags.go:64] FLAG: --event-storage-age-limit="default=0"
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118657 2555 flags.go:64] FLAG: --event-storage-event-limit="default=0"
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118660 2555 flags.go:64] FLAG: --eviction-hard=""
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118664 2555 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118668 2555 flags.go:64] FLAG: --eviction-minimum-reclaim=""
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118671 2555 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118674 2555 flags.go:64] FLAG: --eviction-soft=""
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118677 2555 flags.go:64] FLAG: --eviction-soft-grace-period=""
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118680 2555 flags.go:64] FLAG: --exit-on-lock-contention="false"
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118683 2555 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118686 2555 flags.go:64] FLAG: --experimental-mounter-path=""
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118689 2555 flags.go:64] FLAG: --fail-cgroupv1="false"
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118692 2555 flags.go:64] FLAG: --fail-swap-on="true"
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118695 2555 flags.go:64] FLAG: --feature-gates=""
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118699 2555 flags.go:64] FLAG: --file-check-frequency="20s"
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118702 2555 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118705 2555 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118709 2555 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118712 2555 flags.go:64] FLAG: --healthz-port="10248"
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118715 2555 flags.go:64] FLAG: --help="false"
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118718 2555 flags.go:64] FLAG: --hostname-override="ip-10-0-133-205.ec2.internal"
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118721 2555 flags.go:64] FLAG: --housekeeping-interval="10s"
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118724 2555 flags.go:64] FLAG: --http-check-frequency="20s"
May 11 20:50:55.121784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118727 2555 flags.go:64] FLAG:
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118731 2555 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118735 2555 flags.go:64] FLAG: --image-gc-high-threshold="85" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118738 2555 flags.go:64] FLAG: --image-gc-low-threshold="80" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118741 2555 flags.go:64] FLAG: --image-service-endpoint="" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118744 2555 flags.go:64] FLAG: --kernel-memcg-notification="false" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118747 2555 flags.go:64] FLAG: --kube-api-burst="100" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118750 2555 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118754 2555 flags.go:64] FLAG: --kube-api-qps="50" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118757 2555 flags.go:64] FLAG: --kube-reserved="" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118760 2555 flags.go:64] FLAG: --kube-reserved-cgroup="" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118763 2555 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118766 2555 flags.go:64] FLAG: --kubelet-cgroups="" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118769 2555 flags.go:64] FLAG: --local-storage-capacity-isolation="true" May 11 20:50:55.122308 ip-10-0-133-205 
kubenswrapper[2555]: I0511 20:50:55.118772 2555 flags.go:64] FLAG: --lock-file="" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118775 2555 flags.go:64] FLAG: --log-cadvisor-usage="false" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118778 2555 flags.go:64] FLAG: --log-flush-frequency="5s" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118781 2555 flags.go:64] FLAG: --log-json-info-buffer-size="0" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118787 2555 flags.go:64] FLAG: --log-json-split-stream="false" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118790 2555 flags.go:64] FLAG: --log-text-info-buffer-size="0" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118793 2555 flags.go:64] FLAG: --log-text-split-stream="false" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118796 2555 flags.go:64] FLAG: --logging-format="text" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118799 2555 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118802 2555 flags.go:64] FLAG: --make-iptables-util-chains="true" May 11 20:50:55.122308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118805 2555 flags.go:64] FLAG: --manifest-url="" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118808 2555 flags.go:64] FLAG: --manifest-url-header="" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118813 2555 flags.go:64] FLAG: --max-housekeeping-interval="15s" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118817 2555 flags.go:64] FLAG: --max-open-files="1000000" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118821 2555 flags.go:64] FLAG: --max-pods="110" May 11 
20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118824 2555 flags.go:64] FLAG: --maximum-dead-containers="-1" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118828 2555 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118831 2555 flags.go:64] FLAG: --memory-manager-policy="None" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118834 2555 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118837 2555 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118840 2555 flags.go:64] FLAG: --node-ip="0.0.0.0" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118843 2555 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118851 2555 flags.go:64] FLAG: --node-status-max-images="50" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118854 2555 flags.go:64] FLAG: --node-status-update-frequency="10s" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118857 2555 flags.go:64] FLAG: --oom-score-adj="-999" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118861 2555 flags.go:64] FLAG: --pod-cidr="" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118864 2555 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3fc6c2cc09f271efd3cd2adb6c984c7cab48ea53dad824c952dee91afa8eaa20" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118869 2555 flags.go:64] FLAG: --pod-manifest-path="" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 
20:50:55.118876 2555 flags.go:64] FLAG: --pod-max-pids="-1" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118879 2555 flags.go:64] FLAG: --pods-per-core="0" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118882 2555 flags.go:64] FLAG: --port="10250" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118885 2555 flags.go:64] FLAG: --protect-kernel-defaults="false" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118888 2555 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0453d0e0781fece5d" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118891 2555 flags.go:64] FLAG: --qos-reserved="" May 11 20:50:55.122926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118894 2555 flags.go:64] FLAG: --read-only-port="10255" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118897 2555 flags.go:64] FLAG: --register-node="true" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118900 2555 flags.go:64] FLAG: --register-schedulable="true" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118903 2555 flags.go:64] FLAG: --register-with-taints="" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118907 2555 flags.go:64] FLAG: --registry-burst="10" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118910 2555 flags.go:64] FLAG: --registry-qps="5" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118912 2555 flags.go:64] FLAG: --reserved-cpus="" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118915 2555 flags.go:64] FLAG: --reserved-memory="" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118919 2555 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118922 2555 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118925 2555 flags.go:64] FLAG: --rotate-certificates="false" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118929 2555 flags.go:64] FLAG: --rotate-server-certificates="false" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118932 2555 flags.go:64] FLAG: --runonce="false" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118934 2555 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118938 2555 flags.go:64] FLAG: --runtime-request-timeout="2m0s" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118941 2555 flags.go:64] FLAG: --seccomp-default="false" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118944 2555 flags.go:64] FLAG: --serialize-image-pulls="true" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118947 2555 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118950 2555 flags.go:64] FLAG: --storage-driver-db="cadvisor" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118953 2555 flags.go:64] FLAG: --storage-driver-host="localhost:8086" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118956 2555 flags.go:64] FLAG: --storage-driver-password="root" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118959 2555 flags.go:64] FLAG: --storage-driver-secure="false" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118962 2555 flags.go:64] FLAG: --storage-driver-table="stats" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118965 2555 flags.go:64] FLAG: --storage-driver-user="root" May 11 
20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118968 2555 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118972 2555 flags.go:64] FLAG: --sync-frequency="1m0s" May 11 20:50:55.123511 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118976 2555 flags.go:64] FLAG: --system-cgroups="" May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118979 2555 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118984 2555 flags.go:64] FLAG: --system-reserved-cgroup="" May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118987 2555 flags.go:64] FLAG: --tls-cert-file="" May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118990 2555 flags.go:64] FLAG: --tls-cipher-suites="[]" May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118995 2555 flags.go:64] FLAG: --tls-min-version="" May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.118998 2555 flags.go:64] FLAG: --tls-private-key-file="" May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.119001 2555 flags.go:64] FLAG: --topology-manager-policy="none" May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.119004 2555 flags.go:64] FLAG: --topology-manager-policy-options="" May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.119007 2555 flags.go:64] FLAG: --topology-manager-scope="container" May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.119010 2555 flags.go:64] FLAG: --v="2" May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.119014 2555 flags.go:64] FLAG: --version="false" May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.119019 2555 flags.go:64] FLAG: --vmodule="" 
May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.119023 2555 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.119026 2555 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119162 2555 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119166 2555 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119170 2555 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119173 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119176 2555 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119179 2555 feature_gate.go:328] unrecognized feature gate: ExternalOIDC May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119182 2555 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119186 2555 feature_gate.go:328] unrecognized feature gate: SignatureStores May 11 20:50:55.124142 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119189 2555 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119191 2555 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119194 2555 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119197 2555 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119200 2555 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119203 2555 feature_gate.go:328] unrecognized feature gate: InsightsConfig May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119206 2555 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119209 2555 feature_gate.go:328] unrecognized feature gate: NewOLM May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119213 2555 feature_gate.go:328] unrecognized feature gate: OVNObservability May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119216 2555 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119219 2555 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119221 2555 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119224 2555 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119227 2555 feature_gate.go:328] unrecognized feature gate: UpgradeStatus May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119229 2555 feature_gate.go:328] unrecognized feature gate: Example2 May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119232 2555 
feature_gate.go:328] unrecognized feature gate: AlibabaPlatform May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119235 2555 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119237 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119240 2555 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119243 2555 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv May 11 20:50:55.124673 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119245 2555 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall May 11 20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119248 2555 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA May 11 20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119250 2555 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider May 11 20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119254 2555 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles May 11 20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119256 2555 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification May 11 20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119259 2555 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig May 11 20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119266 2555 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup May 11 20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119269 2555 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations May 11 
20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119272 2555 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace May 11 20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119274 2555 feature_gate.go:328] unrecognized feature gate: DNSNameResolver May 11 20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119277 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImages May 11 20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119280 2555 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities May 11 20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119282 2555 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints May 11 20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119285 2555 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings May 11 20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119288 2555 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix May 11 20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119291 2555 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup May 11 20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119293 2555 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration May 11 20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119296 2555 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes May 11 20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119300 2555 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
May 11 20:50:55.125226 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119303 2555 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119306 2555 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119310 2555 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119313 2555 feature_gate.go:328] unrecognized feature gate: GatewayAPI May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119316 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119319 2555 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119322 2555 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119324 2555 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119327 2555 feature_gate.go:328] unrecognized feature gate: GatewayAPIController May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119330 2555 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119332 2555 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119335 2555 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119338 2555 feature_gate.go:328] unrecognized feature 
gate: DualReplica May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119340 2555 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119343 2555 feature_gate.go:328] unrecognized feature gate: ShortCertRotation May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119345 2555 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119348 2555 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119350 2555 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119353 2555 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119357 2555 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController May 11 20:50:55.125700 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119359 2555 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig May 11 20:50:55.126169 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119362 2555 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall May 11 20:50:55.126169 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119364 2555 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode May 11 20:50:55.126169 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119367 2555 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud May 11 20:50:55.126169 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119370 2555 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS 
May 11 20:50:55.126169 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119373 2555 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController May 11 20:50:55.126169 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119376 2555 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup May 11 20:50:55.126169 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119378 2555 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall May 11 20:50:55.126169 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119382 2555 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. May 11 20:50:55.126169 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119386 2555 feature_gate.go:328] unrecognized feature gate: Example May 11 20:50:55.126169 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119388 2555 feature_gate.go:328] unrecognized feature gate: PinnedImages May 11 20:50:55.126169 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119391 2555 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS May 11 20:50:55.126169 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119394 2555 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity May 11 20:50:55.126169 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119397 2555 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation May 11 20:50:55.126169 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119399 2555 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts May 11 20:50:55.126169 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119417 2555 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall May 11 20:50:55.126169 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119420 2555 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts May 11 20:50:55.126169 ip-10-0-133-205 kubenswrapper[2555]: W0511 
20:50:55.119423 2555 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes May 11 20:50:55.126169 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.119425 2555 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode May 11 20:50:55.126648 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.119433 2555 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} May 11 20:50:55.127265 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.127246 2555 server.go:530] "Kubelet version" kubeletVersion="v1.33.10" May 11 20:50:55.127293 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.127266 2555 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 11 20:50:55.127336 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127327 2555 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv May 11 20:50:55.127336 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127335 2555 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127339 2555 feature_gate.go:328] unrecognized feature gate: InsightsConfig May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127342 2555 feature_gate.go:328] unrecognized feature gate: UpgradeStatus May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127345 2555 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127348 2555 
feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127351 2555 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127355 2555 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127358 2555 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127361 2555 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127364 2555 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127367 2555 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127371 2555 feature_gate.go:328] unrecognized feature gate: DualReplica May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127374 2555 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127376 2555 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127379 2555 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127382 2555 feature_gate.go:328] unrecognized feature gate: Example May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127385 2555 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig 
May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127388 2555 feature_gate.go:328] unrecognized feature gate: NewOLM May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127391 2555 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127394 2555 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy May 11 20:50:55.127390 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127397 2555 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127415 2555 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127418 2555 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127421 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127425 2555 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127428 2555 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127431 2555 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127433 2555 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127436 2555 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127439 2555 feature_gate.go:328] unrecognized feature 
gate: NewOLMCatalogdAPIV1Metas May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127441 2555 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127444 2555 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127447 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImages May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127450 2555 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127452 2555 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127455 2555 feature_gate.go:328] unrecognized feature gate: GatewayAPI May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127457 2555 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127460 2555 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127462 2555 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127465 2555 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup May 11 20:50:55.127897 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127468 2555 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127470 2555 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127474 
2555 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127478 2555 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127481 2555 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127484 2555 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127486 2555 feature_gate.go:328] unrecognized feature gate: Example2 May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127489 2555 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127491 2555 feature_gate.go:328] unrecognized feature gate: ExternalOIDC May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127494 2555 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127497 2555 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127499 2555 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127502 2555 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127506 2555 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127511 2555 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127514 2555 feature_gate.go:328] unrecognized feature gate: ShortCertRotation May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127517 2555 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127520 2555 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127522 2555 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota May 11 20:50:55.128355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127525 2555 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127527 2555 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127530 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127533 2555 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127535 2555 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127538 2555 feature_gate.go:328] unrecognized feature gate: SignatureStores May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127541 2555 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127543 2555 feature_gate.go:328] 
unrecognized feature gate: SetEIPForNLBIngressController May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127546 2555 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127548 2555 feature_gate.go:328] unrecognized feature gate: DNSNameResolver May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127551 2555 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127554 2555 feature_gate.go:328] unrecognized feature gate: GatewayAPIController May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127556 2555 feature_gate.go:328] unrecognized feature gate: PinnedImages May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127559 2555 feature_gate.go:328] unrecognized feature gate: OVNObservability May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127561 2555 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127564 2555 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127567 2555 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127569 2555 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127572 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere May 11 20:50:55.128879 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127575 2555 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration May 11 20:50:55.129355 ip-10-0-133-205 
kubenswrapper[2555]: W0511 20:50:55.127577 2555 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode May 11 20:50:55.129355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127580 2555 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks May 11 20:50:55.129355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127583 2555 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter May 11 20:50:55.129355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127585 2555 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode May 11 20:50:55.129355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127588 2555 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements May 11 20:50:55.129355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127591 2555 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS May 11 20:50:55.129355 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.127596 2555 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} May 11 20:50:55.129355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127710 2555 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity May 11 20:50:55.129355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127715 2555 feature_gate.go:328] unrecognized feature gate: GatewayAPIController May 11 20:50:55.129355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127719 2555 feature_gate.go:328] unrecognized feature gate: 
MixedCPUsAllocation May 11 20:50:55.129355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127722 2555 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS May 11 20:50:55.129355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127725 2555 feature_gate.go:328] unrecognized feature gate: ExternalOIDC May 11 20:50:55.129355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127728 2555 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts May 11 20:50:55.129355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127743 2555 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider May 11 20:50:55.129355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127746 2555 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS May 11 20:50:55.129355 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127749 2555 feature_gate.go:328] unrecognized feature gate: NewOLM May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127752 2555 feature_gate.go:328] unrecognized feature gate: DualReplica May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127754 2555 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127757 2555 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127761 2555 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127764 2555 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127767 2555 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127769 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImages May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127772 2555 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127774 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127777 2555 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127779 2555 feature_gate.go:328] unrecognized feature gate: OVNObservability May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127782 2555 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127784 2555 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127787 2555 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127789 2555 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127792 2555 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 
20:50:55.127794 2555 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127797 2555 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts May 11 20:50:55.129751 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127800 2555 feature_gate.go:328] unrecognized feature gate: PinnedImages May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127802 2555 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127805 2555 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127807 2555 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127810 2555 feature_gate.go:328] unrecognized feature gate: UpgradeStatus May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127812 2555 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127815 2555 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127818 2555 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127820 2555 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127823 2555 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127826 2555 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI May 11 20:50:55.130194 
ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127829 2555 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127831 2555 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127834 2555 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127836 2555 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127839 2555 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127841 2555 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127844 2555 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127846 2555 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127849 2555 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall May 11 20:50:55.130194 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127851 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127855 2555 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127858 2555 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127861 2555 feature_gate.go:328] unrecognized feature gate: Example May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127864 2555 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127867 2555 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127870 2555 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127872 2555 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127875 2555 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127878 2555 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127880 2555 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127883 2555 feature_gate.go:328] unrecognized feature gate: InsightsConfig May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127886 2555 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127888 2555 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127891 2555 
feature_gate.go:328] unrecognized feature gate: AlibabaPlatform May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127893 2555 feature_gate.go:328] unrecognized feature gate: ShortCertRotation May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127896 2555 feature_gate.go:328] unrecognized feature gate: GatewayAPI May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127899 2555 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127902 2555 feature_gate.go:328] unrecognized feature gate: Example2 May 11 20:50:55.130750 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127905 2555 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes May 11 20:50:55.131222 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127907 2555 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings May 11 20:50:55.131222 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127910 2555 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode May 11 20:50:55.131222 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127913 2555 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota May 11 20:50:55.131222 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127919 2555 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations May 11 20:50:55.131222 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127921 2555 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace May 11 20:50:55.131222 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127924 2555 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig May 11 20:50:55.131222 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127926 2555 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk May 11 20:50:55.131222 
ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127928 2555 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal May 11 20:50:55.131222 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127931 2555 feature_gate.go:328] unrecognized feature gate: DNSNameResolver May 11 20:50:55.131222 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127933 2555 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup May 11 20:50:55.131222 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127936 2555 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks May 11 20:50:55.131222 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127938 2555 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure May 11 20:50:55.131222 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127941 2555 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall May 11 20:50:55.131222 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127943 2555 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses May 11 20:50:55.131222 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127945 2555 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather May 11 20:50:55.131222 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127948 2555 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission May 11 20:50:55.131222 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127950 2555 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig May 11 20:50:55.131222 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127953 2555 feature_gate.go:328] unrecognized feature gate: SignatureStores May 11 20:50:55.131222 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:55.127955 2555 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController May 11 20:50:55.131693 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.127961 2555 feature_gate.go:384] feature 
gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} May 11 20:50:55.131693 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.128525 2555 server.go:962] "Client rotation is on, will bootstrap in background" May 11 20:50:55.131693 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.130520 2555 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" May 11 20:50:55.131693 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.131675 2555 server.go:1019] "Starting client certificate rotation" May 11 20:50:55.131828 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.131771 2555 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" May 11 20:50:55.132768 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.132757 2555 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" May 11 20:50:55.155990 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.155970 2555 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" May 11 20:50:55.158841 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.158816 2555 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" May 11 20:50:55.177656 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.177635 2555 log.go:25] "Validated CRI v1 runtime API" May 11 20:50:55.183952 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.183935 2555 
log.go:25] "Validated CRI v1 image API" May 11 20:50:55.185212 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.185195 2555 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 11 20:50:55.187646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.187631 2555 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" May 11 20:50:55.189556 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.189538 2555 fs.go:135] Filesystem UUIDs: map[43cf3581-7124-4dd9-9767-846bb0b76757:/dev/nvme0n1p4 556b85ab-5778-4fe6-8f3b-7bf01a434fac:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] May 11 20:50:55.189604 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.189556 2555 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] May 11 20:50:55.195136 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.195033 2555 manager.go:217] Machine: {Timestamp:2026-05-11 20:50:55.193175784 +0000 UTC m=+0.405118935 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3153888 MemoryCapacity:32812163072 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a47e01e71ea4511a6efe64daa9fd5 SystemUUID:ec2a47e0-1e71-ea45-11a6-efe64daa9fd5 BootID:64c3e2ea-9b8f-4957-b76d-93867843e9aa Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406081536 Type:vfs 
Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:20:72:b5:22:af Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:20:72:b5:22:af Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ae:43:4c:8c:11:c3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812163072 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} May 11 20:50:55.195136 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.195132 2555 
manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
May 11 20:50:55.195252 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.195241 2555 manager.go:233] Version: {KernelVersion:5.14.0-570.112.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260504-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
May 11 20:50:55.196298 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.196273 2555 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 11 20:50:55.196448 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.196299 2555 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-205.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
May 11 20:50:55.196499 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.196459 2555 topology_manager.go:138] "Creating topology manager with none policy"
May 11 20:50:55.196499 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.196466 2555 container_manager_linux.go:306] "Creating device plugin manager"
May 11 20:50:55.196499 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.196483 2555 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
May 11 20:50:55.198217 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.198207 2555 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
May 11 20:50:55.198961 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.198950 2555 state_mem.go:36] "Initialized new in-memory state store"
May 11 20:50:55.199208 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.199199 2555 server.go:1267] "Using root directory" path="/var/lib/kubelet"
May 11 20:50:55.202275 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.202266 2555 kubelet.go:491] "Attempting to sync node with API server"
May 11 20:50:55.202310 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.202280 2555 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
May 11 20:50:55.202310 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.202295 2555 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
May 11 20:50:55.202310 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.202304 2555 kubelet.go:397] "Adding apiserver pod source"
May 11 20:50:55.202383 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.202312 2555 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
May 11 20:50:55.203507 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.203497 2555 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
May 11 20:50:55.203545 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.203515 2555 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
May 11 20:50:55.205916 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.205895 2555 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-v77qd"
May 11 20:50:55.206603 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.206590 2555 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.11-2.rhaos4.20.gitb2a8320.el9" apiVersion="v1"
May 11 20:50:55.207908 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.207895 2555 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
May 11 20:50:55.209730 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.209718 2555 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
May 11 20:50:55.209777 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.209741 2555 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
May 11 20:50:55.209777 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.209748 2555 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
May 11 20:50:55.209777 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.209753 2555 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
May 11 20:50:55.209777 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.209759 2555 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
May 11 20:50:55.209777 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.209765 2555 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
May 11 20:50:55.209777 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.209770 2555 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
May 11 20:50:55.209777 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.209776 2555 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
May 11 20:50:55.209948 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.209783 2555 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
May 11 20:50:55.209948 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.209789 2555 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
May 11 20:50:55.209948 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.209803 2555 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
May 11 20:50:55.209948 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.209812 2555 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
May 11 20:50:55.211399 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.211386 2555 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
May 11 20:50:55.211399 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.211414 2555 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
May 11 20:50:55.211755 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.211740 2555 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-v77qd"
May 11 20:50:55.212920 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:55.212868 2555 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
May 11 20:50:55.213212 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:55.213182 2555 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-205.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
May 11 20:50:55.215296 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.215284 2555 watchdog_linux.go:99] "Systemd watchdog is not enabled"
May 11 20:50:55.215364 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.215325 2555 server.go:1295] "Started kubelet"
May 11 20:50:55.215497 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.215441 2555 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
May 11 20:50:55.215534 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.215454 2555 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
May 11 20:50:55.215568 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.215559 2555 server_v1.go:47] "podresources" method="list" useActivePods=true
May 11 20:50:55.216253 ip-10-0-133-205 systemd[1]: Started Kubernetes Kubelet.
May 11 20:50:55.216653 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.216448 2555 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 11 20:50:55.222550 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.222522 2555 server.go:317] "Adding debug handlers to kubelet server"
May 11 20:50:55.227116 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.227090 2555 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
May 11 20:50:55.227630 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.227576 2555 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
May 11 20:50:55.228317 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.228302 2555 volume_manager.go:295] "The desired_state_of_world populator starts"
May 11 20:50:55.228317 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.228320 2555 volume_manager.go:297] "Starting Kubelet Volume Manager"
May 11 20:50:55.228494 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.228391 2555 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
May 11 20:50:55.228494 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.228449 2555 reconstruct.go:97] "Volume reconstruction finished"
May 11 20:50:55.228494 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.228454 2555 reconciler.go:26] "Reconciler: start to sync state"
May 11 20:50:55.228690 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:55.228671 2555 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-205.ec2.internal\" not found"
May 11 20:50:55.228984 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:55.228851 2555 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
May 11 20:50:55.229046 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.228981 2555 factory.go:55] Registering systemd factory
May 11 20:50:55.229046 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.229002 2555 factory.go:223] Registration of the systemd container factory successfully
May 11 20:50:55.229291 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.229272 2555 factory.go:153] Registering CRI-O factory
May 11 20:50:55.229345 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.229293 2555 factory.go:223] Registration of the crio container factory successfully
May 11 20:50:55.229391 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.229382 2555 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
May 11 20:50:55.229468 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.229427 2555 factory.go:103] Registering Raw factory
May 11 20:50:55.229468 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.229445 2555 manager.go:1196] Started watching for new ooms in manager
May 11 20:50:55.229818 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.229802 2555 manager.go:319] Starting recovery of all containers
May 11 20:50:55.230979 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.230964 2555 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
May 11 20:50:55.232255 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:55.232232 2555 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-205.ec2.internal\" not found" node="ip-10-0-133-205.ec2.internal"
May 11 20:50:55.232336 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.232236 2555 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-205.ec2.internal" not found
May 11 20:50:55.239896 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.239886 2555 manager.go:324] Recovery completed
May 11 20:50:55.243866 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.243851 2555 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 11 20:50:55.247789 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.247774 2555 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-205.ec2.internal" not found
May 11 20:50:55.248203 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.248190 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeHasSufficientMemory"
May 11 20:50:55.248262 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.248217 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeHasNoDiskPressure"
May 11 20:50:55.248262 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.248227 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeHasSufficientPID"
May 11 20:50:55.248744 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.248730 2555 cpu_manager.go:222] "Starting CPU manager" policy="none"
May 11 20:50:55.248744 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.248741 2555 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
May 11 20:50:55.248839 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.248756 2555 state_mem.go:36] "Initialized new in-memory state store"
May 11 20:50:55.251014 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.251002 2555 policy_none.go:49] "None policy: Start"
May 11 20:50:55.251052 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.251018 2555 memory_manager.go:186] "Starting memorymanager" policy="None"
May 11 20:50:55.251052 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.251028 2555 state_mem.go:35] "Initializing new in-memory state store"
May 11 20:50:55.297002 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.296984 2555 manager.go:341] "Starting Device Plugin manager"
May 11 20:50:55.298361 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:55.297054 2555 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
May 11 20:50:55.298361 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.297067 2555 server.go:85] "Starting device plugin registration server"
May 11 20:50:55.298361 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.297285 2555 eviction_manager.go:189] "Eviction manager: starting control loop"
May 11 20:50:55.298361 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.297294 2555 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 11 20:50:55.298361 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.297440 2555 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
May 11 20:50:55.298361 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.297508 2555 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
May 11 20:50:55.298361 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.297517 2555 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
May 11 20:50:55.298361 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:55.297971 2555 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
May 11 20:50:55.298361 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:55.298004 2555 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-205.ec2.internal\" not found"
May 11 20:50:55.305513 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.305499 2555 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-205.ec2.internal" not found
May 11 20:50:55.353272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.353249 2555 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
May 11 20:50:55.354460 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.354447 2555 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
May 11 20:50:55.354527 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.354470 2555 status_manager.go:230] "Starting to sync pod status with apiserver"
May 11 20:50:55.354527 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.354489 2555 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
May 11 20:50:55.354527 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.354498 2555 kubelet.go:2451] "Starting kubelet main sync loop"
May 11 20:50:55.354656 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:55.354540 2555 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
May 11 20:50:55.358183 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.358133 2555 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
May 11 20:50:55.398009 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.397982 2555 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 11 20:50:55.398863 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.398849 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeHasSufficientMemory"
May 11 20:50:55.398927 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.398878 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeHasNoDiskPressure"
May 11 20:50:55.398927 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.398891 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeHasSufficientPID"
May 11 20:50:55.398927 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.398916 2555 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-205.ec2.internal"
May 11 20:50:55.408440 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.408420 2555 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-205.ec2.internal"
May 11 20:50:55.408489 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:55.408441 2555 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-205.ec2.internal\": node \"ip-10-0-133-205.ec2.internal\" not found"
May 11 20:50:55.425219 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:55.425196 2555 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-205.ec2.internal\" not found"
May 11 20:50:55.455548 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.455489 2555 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-205.ec2.internal"]
May 11 20:50:55.455649 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.455596 2555 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 11 20:50:55.456745 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.456732 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeHasSufficientMemory"
May 11 20:50:55.456821 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.456757 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeHasNoDiskPressure"
May 11 20:50:55.456821 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.456766 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeHasSufficientPID"
May 11 20:50:55.458036 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.458025 2555 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 11 20:50:55.458175 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.458162 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal"
May 11 20:50:55.458218 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.458191 2555 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 11 20:50:55.458809 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.458794 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeHasSufficientMemory"
May 11 20:50:55.458908 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.458818 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeHasNoDiskPressure"
May 11 20:50:55.458908 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.458798 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeHasSufficientMemory"
May 11 20:50:55.458908 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.458847 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeHasNoDiskPressure"
May 11 20:50:55.458908 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.458863 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeHasSufficientPID"
May 11 20:50:55.458908 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.458826 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeHasSufficientPID"
May 11 20:50:55.459946 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.459932 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-205.ec2.internal"
May 11 20:50:55.460003 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.459958 2555 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 11 20:50:55.460616 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.460594 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeHasSufficientMemory"
May 11 20:50:55.460616 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.460617 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeHasNoDiskPressure"
May 11 20:50:55.460756 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.460626 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeHasSufficientPID"
May 11 20:50:55.485067 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:55.485044 2555 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-205.ec2.internal\" not found" node="ip-10-0-133-205.ec2.internal"
May 11 20:50:55.489434 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:55.489419 2555 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-205.ec2.internal\" not found" node="ip-10-0-133-205.ec2.internal"
May 11 20:50:55.525651 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:55.525633 2555 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-205.ec2.internal\" not found"
May 11 20:50:55.529916 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.529901 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4b279b287d0ec6e644b6187c68744f9b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal\" (UID: \"4b279b287d0ec6e644b6187c68744f9b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal"
May 11 20:50:55.529983 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.529940 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b279b287d0ec6e644b6187c68744f9b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal\" (UID: \"4b279b287d0ec6e644b6187c68744f9b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal"
May 11 20:50:55.529983 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.529956 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/09c2e9e5d3d1eb649c292303ae36692a-config\") pod \"kube-apiserver-proxy-ip-10-0-133-205.ec2.internal\" (UID: \"09c2e9e5d3d1eb649c292303ae36692a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-205.ec2.internal"
May 11 20:50:55.626569 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:55.626477 2555 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-205.ec2.internal\" not found"
May 11 20:50:55.630843 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.630825 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4b279b287d0ec6e644b6187c68744f9b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal\" (UID: \"4b279b287d0ec6e644b6187c68744f9b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal"
May 11 20:50:55.630892 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.630862 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b279b287d0ec6e644b6187c68744f9b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal\" (UID: \"4b279b287d0ec6e644b6187c68744f9b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal"
May 11 20:50:55.630892 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.630881 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/09c2e9e5d3d1eb649c292303ae36692a-config\") pod \"kube-apiserver-proxy-ip-10-0-133-205.ec2.internal\" (UID: \"09c2e9e5d3d1eb649c292303ae36692a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-205.ec2.internal"
May 11 20:50:55.630960 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.630904 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/09c2e9e5d3d1eb649c292303ae36692a-config\") pod \"kube-apiserver-proxy-ip-10-0-133-205.ec2.internal\" (UID: \"09c2e9e5d3d1eb649c292303ae36692a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-205.ec2.internal"
May 11 20:50:55.630960 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.630830 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4b279b287d0ec6e644b6187c68744f9b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal\" (UID: \"4b279b287d0ec6e644b6187c68744f9b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal"
May 11 20:50:55.630960 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.630952 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b279b287d0ec6e644b6187c68744f9b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal\" (UID: \"4b279b287d0ec6e644b6187c68744f9b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal"
May 11 20:50:55.727288 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:55.727243 2555 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-205.ec2.internal\" not found"
May 11 20:50:55.787725 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.787695 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal"
May 11 20:50:55.791308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:55.791287 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-205.ec2.internal"
May 11 20:50:55.827882 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:55.827851 2555 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-205.ec2.internal\" not found"
May 11 20:50:55.928348 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:55.928263 2555 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-205.ec2.internal\" not found"
May 11 20:50:56.028764 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:56.028725 2555 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-205.ec2.internal\" not found"
May 11 20:50:56.129333 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:56.129303 2555 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-205.ec2.internal\" not found"
May 11 20:50:56.131471 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:56.131453 2555 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
May 11 20:50:56.131619 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:56.131604 2555 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
May 11 20:50:56.131662 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:56.131643 2555 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
May 11 20:50:56.214250 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:56.214200 2555 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-05-10 20:45:55 +0000 UTC" deadline="2027-11-28 20:04:34.497508306 +0000 UTC"
May 11 20:50:56.214250 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:56.214241 2555 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13583h13m38.283270574s"
May 11 20:50:56.227392 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:56.227362 2555 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
May 11 20:50:56.229992 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:56.229971 2555 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-205.ec2.internal\" not found"
May 11 20:50:56.249980 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:56.249959 2555 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
May 11 20:50:56.283455 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:56.283399 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09c2e9e5d3d1eb649c292303ae36692a.slice/crio-c485faad4356e1e38f87cc05d669f34f26383aa89cdb95b9267d2ee4852f1837 WatchSource:0}: Error finding container c485faad4356e1e38f87cc05d669f34f26383aa89cdb95b9267d2ee4852f1837: Status 404 returned error can't find the container with id c485faad4356e1e38f87cc05d669f34f26383aa89cdb95b9267d2ee4852f1837
May 11 20:50:56.283674 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:56.283655 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b279b287d0ec6e644b6187c68744f9b.slice/crio-951c8defab8210323860ebed7632ec73335c391170ab2b140ec40f9c88fba459 WatchSource:0}: Error finding container 951c8defab8210323860ebed7632ec73335c391170ab2b140ec40f9c88fba459: Status 404 returned error can't find the container with id 951c8defab8210323860ebed7632ec73335c391170ab2b140ec40f9c88fba459
May 11 20:50:56.288439 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:56.288420 2555 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
May 11 20:50:56.291329 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:56.291312 2555 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-m7knz"
May 11 20:50:56.297729 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:56.297711 2555 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-m7knz"
May 11 20:50:56.330877 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:56.330843 2555 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-205.ec2.internal\" not found"
May 11 20:50:56.358101 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:56.358026 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal" event={"ID":"4b279b287d0ec6e644b6187c68744f9b","Type":"ContainerStarted","Data":"951c8defab8210323860ebed7632ec73335c391170ab2b140ec40f9c88fba459"}
May 11 20:50:56.359011 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:56.358990 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-205.ec2.internal" event={"ID":"09c2e9e5d3d1eb649c292303ae36692a","Type":"ContainerStarted","Data":"c485faad4356e1e38f87cc05d669f34f26383aa89cdb95b9267d2ee4852f1837"}
May 11 20:50:56.431423 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:56.431367 2555 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-205.ec2.internal\" not found"
May 11 20:50:56.531925 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:56.531850 2555 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-205.ec2.internal\" not found"
May 11 20:50:56.543588 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:56.543554 2555 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
May 11 20:50:56.628768 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:56.628727 2555 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-205.ec2.internal"
May 11 20:50:56.636236 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:56.636218 2555 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
May 11 20:50:56.638208 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:56.638195 2555 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal"
May 11 20:50:56.648737 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:56.648721 2555 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
May 11 20:50:56.723022 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:56.722994 2555 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
May 11 20:50:57.118786 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.118752 2555 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
May 11 20:50:57.151578 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.151531 2555 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
May 11 20:50:57.203909 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.203877 2555 apiserver.go:52] "Watching apiserver"
May 11 20:50:57.212460 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.212438 2555 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
May 11 20:50:57.213472 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.213442 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-qvlmw","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal","openshift-multus/multus-additional-cni-plugins-mkbtt","openshift-multus/network-metrics-daemon-fq6hx","openshift-network-diagnostics/network-check-target-6z6rl","openshift-ovn-kubernetes/ovnkube-node-gkzk7","kube-system/konnectivity-agent-5szqz","openshift-cluster-node-tuning-operator/tuned-5nvxl","openshift-dns/node-resolver-sgkpq","openshift-multus/multus-4hgt9","openshift-network-operator/iptables-alerter-gsb8f","kube-system/kube-apiserver-proxy-ip-10-0-133-205.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz"]
May 11 20:50:57.214839 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.214813 2555 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/konnectivity-agent-5szqz" May 11 20:50:57.215855 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.215833 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mkbtt" May 11 20:50:57.216896 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.216877 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx" May 11 20:50:57.217579 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:57.217381 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fq6hx" podUID="692ffb95-b8bb-4e21-9e37-a9bad55c11be" May 11 20:50:57.217672 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.217603 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" May 11 20:50:57.217672 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.217610 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9lrst\"" May 11 20:50:57.217761 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.217677 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" May 11 20:50:57.218274 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.218250 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-2hqlf\"" May 11 20:50:57.218377 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.218288 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6z6rl" May 11 20:50:57.218377 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:57.218350 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6z6rl" podUID="ff9e0b72-a1d2-4476-a4d4-db6a3425a266" May 11 20:50:57.218548 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.218499 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" May 11 20:50:57.218654 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.218637 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" May 11 20:50:57.218709 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.218672 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" May 11 20:50:57.218776 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.218757 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" May 11 20:50:57.219008 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.218993 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" May 11 20:50:57.220815 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.220794 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.222014 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.221969 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qvlmw" May 11 20:50:57.222991 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.222970 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" May 11 20:50:57.223143 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.223060 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2dzzq\"" May 11 20:50:57.223418 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.223384 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.223509 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.223441 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" May 11 20:50:57.223650 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.223636 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" May 11 20:50:57.224078 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.224056 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" May 11 20:50:57.224164 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.224107 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" May 11 20:50:57.224221 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.224056 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" May 11 20:50:57.224320 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.224301 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" May 11 20:50:57.224669 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.224652 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wr8jp\"" May 11 20:50:57.224755 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.224667 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" May 11 20:50:57.224755 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.224696 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" May 11 20:50:57.225071 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.225054 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sgkpq" May 11 20:50:57.227029 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.226325 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-4hgt9" May 11 20:50:57.227029 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.226705 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" May 11 20:50:57.227029 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.226910 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qwj2j\"" May 11 20:50:57.227029 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.226912 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" May 11 20:50:57.227593 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.227573 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" May 11 20:50:57.227698 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.227682 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-gsb8f" May 11 20:50:57.228048 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.228032 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" May 11 20:50:57.228243 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.228229 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qvmvl\"" May 11 20:50:57.228687 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.228673 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" May 11 20:50:57.229068 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.228931 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-pvrw2\"" May 11 20:50:57.229068 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.229029 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz" May 11 20:50:57.230126 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.230110 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" May 11 20:50:57.230203 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.230128 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" May 11 20:50:57.230203 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.230143 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kwdtn\"" May 11 20:50:57.230353 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.230339 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" May 11 20:50:57.231117 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.231098 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" May 11 20:50:57.231536 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.231522 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" May 11 20:50:57.231628 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.231540 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" May 11 20:50:57.231628 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.231594 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qjbs4\"" May 11 20:50:57.238800 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.238775 
2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-systemd\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.238904 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.238812 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-os-release\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt" May 11 20:50:57.238904 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.238849 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt" May 11 20:50:57.238904 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.238872 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5a8d1588-0bb4-436d-88d7-4920b143287d-ovnkube-config\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.239063 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.238924 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-run-netns\") pod \"ovnkube-node-gkzk7\" (UID: 
\"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.239063 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.238955 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-log-socket\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.239063 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239024 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-cni-bin\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.239063 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239054 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-cni-binary-copy\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9" May 11 20:50:57.239252 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239076 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-multus-socket-dir-parent\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9" May 11 20:50:57.239252 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239100 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-run-ovn\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.239252 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239124 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/38fd286b-fbc6-4171-8c30-db06a4c25fe9-tmp-dir\") pod \"node-resolver-sgkpq\" (UID: \"38fd286b-fbc6-4171-8c30-db06a4c25fe9\") " pod="openshift-dns/node-resolver-sgkpq" May 11 20:50:57.239252 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239145 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-host-var-lib-cni-bin\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9" May 11 20:50:57.239252 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239175 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvfg7\" (UniqueName: \"kubernetes.io/projected/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-kube-api-access-xvfg7\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt" May 11 20:50:57.239252 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239203 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs\") pod \"network-metrics-daemon-fq6hx\" (UID: \"692ffb95-b8bb-4e21-9e37-a9bad55c11be\") " pod="openshift-multus/network-metrics-daemon-fq6hx" May 11 20:50:57.239252 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239223 2555 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-sysctl-conf\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.239598 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239251 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-var-lib-kubelet\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.239598 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239287 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5z59\" (UniqueName: \"kubernetes.io/projected/cbcc7ce7-89e0-427e-a68a-792f371dcc93-kube-api-access-b5z59\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.239598 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239309 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7d27adc4-933c-4d24-bd8a-bc40d4f26e8c-agent-certs\") pod \"konnectivity-agent-5szqz\" (UID: \"7d27adc4-933c-4d24-bd8a-bc40d4f26e8c\") " pod="kube-system/konnectivity-agent-5szqz" May 11 20:50:57.239598 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239327 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-multus-daemon-config\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " 
pod="openshift-multus/multus-4hgt9" May 11 20:50:57.239598 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239351 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-modprobe-d\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.239598 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239377 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-run\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.239598 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239396 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-host\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.239598 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239425 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5a8d1588-0bb4-436d-88d7-4920b143287d-ovn-node-metrics-cert\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.239598 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239447 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-systemd-units\") pod 
\"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.239598 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239470 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78krl\" (UniqueName: \"kubernetes.io/projected/38fd286b-fbc6-4171-8c30-db06a4c25fe9-kube-api-access-78krl\") pod \"node-resolver-sgkpq\" (UID: \"38fd286b-fbc6-4171-8c30-db06a4c25fe9\") " pod="openshift-dns/node-resolver-sgkpq" May 11 20:50:57.239598 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239496 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-etc-openvswitch\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.239598 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239519 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69h7j\" (UniqueName: \"kubernetes.io/projected/bfdfde03-4872-4d6b-a541-c904d768028c-kube-api-access-69h7j\") pod \"node-ca-qvlmw\" (UID: \"bfdfde03-4872-4d6b-a541-c904d768028c\") " pod="openshift-image-registry/node-ca-qvlmw" May 11 20:50:57.239598 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239543 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-system-cni-dir\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9" May 11 20:50:57.239598 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239577 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-multus-cni-dir\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9" May 11 20:50:57.240268 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239617 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cbcc7ce7-89e0-427e-a68a-792f371dcc93-tmp\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.240268 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239648 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b4925a5a-de00-49a9-8175-8f69c30f6825-kubelet-dir\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz" May 11 20:50:57.240268 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239721 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5a8d1588-0bb4-436d-88d7-4920b143287d-ovnkube-script-lib\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.240268 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239741 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/38fd286b-fbc6-4171-8c30-db06a4c25fe9-hosts-file\") pod \"node-resolver-sgkpq\" (UID: \"38fd286b-fbc6-4171-8c30-db06a4c25fe9\") " pod="openshift-dns/node-resolver-sgkpq" May 11 20:50:57.240268 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239781 2555 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-host-run-k8s-cni-cncf-io\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.240268 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239806 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf9t5\" (UniqueName: \"kubernetes.io/projected/5a8d1588-0bb4-436d-88d7-4920b143287d-kube-api-access-qf9t5\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.240268 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239828 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7khcd\" (UniqueName: \"kubernetes.io/projected/a81b49ee-ae3d-49d0-a312-73d9f1541c8d-kube-api-access-7khcd\") pod \"iptables-alerter-gsb8f\" (UID: \"a81b49ee-ae3d-49d0-a312-73d9f1541c8d\") " pod="openshift-network-operator/iptables-alerter-gsb8f"
May 11 20:50:57.240268 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239868 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rch4\" (UniqueName: \"kubernetes.io/projected/692ffb95-b8bb-4e21-9e37-a9bad55c11be-kube-api-access-8rch4\") pod \"network-metrics-daemon-fq6hx\" (UID: \"692ffb95-b8bb-4e21-9e37-a9bad55c11be\") " pod="openshift-multus/network-metrics-daemon-fq6hx"
May 11 20:50:57.240268 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239901 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-sysctl-d\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.240268 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239914 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjjb5\" (UniqueName: \"kubernetes.io/projected/b4925a5a-de00-49a9-8175-8f69c30f6825-kube-api-access-sjjb5\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz"
May 11 20:50:57.240268 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239930 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a81b49ee-ae3d-49d0-a312-73d9f1541c8d-iptables-alerter-script\") pod \"iptables-alerter-gsb8f\" (UID: \"a81b49ee-ae3d-49d0-a312-73d9f1541c8d\") " pod="openshift-network-operator/iptables-alerter-gsb8f"
May 11 20:50:57.240268 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239944 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-cni-binary-copy\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt"
May 11 20:50:57.240268 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.239973 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bfdfde03-4872-4d6b-a541-c904d768028c-serviceca\") pod \"node-ca-qvlmw\" (UID: \"bfdfde03-4872-4d6b-a541-c904d768028c\") " pod="openshift-image-registry/node-ca-qvlmw"
May 11 20:50:57.240268 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240013 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-sysconfig\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.240268 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240050 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.240268 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240086 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5a8d1588-0bb4-436d-88d7-4920b143287d-env-overrides\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.241080 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240113 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhs75\" (UniqueName: \"kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75\") pod \"network-check-target-6z6rl\" (UID: \"ff9e0b72-a1d2-4476-a4d4-db6a3425a266\") " pod="openshift-network-diagnostics/network-check-target-6z6rl"
May 11 20:50:57.241080 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240147 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-node-log\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.241080 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240174 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-etc-kubernetes\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.241080 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240201 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-system-cni-dir\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt"
May 11 20:50:57.241080 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240244 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-kubernetes\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.241080 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240273 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-lib-modules\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.241080 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240309 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7d27adc4-933c-4d24-bd8a-bc40d4f26e8c-konnectivity-ca\") pod \"konnectivity-agent-5szqz\" (UID: \"7d27adc4-933c-4d24-bd8a-bc40d4f26e8c\") " pod="kube-system/konnectivity-agent-5szqz"
May 11 20:50:57.241080 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240326 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-slash\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.241080 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240341 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-cni-netd\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.241080 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240361 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfdfde03-4872-4d6b-a541-c904d768028c-host\") pod \"node-ca-qvlmw\" (UID: \"bfdfde03-4872-4d6b-a541-c904d768028c\") " pod="openshift-image-registry/node-ca-qvlmw"
May 11 20:50:57.241080 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240383 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-os-release\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.241080 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240454 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt"
May 11 20:50:57.241080 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240524 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-sys\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.241080 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240555 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b4925a5a-de00-49a9-8175-8f69c30f6825-registration-dir\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz"
May 11 20:50:57.241080 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240600 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b4925a5a-de00-49a9-8175-8f69c30f6825-device-dir\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz"
May 11 20:50:57.241080 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240628 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-run-systemd\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.241804 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240667 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-var-lib-openvswitch\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.241804 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240695 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-run-openvswitch\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.241804 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240720 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-run-ovn-kubernetes\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.241804 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240790 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-host-var-lib-kubelet\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.241804 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240826 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-multus-conf-dir\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.241804 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240853 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-kubelet\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.241804 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240876 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-cnibin\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.241804 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240900 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-host-run-netns\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.241804 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240922 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-hostroot\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.241804 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240943 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-tuned\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.241804 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240964 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b4925a5a-de00-49a9-8175-8f69c30f6825-socket-dir\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz"
May 11 20:50:57.241804 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.240986 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b4925a5a-de00-49a9-8175-8f69c30f6825-etc-selinux\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz"
May 11 20:50:57.241804 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.241007 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b4925a5a-de00-49a9-8175-8f69c30f6825-sys-fs\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz"
May 11 20:50:57.241804 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.241032 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-host-var-lib-cni-multus\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.241804 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.241058 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a81b49ee-ae3d-49d0-a312-73d9f1541c8d-host-slash\") pod \"iptables-alerter-gsb8f\" (UID: \"a81b49ee-ae3d-49d0-a312-73d9f1541c8d\") " pod="openshift-network-operator/iptables-alerter-gsb8f"
May 11 20:50:57.241804 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.241084 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt"
May 11 20:50:57.242588 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.241108 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-host-run-multus-certs\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.242588 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.241142 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pn48\" (UniqueName: \"kubernetes.io/projected/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-kube-api-access-8pn48\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.242588 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.241171 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-cnibin\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt"
May 11 20:50:57.299055 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.299022 2555 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-05-10 20:45:56 +0000 UTC" deadline="2028-02-02 19:31:48.048075882 +0000 UTC"
May 11 20:50:57.299055 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.299048 2555 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15166h40m50.749029966s"
May 11 20:50:57.329329 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.329298 2555 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
May 11 20:50:57.341372 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341338 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a81b49ee-ae3d-49d0-a312-73d9f1541c8d-host-slash\") pod \"iptables-alerter-gsb8f\" (UID: \"a81b49ee-ae3d-49d0-a312-73d9f1541c8d\") " pod="openshift-network-operator/iptables-alerter-gsb8f"
May 11 20:50:57.341372 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341370 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt"
May 11 20:50:57.341638 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341395 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-host-run-multus-certs\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.341638 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341478 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a81b49ee-ae3d-49d0-a312-73d9f1541c8d-host-slash\") pod \"iptables-alerter-gsb8f\" (UID: \"a81b49ee-ae3d-49d0-a312-73d9f1541c8d\") " pod="openshift-network-operator/iptables-alerter-gsb8f"
May 11 20:50:57.341638 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341528 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pn48\" (UniqueName: \"kubernetes.io/projected/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-kube-api-access-8pn48\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.341638 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341560 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-cnibin\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt"
May 11 20:50:57.341638 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341562 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt"
May 11 20:50:57.341638 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341581 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-systemd\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.341638 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341596 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-host-run-multus-certs\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.341638 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341605 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-os-release\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt"
May 11 20:50:57.341638 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341620 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-cnibin\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt"
May 11 20:50:57.341638 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341620 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt"
May 11 20:50:57.342067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341652 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5a8d1588-0bb4-436d-88d7-4920b143287d-ovnkube-config\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.342067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341675 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-run-netns\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.342067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341697 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-log-socket\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.342067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341723 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-cni-bin\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.342067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341744 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-cni-binary-copy\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.342067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341759 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-multus-socket-dir-parent\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.342067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341773 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-run-ovn\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.342067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341787 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/38fd286b-fbc6-4171-8c30-db06a4c25fe9-tmp-dir\") pod \"node-resolver-sgkpq\" (UID: \"38fd286b-fbc6-4171-8c30-db06a4c25fe9\") " pod="openshift-dns/node-resolver-sgkpq"
May 11 20:50:57.342067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341803 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-host-var-lib-cni-bin\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.342067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341820 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvfg7\" (UniqueName: \"kubernetes.io/projected/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-kube-api-access-xvfg7\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt"
May 11 20:50:57.342067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341839 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs\") pod \"network-metrics-daemon-fq6hx\" (UID: \"692ffb95-b8bb-4e21-9e37-a9bad55c11be\") " pod="openshift-multus/network-metrics-daemon-fq6hx"
May 11 20:50:57.342067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341864 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-sysctl-conf\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.342067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341886 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-var-lib-kubelet\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.342067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341946 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b5z59\" (UniqueName: \"kubernetes.io/projected/cbcc7ce7-89e0-427e-a68a-792f371dcc93-kube-api-access-b5z59\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.342067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341963 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7d27adc4-933c-4d24-bd8a-bc40d4f26e8c-agent-certs\") pod \"konnectivity-agent-5szqz\" (UID: \"7d27adc4-933c-4d24-bd8a-bc40d4f26e8c\") " pod="kube-system/konnectivity-agent-5szqz"
May 11 20:50:57.342067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.341980 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-multus-daemon-config\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.342067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342005 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-modprobe-d\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.342901 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342026 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-run\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.342901 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342048 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-host\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.342901 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342065 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5a8d1588-0bb4-436d-88d7-4920b143287d-ovn-node-metrics-cert\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.342901 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342080 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-systemd-units\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.342901 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342135 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt"
May 11 20:50:57.342901 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342142 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-systemd\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.342901 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342197 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-systemd-units\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.342901 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342232 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78krl\" (UniqueName: \"kubernetes.io/projected/38fd286b-fbc6-4171-8c30-db06a4c25fe9-kube-api-access-78krl\") pod \"node-resolver-sgkpq\" (UID: \"38fd286b-fbc6-4171-8c30-db06a4c25fe9\") " pod="openshift-dns/node-resolver-sgkpq"
May 11 20:50:57.342901 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342244 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-os-release\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt"
May 11 20:50:57.342901 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342262 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-etc-openvswitch\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.342901 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342288 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69h7j\" (UniqueName: \"kubernetes.io/projected/bfdfde03-4872-4d6b-a541-c904d768028c-kube-api-access-69h7j\") pod \"node-ca-qvlmw\" (UID: \"bfdfde03-4872-4d6b-a541-c904d768028c\") " pod="openshift-image-registry/node-ca-qvlmw"
May 11 20:50:57.342901 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342315 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-system-cni-dir\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.342901 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342342 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-multus-cni-dir\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.342901 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342364 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cbcc7ce7-89e0-427e-a68a-792f371dcc93-tmp\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.342901 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:57.342372 2555 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
May 11 20:50:57.342901 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342390 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b4925a5a-de00-49a9-8175-8f69c30f6825-kubelet-dir\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz"
May 11 20:50:57.342901 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342448 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5a8d1588-0bb4-436d-88d7-4920b143287d-ovnkube-script-lib\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.343749 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:57.342494 2555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs podName:692ffb95-b8bb-4e21-9e37-a9bad55c11be nodeName:}" failed. No retries permitted until 2026-05-11 20:50:57.842453624 +0000 UTC m=+3.054396751 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs") pod "network-metrics-daemon-fq6hx" (UID: "692ffb95-b8bb-4e21-9e37-a9bad55c11be") : object "openshift-multus"/"metrics-daemon-secret" not registered
May 11 20:50:57.343749 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342516 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/38fd286b-fbc6-4171-8c30-db06a4c25fe9-hosts-file\") pod \"node-resolver-sgkpq\" (UID: \"38fd286b-fbc6-4171-8c30-db06a4c25fe9\") " pod="openshift-dns/node-resolver-sgkpq"
May 11 20:50:57.343749 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342534 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-host-run-k8s-cni-cncf-io\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.343749 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342565 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qf9t5\" (UniqueName: \"kubernetes.io/projected/5a8d1588-0bb4-436d-88d7-4920b143287d-kube-api-access-qf9t5\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.343749 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342592 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7khcd\" (UniqueName: \"kubernetes.io/projected/a81b49ee-ae3d-49d0-a312-73d9f1541c8d-kube-api-access-7khcd\") pod \"iptables-alerter-gsb8f\" (UID: \"a81b49ee-ae3d-49d0-a312-73d9f1541c8d\") " pod="openshift-network-operator/iptables-alerter-gsb8f"
May 11 20:50:57.343749 ip-10-0-133-205 kubenswrapper[2555]: I0511
20:50:57.342615 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rch4\" (UniqueName: \"kubernetes.io/projected/692ffb95-b8bb-4e21-9e37-a9bad55c11be-kube-api-access-8rch4\") pod \"network-metrics-daemon-fq6hx\" (UID: \"692ffb95-b8bb-4e21-9e37-a9bad55c11be\") " pod="openshift-multus/network-metrics-daemon-fq6hx" May 11 20:50:57.343749 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342637 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-sysctl-d\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.343749 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342664 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjjb5\" (UniqueName: \"kubernetes.io/projected/b4925a5a-de00-49a9-8175-8f69c30f6825-kube-api-access-sjjb5\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz" May 11 20:50:57.343749 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342690 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a81b49ee-ae3d-49d0-a312-73d9f1541c8d-iptables-alerter-script\") pod \"iptables-alerter-gsb8f\" (UID: \"a81b49ee-ae3d-49d0-a312-73d9f1541c8d\") " pod="openshift-network-operator/iptables-alerter-gsb8f" May 11 20:50:57.343749 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342708 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-cni-binary-copy\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: 
\"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt" May 11 20:50:57.343749 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342738 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bfdfde03-4872-4d6b-a541-c904d768028c-serviceca\") pod \"node-ca-qvlmw\" (UID: \"bfdfde03-4872-4d6b-a541-c904d768028c\") " pod="openshift-image-registry/node-ca-qvlmw" May 11 20:50:57.343749 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342757 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-sysconfig\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.343749 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342775 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.343749 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342792 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5a8d1588-0bb4-436d-88d7-4920b143287d-env-overrides\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.343749 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342809 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhs75\" (UniqueName: 
\"kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75\") pod \"network-check-target-6z6rl\" (UID: \"ff9e0b72-a1d2-4476-a4d4-db6a3425a266\") " pod="openshift-network-diagnostics/network-check-target-6z6rl" May 11 20:50:57.343749 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342826 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-node-log\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.344514 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342848 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-etc-kubernetes\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9" May 11 20:50:57.344514 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342864 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-system-cni-dir\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt" May 11 20:50:57.344514 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342880 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-kubernetes\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.344514 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342911 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-lib-modules\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.344514 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342931 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7d27adc4-933c-4d24-bd8a-bc40d4f26e8c-konnectivity-ca\") pod \"konnectivity-agent-5szqz\" (UID: \"7d27adc4-933c-4d24-bd8a-bc40d4f26e8c\") " pod="kube-system/konnectivity-agent-5szqz" May 11 20:50:57.344514 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342951 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-slash\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.344514 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342970 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-cni-netd\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.344514 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.342990 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfdfde03-4872-4d6b-a541-c904d768028c-host\") pod \"node-ca-qvlmw\" (UID: \"bfdfde03-4872-4d6b-a541-c904d768028c\") " pod="openshift-image-registry/node-ca-qvlmw" May 11 20:50:57.344514 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343015 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-os-release\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9" May 11 20:50:57.344514 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343033 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt" May 11 20:50:57.344514 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343050 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-sys\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.344514 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343047 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5a8d1588-0bb4-436d-88d7-4920b143287d-ovnkube-script-lib\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.344514 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343066 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b4925a5a-de00-49a9-8175-8f69c30f6825-registration-dir\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz" May 11 20:50:57.344514 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343093 2555 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b4925a5a-de00-49a9-8175-8f69c30f6825-device-dir\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz" May 11 20:50:57.344514 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343111 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-run-systemd\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.344514 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343110 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5a8d1588-0bb4-436d-88d7-4920b143287d-ovnkube-config\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.344514 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343130 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-var-lib-openvswitch\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.345295 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343165 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-var-lib-openvswitch\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.345295 ip-10-0-133-205 kubenswrapper[2555]: I0511 
20:50:57.343167 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-run-openvswitch\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.345295 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343198 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-run-ovn-kubernetes\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.345295 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343227 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-host-var-lib-kubelet\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9" May 11 20:50:57.345295 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343253 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-multus-conf-dir\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9" May 11 20:50:57.345295 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343281 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-sysctl-conf\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.345295 ip-10-0-133-205 kubenswrapper[2555]: 
I0511 20:50:57.343282 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-kubelet\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.345295 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343321 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-cnibin\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9" May 11 20:50:57.345295 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343322 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-run-netns\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.345295 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343356 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-cnibin\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9" May 11 20:50:57.345295 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343354 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-host-run-netns\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9" May 11 20:50:57.345295 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343382 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-hostroot\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9" May 11 20:50:57.345295 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343387 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-host-run-netns\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9" May 11 20:50:57.345295 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343400 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-tuned\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.345295 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343447 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b4925a5a-de00-49a9-8175-8f69c30f6825-socket-dir\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz" May 11 20:50:57.345295 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343470 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b4925a5a-de00-49a9-8175-8f69c30f6825-etc-selinux\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz" May 11 20:50:57.345295 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343470 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-log-socket\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.345295 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343490 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b4925a5a-de00-49a9-8175-8f69c30f6825-sys-fs\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz" May 11 20:50:57.346213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343495 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b4925a5a-de00-49a9-8175-8f69c30f6825-kubelet-dir\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz" May 11 20:50:57.346213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343515 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-cni-bin\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.346213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343518 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-host-var-lib-cni-multus\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9" May 11 20:50:57.346213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343569 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-host-var-lib-cni-multus\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9" May 11 20:50:57.346213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343543 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-kubelet\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.346213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343622 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-node-log\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.346213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343651 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-etc-kubernetes\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9" May 11 20:50:57.346213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343675 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-system-cni-dir\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt" May 11 20:50:57.346213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343716 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-kubernetes\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.346213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343707 2555 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" May 11 20:50:57.346213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343784 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-lib-modules\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.346213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343933 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5a8d1588-0bb4-436d-88d7-4920b143287d-env-overrides\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.346213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.343998 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-var-lib-kubelet\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" May 11 20:50:57.346213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344029 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-cni-binary-copy\") pod \"multus-4hgt9\" (UID: 
\"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9" May 11 20:50:57.346213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344216 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7d27adc4-933c-4d24-bd8a-bc40d4f26e8c-konnectivity-ca\") pod \"konnectivity-agent-5szqz\" (UID: \"7d27adc4-933c-4d24-bd8a-bc40d4f26e8c\") " pod="kube-system/konnectivity-agent-5szqz" May 11 20:50:57.346213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344256 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-slash\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.346213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344292 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-cni-netd\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" May 11 20:50:57.346213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344322 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfdfde03-4872-4d6b-a541-c904d768028c-host\") pod \"node-ca-qvlmw\" (UID: \"bfdfde03-4872-4d6b-a541-c904d768028c\") " pod="openshift-image-registry/node-ca-qvlmw" May 11 20:50:57.347272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344360 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-multus-socket-dir-parent\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " 
pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.347272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344366 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-os-release\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.347272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344454 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-run-ovn\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.347272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344602 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-host-var-lib-kubelet\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.347272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344678 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-run-ovn-kubernetes\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.347272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344733 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-etc-openvswitch\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.347272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344702 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-host-var-lib-cni-bin\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.347272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344759 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/38fd286b-fbc6-4171-8c30-db06a4c25fe9-tmp-dir\") pod \"node-resolver-sgkpq\" (UID: \"38fd286b-fbc6-4171-8c30-db06a4c25fe9\") " pod="openshift-dns/node-resolver-sgkpq"
May 11 20:50:57.347272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344780 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-run-openvswitch\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.347272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344800 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt"
May 11 20:50:57.347272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344822 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-multus-conf-dir\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.347272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344816 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-system-cni-dir\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.347272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344838 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-hostroot\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.347272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344873 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b4925a5a-de00-49a9-8175-8f69c30f6825-device-dir\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz"
May 11 20:50:57.347272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344889 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-sys\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.347272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344898 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-run-systemd\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.347272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344890 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b4925a5a-de00-49a9-8175-8f69c30f6825-registration-dir\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz"
May 11 20:50:57.347272 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344930 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-host\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.348095 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.344953 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-multus-cni-dir\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.348095 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.345129 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-modprobe-d\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.348095 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.345188 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-run\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.348095 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.345530 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a81b49ee-ae3d-49d0-a312-73d9f1541c8d-iptables-alerter-script\") pod \"iptables-alerter-gsb8f\" (UID: \"a81b49ee-ae3d-49d0-a312-73d9f1541c8d\") " pod="openshift-network-operator/iptables-alerter-gsb8f"
May 11 20:50:57.348095 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.345553 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-sysconfig\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.348095 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.345570 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-multus-daemon-config\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.348095 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.345607 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a8d1588-0bb4-436d-88d7-4920b143287d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.348095 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.345641 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/38fd286b-fbc6-4171-8c30-db06a4c25fe9-hosts-file\") pod \"node-resolver-sgkpq\" (UID: \"38fd286b-fbc6-4171-8c30-db06a4c25fe9\") " pod="openshift-dns/node-resolver-sgkpq"
May 11 20:50:57.348095 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.345652 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bfdfde03-4872-4d6b-a541-c904d768028c-serviceca\") pod \"node-ca-qvlmw\" (UID: \"bfdfde03-4872-4d6b-a541-c904d768028c\") " pod="openshift-image-registry/node-ca-qvlmw"
May 11 20:50:57.348095 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.345939 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-sysctl-d\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.348095 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.345971 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b4925a5a-de00-49a9-8175-8f69c30f6825-etc-selinux\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz"
May 11 20:50:57.348095 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.345996 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-host-run-k8s-cni-cncf-io\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.348095 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.346001 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-cni-binary-copy\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt"
May 11 20:50:57.348095 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.346052 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b4925a5a-de00-49a9-8175-8f69c30f6825-sys-fs\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz"
May 11 20:50:57.348095 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.346065 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b4925a5a-de00-49a9-8175-8f69c30f6825-socket-dir\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz"
May 11 20:50:57.348095 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.346959 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cbcc7ce7-89e0-427e-a68a-792f371dcc93-tmp\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.348095 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.347835 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5a8d1588-0bb4-436d-88d7-4920b143287d-ovn-node-metrics-cert\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.349072 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.348361 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7d27adc4-933c-4d24-bd8a-bc40d4f26e8c-agent-certs\") pod \"konnectivity-agent-5szqz\" (UID: \"7d27adc4-933c-4d24-bd8a-bc40d4f26e8c\") " pod="kube-system/konnectivity-agent-5szqz"
May 11 20:50:57.349982 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:57.349962 2555 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
May 11 20:50:57.350082 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:57.349985 2555 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
May 11 20:50:57.350082 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:57.349997 2555 projected.go:194] Error preparing data for projected volume kube-api-access-qhs75 for pod openshift-network-diagnostics/network-check-target-6z6rl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
May 11 20:50:57.350082 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:57.350065 2555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75 podName:ff9e0b72-a1d2-4476-a4d4-db6a3425a266 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:57.85004868 +0000 UTC m=+3.061991834 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qhs75" (UniqueName: "kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75") pod "network-check-target-6z6rl" (UID: "ff9e0b72-a1d2-4476-a4d4-db6a3425a266") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
May 11 20:50:57.350868 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.350805 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pn48\" (UniqueName: \"kubernetes.io/projected/93766b7c-6f9e-4bb2-a35e-9104fc3059f6-kube-api-access-8pn48\") pod \"multus-4hgt9\" (UID: \"93766b7c-6f9e-4bb2-a35e-9104fc3059f6\") " pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.351546 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.351511 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78krl\" (UniqueName: \"kubernetes.io/projected/38fd286b-fbc6-4171-8c30-db06a4c25fe9-kube-api-access-78krl\") pod \"node-resolver-sgkpq\" (UID: \"38fd286b-fbc6-4171-8c30-db06a4c25fe9\") " pod="openshift-dns/node-resolver-sgkpq"
May 11 20:50:57.351654 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.351548 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvfg7\" (UniqueName: \"kubernetes.io/projected/fa6aa941-a7a4-40a4-82c1-046fa5c671d1-kube-api-access-xvfg7\") pod \"multus-additional-cni-plugins-mkbtt\" (UID: \"fa6aa941-a7a4-40a4-82c1-046fa5c671d1\") " pod="openshift-multus/multus-additional-cni-plugins-mkbtt"
May 11 20:50:57.351747 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.351727 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69h7j\" (UniqueName: \"kubernetes.io/projected/bfdfde03-4872-4d6b-a541-c904d768028c-kube-api-access-69h7j\") pod \"node-ca-qvlmw\" (UID: \"bfdfde03-4872-4d6b-a541-c904d768028c\") " pod="openshift-image-registry/node-ca-qvlmw"
May 11 20:50:57.352377 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.352356 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5z59\" (UniqueName: \"kubernetes.io/projected/cbcc7ce7-89e0-427e-a68a-792f371dcc93-kube-api-access-b5z59\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.354371 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.354352 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7khcd\" (UniqueName: \"kubernetes.io/projected/a81b49ee-ae3d-49d0-a312-73d9f1541c8d-kube-api-access-7khcd\") pod \"iptables-alerter-gsb8f\" (UID: \"a81b49ee-ae3d-49d0-a312-73d9f1541c8d\") " pod="openshift-network-operator/iptables-alerter-gsb8f"
May 11 20:50:57.354534 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.354356 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf9t5\" (UniqueName: \"kubernetes.io/projected/5a8d1588-0bb4-436d-88d7-4920b143287d-kube-api-access-qf9t5\") pod \"ovnkube-node-gkzk7\" (UID: \"5a8d1588-0bb4-436d-88d7-4920b143287d\") " pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.356938 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.356918 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjjb5\" (UniqueName: \"kubernetes.io/projected/b4925a5a-de00-49a9-8175-8f69c30f6825-kube-api-access-sjjb5\") pod \"aws-ebs-csi-driver-node-b9qvz\" (UID: \"b4925a5a-de00-49a9-8175-8f69c30f6825\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz"
May 11 20:50:57.357381 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.357361 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rch4\" (UniqueName: \"kubernetes.io/projected/692ffb95-b8bb-4e21-9e37-a9bad55c11be-kube-api-access-8rch4\") pod \"network-metrics-daemon-fq6hx\" (UID: \"692ffb95-b8bb-4e21-9e37-a9bad55c11be\") " pod="openshift-multus/network-metrics-daemon-fq6hx"
May 11 20:50:57.357618 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.357599 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cbcc7ce7-89e0-427e-a68a-792f371dcc93-etc-tuned\") pod \"tuned-5nvxl\" (UID: \"cbcc7ce7-89e0-427e-a68a-792f371dcc93\") " pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.529536 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.529449 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-5szqz"
May 11 20:50:57.537258 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.537230 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qvlmw"
May 11 20:50:57.544889 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.544868 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mkbtt"
May 11 20:50:57.550206 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.550185 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:50:57.555813 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.555787 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5nvxl"
May 11 20:50:57.562259 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.562247 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sgkpq"
May 11 20:50:57.568063 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.567827 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4hgt9"
May 11 20:50:57.575439 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.575421 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gsb8f"
May 11 20:50:57.579940 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.579926 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz"
May 11 20:50:57.847122 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.847029 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs\") pod \"network-metrics-daemon-fq6hx\" (UID: \"692ffb95-b8bb-4e21-9e37-a9bad55c11be\") " pod="openshift-multus/network-metrics-daemon-fq6hx"
May 11 20:50:57.847267 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:57.847193 2555 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
May 11 20:50:57.847267 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:57.847260 2555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs podName:692ffb95-b8bb-4e21-9e37-a9bad55c11be nodeName:}" failed. No retries permitted until 2026-05-11 20:50:58.847240364 +0000 UTC m=+4.059183493 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs") pod "network-metrics-daemon-fq6hx" (UID: "692ffb95-b8bb-4e21-9e37-a9bad55c11be") : object "openshift-multus"/"metrics-daemon-secret" not registered
May 11 20:50:57.906772 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:57.906604 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a8d1588_0bb4_436d_88d7_4920b143287d.slice/crio-c9053913c12ed626df8594e80fbc1e3b17e1ac95d902641851a0a5f6d85f306d WatchSource:0}: Error finding container c9053913c12ed626df8594e80fbc1e3b17e1ac95d902641851a0a5f6d85f306d: Status 404 returned error can't find the container with id c9053913c12ed626df8594e80fbc1e3b17e1ac95d902641851a0a5f6d85f306d
May 11 20:50:57.907529 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:57.907361 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbcc7ce7_89e0_427e_a68a_792f371dcc93.slice/crio-31a01b5a854d885bc9dd305b5d1501c45b849265e36662de8311004e5e0d61e0 WatchSource:0}: Error finding container 31a01b5a854d885bc9dd305b5d1501c45b849265e36662de8311004e5e0d61e0: Status 404 returned error can't find the container with id 31a01b5a854d885bc9dd305b5d1501c45b849265e36662de8311004e5e0d61e0
May 11 20:50:57.909064 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:57.909040 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d27adc4_933c_4d24_bd8a_bc40d4f26e8c.slice/crio-2b06c54b4f5d6c9a5552eb946cda429c1787e4de361c67e5596e1c939c3da416 WatchSource:0}: Error finding container 2b06c54b4f5d6c9a5552eb946cda429c1787e4de361c67e5596e1c939c3da416: Status 404 returned error can't find the container with id 2b06c54b4f5d6c9a5552eb946cda429c1787e4de361c67e5596e1c939c3da416
May 11 20:50:57.913890 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:57.913755 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfdfde03_4872_4d6b_a541_c904d768028c.slice/crio-cd2d6685cb0c826568e903062ac29e4024f403d8510207167a29221f00a90761 WatchSource:0}: Error finding container cd2d6685cb0c826568e903062ac29e4024f403d8510207167a29221f00a90761: Status 404 returned error can't find the container with id cd2d6685cb0c826568e903062ac29e4024f403d8510207167a29221f00a90761
May 11 20:50:57.915073 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:57.914948 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa6aa941_a7a4_40a4_82c1_046fa5c671d1.slice/crio-cd9372a8c46a73df37bc277f5a9dd9faa98020e21abae6a564f0300f082cf81d WatchSource:0}: Error finding container cd9372a8c46a73df37bc277f5a9dd9faa98020e21abae6a564f0300f082cf81d: Status 404 returned error can't find the container with id cd9372a8c46a73df37bc277f5a9dd9faa98020e21abae6a564f0300f082cf81d
May 11 20:50:57.915397 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:57.915373 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38fd286b_fbc6_4171_8c30_db06a4c25fe9.slice/crio-de01463db13fd8a2bd9b01e24dc88ed05cd706d2b841f2d925e48fbe69a3b718 WatchSource:0}: Error finding container de01463db13fd8a2bd9b01e24dc88ed05cd706d2b841f2d925e48fbe69a3b718: Status 404 returned error can't find the container with id de01463db13fd8a2bd9b01e24dc88ed05cd706d2b841f2d925e48fbe69a3b718
May 11 20:50:57.918609 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:57.918582 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda81b49ee_ae3d_49d0_a312_73d9f1541c8d.slice/crio-8848cbc6660b42f5df5e020b6afc0d8733c1c9ccbeaa87ca6306e49535cce22e WatchSource:0}: Error finding container 8848cbc6660b42f5df5e020b6afc0d8733c1c9ccbeaa87ca6306e49535cce22e: Status 404 returned error can't find the container with id 8848cbc6660b42f5df5e020b6afc0d8733c1c9ccbeaa87ca6306e49535cce22e
May 11 20:50:57.919918 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:50:57.919894 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93766b7c_6f9e_4bb2_a35e_9104fc3059f6.slice/crio-050584fdf035b9762f67b14c24e4c3c5de00e858d113543da53a8f75280518e1 WatchSource:0}: Error finding container 050584fdf035b9762f67b14c24e4c3c5de00e858d113543da53a8f75280518e1: Status 404 returned error can't find the container with id 050584fdf035b9762f67b14c24e4c3c5de00e858d113543da53a8f75280518e1
May 11 20:50:57.947851 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:57.947825 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhs75\" (UniqueName: \"kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75\") pod \"network-check-target-6z6rl\" (UID: \"ff9e0b72-a1d2-4476-a4d4-db6a3425a266\") " pod="openshift-network-diagnostics/network-check-target-6z6rl"
May 11 20:50:57.947967 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:57.947958 2555 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
May 11 20:50:57.948030 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:57.947973 2555 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
May 11 20:50:57.948030 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:57.947981 2555 projected.go:194] Error preparing data for projected volume kube-api-access-qhs75 for pod openshift-network-diagnostics/network-check-target-6z6rl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
May 11 20:50:57.948030 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:57.948028 2555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75 podName:ff9e0b72-a1d2-4476-a4d4-db6a3425a266 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:58.948015339 +0000 UTC m=+4.159958466 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-qhs75" (UniqueName: "kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75") pod "network-check-target-6z6rl" (UID: "ff9e0b72-a1d2-4476-a4d4-db6a3425a266") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
May 11 20:50:58.300211 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:58.300104 2555 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-05-10 20:45:56 +0000 UTC" deadline="2027-11-30 18:12:58.612993181 +0000 UTC"
May 11 20:50:58.300211 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:58.300143 2555 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13629h22m0.312854013s"
May 11 20:50:58.355665 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:58.355117 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx"
May 11 20:50:58.355665 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:58.355265 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fq6hx" podUID="692ffb95-b8bb-4e21-9e37-a9bad55c11be"
May 11 20:50:58.366755 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:58.366712 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gsb8f" event={"ID":"a81b49ee-ae3d-49d0-a312-73d9f1541c8d","Type":"ContainerStarted","Data":"8848cbc6660b42f5df5e020b6afc0d8733c1c9ccbeaa87ca6306e49535cce22e"}
May 11 20:50:58.369932 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:58.369807 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mkbtt" event={"ID":"fa6aa941-a7a4-40a4-82c1-046fa5c671d1","Type":"ContainerStarted","Data":"cd9372a8c46a73df37bc277f5a9dd9faa98020e21abae6a564f0300f082cf81d"}
May 11 20:50:58.376468 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:58.376430 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" event={"ID":"cbcc7ce7-89e0-427e-a68a-792f371dcc93","Type":"ContainerStarted","Data":"31a01b5a854d885bc9dd305b5d1501c45b849265e36662de8311004e5e0d61e0"}
May 11 20:50:58.388985 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:58.388901 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-205.ec2.internal" event={"ID":"09c2e9e5d3d1eb649c292303ae36692a","Type":"ContainerStarted","Data":"1b6550a66b6ab85ee7143fd677efd0606818e1457b309cd7605b60b73566ed3d"}
May 11 20:50:58.393953 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:58.393896 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4hgt9" event={"ID":"93766b7c-6f9e-4bb2-a35e-9104fc3059f6","Type":"ContainerStarted","Data":"050584fdf035b9762f67b14c24e4c3c5de00e858d113543da53a8f75280518e1"}
May 11 20:50:58.398756 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:58.398729 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qvlmw" event={"ID":"bfdfde03-4872-4d6b-a541-c904d768028c","Type":"ContainerStarted","Data":"cd2d6685cb0c826568e903062ac29e4024f403d8510207167a29221f00a90761"}
May 11 20:50:58.400265 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:58.400241 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sgkpq" event={"ID":"38fd286b-fbc6-4171-8c30-db06a4c25fe9","Type":"ContainerStarted","Data":"de01463db13fd8a2bd9b01e24dc88ed05cd706d2b841f2d925e48fbe69a3b718"}
May 11 20:50:58.402551 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:58.402523 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz" event={"ID":"b4925a5a-de00-49a9-8175-8f69c30f6825","Type":"ContainerStarted","Data":"175e09203a0af3bba630ac49e592a856bac39cb816297b52b7d4d9a7f2e45ca4"}
May 11 20:50:58.405974 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:58.405952 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5szqz" event={"ID":"7d27adc4-933c-4d24-bd8a-bc40d4f26e8c","Type":"ContainerStarted","Data":"2b06c54b4f5d6c9a5552eb946cda429c1787e4de361c67e5596e1c939c3da416"}
May 11 20:50:58.408946 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:58.408925 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" event={"ID":"5a8d1588-0bb4-436d-88d7-4920b143287d","Type":"ContainerStarted","Data":"c9053913c12ed626df8594e80fbc1e3b17e1ac95d902641851a0a5f6d85f306d"}
May 11 20:50:58.413388 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:58.413366 2555 generic.go:358] "Generic (PLEG): container finished" podID="4b279b287d0ec6e644b6187c68744f9b" containerID="5ce04bec8d56f116c2d711b640df5f8303e57209811ac25ca9cd2bfe4349e3c0" exitCode=0
May 11 20:50:58.413498 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:58.413399 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal" event={"ID":"4b279b287d0ec6e644b6187c68744f9b","Type":"ContainerDied","Data":"5ce04bec8d56f116c2d711b640df5f8303e57209811ac25ca9cd2bfe4349e3c0"}
May 11 20:50:58.427911 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:58.427866 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-205.ec2.internal" podStartSLOduration=2.427848792 podStartE2EDuration="2.427848792s" podCreationTimestamp="2026-05-11 20:50:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:50:58.404561722 +0000 UTC m=+3.616504872" watchObservedRunningTime="2026-05-11 20:50:58.427848792 +0000 UTC m=+3.639791938"
May 11 20:50:58.855963 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:58.855872 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs\") pod \"network-metrics-daemon-fq6hx\" (UID: \"692ffb95-b8bb-4e21-9e37-a9bad55c11be\") " pod="openshift-multus/network-metrics-daemon-fq6hx"
May 11 20:50:58.856155 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:58.856063 2555 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
May 11 20:50:58.856155 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:58.856126 2555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs podName:692ffb95-b8bb-4e21-9e37-a9bad55c11be nodeName:}" failed. No retries permitted until 2026-05-11 20:51:00.856108511 +0000 UTC m=+6.068051643 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs") pod "network-metrics-daemon-fq6hx" (UID: "692ffb95-b8bb-4e21-9e37-a9bad55c11be") : object "openshift-multus"/"metrics-daemon-secret" not registered
May 11 20:50:58.957931 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:58.957218 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhs75\" (UniqueName: \"kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75\") pod \"network-check-target-6z6rl\" (UID: \"ff9e0b72-a1d2-4476-a4d4-db6a3425a266\") " pod="openshift-network-diagnostics/network-check-target-6z6rl"
May 11 20:50:58.957931 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:58.957398 2555 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
May 11 20:50:58.957931 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:58.957432 2555 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
May 11 20:50:58.957931 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:58.957445 2555 projected.go:194] Error preparing data for projected volume kube-api-access-qhs75 for pod openshift-network-diagnostics/network-check-target-6z6rl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
May 11 20:50:58.957931 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:58.957507 2555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75 podName:ff9e0b72-a1d2-4476-a4d4-db6a3425a266 nodeName:}" failed. No retries permitted until 2026-05-11 20:51:00.957487353 +0000 UTC m=+6.169430499 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-qhs75" (UniqueName: "kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75") pod "network-check-target-6z6rl" (UID: "ff9e0b72-a1d2-4476-a4d4-db6a3425a266") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
May 11 20:50:59.356038 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:59.355479 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6z6rl"
May 11 20:50:59.356038 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:50:59.355623 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6z6rl" podUID="ff9e0b72-a1d2-4476-a4d4-db6a3425a266"
May 11 20:50:59.456160 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:50:59.456128 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal" event={"ID":"4b279b287d0ec6e644b6187c68744f9b","Type":"ContainerStarted","Data":"4a7c7e8c78847c42c789e52407af6d7912a56bacd760211c67424a937d50610f"}
May 11 20:51:00.354999 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:00.354965 2555 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx" May 11 20:51:00.355189 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:00.355109 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fq6hx" podUID="692ffb95-b8bb-4e21-9e37-a9bad55c11be" May 11 20:51:00.874455 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:00.874395 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs\") pod \"network-metrics-daemon-fq6hx\" (UID: \"692ffb95-b8bb-4e21-9e37-a9bad55c11be\") " pod="openshift-multus/network-metrics-daemon-fq6hx" May 11 20:51:00.874848 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:00.874596 2555 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:51:00.874848 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:00.874668 2555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs podName:692ffb95-b8bb-4e21-9e37-a9bad55c11be nodeName:}" failed. No retries permitted until 2026-05-11 20:51:04.874647132 +0000 UTC m=+10.086590284 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs") pod "network-metrics-daemon-fq6hx" (UID: "692ffb95-b8bb-4e21-9e37-a9bad55c11be") : object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:51:00.975116 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:00.975085 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhs75\" (UniqueName: \"kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75\") pod \"network-check-target-6z6rl\" (UID: \"ff9e0b72-a1d2-4476-a4d4-db6a3425a266\") " pod="openshift-network-diagnostics/network-check-target-6z6rl" May 11 20:51:00.975290 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:00.975222 2555 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 11 20:51:00.975290 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:00.975271 2555 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 11 20:51:00.975290 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:00.975287 2555 projected.go:194] Error preparing data for projected volume kube-api-access-qhs75 for pod openshift-network-diagnostics/network-check-target-6z6rl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:51:00.975474 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:00.975377 2555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75 podName:ff9e0b72-a1d2-4476-a4d4-db6a3425a266 nodeName:}" failed. 
No retries permitted until 2026-05-11 20:51:04.975358002 +0000 UTC m=+10.187301151 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-qhs75" (UniqueName: "kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75") pod "network-check-target-6z6rl" (UID: "ff9e0b72-a1d2-4476-a4d4-db6a3425a266") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:51:01.358083 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:01.358003 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6z6rl" May 11 20:51:01.358247 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:01.358145 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6z6rl" podUID="ff9e0b72-a1d2-4476-a4d4-db6a3425a266" May 11 20:51:02.355334 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:02.355301 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx" May 11 20:51:02.355770 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:02.355460 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fq6hx" podUID="692ffb95-b8bb-4e21-9e37-a9bad55c11be" May 11 20:51:03.358439 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:03.358331 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6z6rl" May 11 20:51:03.358896 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:03.358494 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6z6rl" podUID="ff9e0b72-a1d2-4476-a4d4-db6a3425a266" May 11 20:51:04.355328 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:04.355289 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx" May 11 20:51:04.355534 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:04.355459 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fq6hx" podUID="692ffb95-b8bb-4e21-9e37-a9bad55c11be" May 11 20:51:04.907698 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:04.907657 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs\") pod \"network-metrics-daemon-fq6hx\" (UID: \"692ffb95-b8bb-4e21-9e37-a9bad55c11be\") " pod="openshift-multus/network-metrics-daemon-fq6hx" May 11 20:51:04.908074 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:04.907828 2555 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:51:04.908074 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:04.907889 2555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs podName:692ffb95-b8bb-4e21-9e37-a9bad55c11be nodeName:}" failed. No retries permitted until 2026-05-11 20:51:12.907869882 +0000 UTC m=+18.119813014 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs") pod "network-metrics-daemon-fq6hx" (UID: "692ffb95-b8bb-4e21-9e37-a9bad55c11be") : object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:51:05.008812 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:05.008763 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhs75\" (UniqueName: \"kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75\") pod \"network-check-target-6z6rl\" (UID: \"ff9e0b72-a1d2-4476-a4d4-db6a3425a266\") " pod="openshift-network-diagnostics/network-check-target-6z6rl" May 11 20:51:05.008991 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:05.008930 2555 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 11 20:51:05.008991 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:05.008950 2555 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 11 20:51:05.008991 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:05.008962 2555 projected.go:194] Error preparing data for projected volume kube-api-access-qhs75 for pod openshift-network-diagnostics/network-check-target-6z6rl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:51:05.009157 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:05.009025 2555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75 podName:ff9e0b72-a1d2-4476-a4d4-db6a3425a266 nodeName:}" failed. 
No retries permitted until 2026-05-11 20:51:13.009005909 +0000 UTC m=+18.220949039 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-qhs75" (UniqueName: "kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75") pod "network-check-target-6z6rl" (UID: "ff9e0b72-a1d2-4476-a4d4-db6a3425a266") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:51:05.357943 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:05.357869 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6z6rl" May 11 20:51:05.358094 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:05.357974 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6z6rl" podUID="ff9e0b72-a1d2-4476-a4d4-db6a3425a266" May 11 20:51:06.355649 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:06.355614 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx" May 11 20:51:06.356211 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:06.355763 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fq6hx" podUID="692ffb95-b8bb-4e21-9e37-a9bad55c11be" May 11 20:51:07.355292 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:07.355256 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6z6rl" May 11 20:51:07.355485 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:07.355388 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6z6rl" podUID="ff9e0b72-a1d2-4476-a4d4-db6a3425a266" May 11 20:51:08.355374 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:08.355287 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx" May 11 20:51:08.355788 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:08.355429 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fq6hx" podUID="692ffb95-b8bb-4e21-9e37-a9bad55c11be" May 11 20:51:09.355615 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:09.355581 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6z6rl" May 11 20:51:09.356031 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:09.355699 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6z6rl" podUID="ff9e0b72-a1d2-4476-a4d4-db6a3425a266" May 11 20:51:10.355120 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:10.355083 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx" May 11 20:51:10.355303 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:10.355203 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fq6hx" podUID="692ffb95-b8bb-4e21-9e37-a9bad55c11be" May 11 20:51:11.355244 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:11.355210 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6z6rl" May 11 20:51:11.355702 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:11.355332 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6z6rl" podUID="ff9e0b72-a1d2-4476-a4d4-db6a3425a266" May 11 20:51:12.355039 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:12.355002 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx" May 11 20:51:12.355196 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:12.355131 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fq6hx" podUID="692ffb95-b8bb-4e21-9e37-a9bad55c11be" May 11 20:51:12.969362 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:12.969320 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs\") pod \"network-metrics-daemon-fq6hx\" (UID: \"692ffb95-b8bb-4e21-9e37-a9bad55c11be\") " pod="openshift-multus/network-metrics-daemon-fq6hx" May 11 20:51:12.969769 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:12.969435 2555 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:51:12.969769 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:12.969509 2555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs podName:692ffb95-b8bb-4e21-9e37-a9bad55c11be nodeName:}" failed. No retries permitted until 2026-05-11 20:51:28.969487705 +0000 UTC m=+34.181430851 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs") pod "network-metrics-daemon-fq6hx" (UID: "692ffb95-b8bb-4e21-9e37-a9bad55c11be") : object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:51:13.070427 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:13.070366 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhs75\" (UniqueName: \"kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75\") pod \"network-check-target-6z6rl\" (UID: \"ff9e0b72-a1d2-4476-a4d4-db6a3425a266\") " pod="openshift-network-diagnostics/network-check-target-6z6rl" May 11 20:51:13.070627 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:13.070570 2555 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 11 20:51:13.070627 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:13.070595 2555 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 11 20:51:13.070627 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:13.070607 2555 projected.go:194] Error preparing data for projected volume kube-api-access-qhs75 for pod openshift-network-diagnostics/network-check-target-6z6rl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:51:13.070784 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:13.070674 2555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75 podName:ff9e0b72-a1d2-4476-a4d4-db6a3425a266 nodeName:}" failed. 
No retries permitted until 2026-05-11 20:51:29.070653366 +0000 UTC m=+34.282596508 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-qhs75" (UniqueName: "kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75") pod "network-check-target-6z6rl" (UID: "ff9e0b72-a1d2-4476-a4d4-db6a3425a266") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:51:13.355461 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:13.355373 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6z6rl" May 11 20:51:13.355619 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:13.355524 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6z6rl" podUID="ff9e0b72-a1d2-4476-a4d4-db6a3425a266" May 11 20:51:14.355069 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:14.355037 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx" May 11 20:51:14.355484 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:14.355155 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fq6hx" podUID="692ffb95-b8bb-4e21-9e37-a9bad55c11be" May 11 20:51:15.355863 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:15.355830 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6z6rl" May 11 20:51:15.356307 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:15.355926 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6z6rl" podUID="ff9e0b72-a1d2-4476-a4d4-db6a3425a266" May 11 20:51:16.355477 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.355296 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx" May 11 20:51:16.355613 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:16.355590 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fq6hx" podUID="692ffb95-b8bb-4e21-9e37-a9bad55c11be" May 11 20:51:16.490153 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.490120 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" event={"ID":"cbcc7ce7-89e0-427e-a68a-792f371dcc93","Type":"ContainerStarted","Data":"f4690b784a060b7984a534fcf7fb7c8978286f36c44a0a6217a7dfba969b87f6"} May 11 20:51:16.491498 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.491476 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4hgt9" event={"ID":"93766b7c-6f9e-4bb2-a35e-9104fc3059f6","Type":"ContainerStarted","Data":"4edd36b2d312af71764192fadddcf3cd87c7141d9ff293988d7861e8472d1b11"} May 11 20:51:16.492777 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.492748 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qvlmw" event={"ID":"bfdfde03-4872-4d6b-a541-c904d768028c","Type":"ContainerStarted","Data":"e0165f0c5842d2f9f082b6d6daa3b9cd76fc83ce36238e446a8ff2925ee5a1df"} May 11 20:51:16.494049 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.494017 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sgkpq" event={"ID":"38fd286b-fbc6-4171-8c30-db06a4c25fe9","Type":"ContainerStarted","Data":"800ca562fbb8fc6e5dd3089e110e89abcbae6ef06f21ee8f7560932460ba9856"} May 11 20:51:16.495527 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.495504 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz" event={"ID":"b4925a5a-de00-49a9-8175-8f69c30f6825","Type":"ContainerStarted","Data":"3559f15b676a379d049d53d3407cf77581d009a55d5101d02982201c31eb7aec"} May 11 20:51:16.496804 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.496764 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5szqz" 
event={"ID":"7d27adc4-933c-4d24-bd8a-bc40d4f26e8c","Type":"ContainerStarted","Data":"9d31f09c5c498021d3a9151efed7e2570eaa1591326e64339d70d064abb27e69"} May 11 20:51:16.500089 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.500073 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/0.log" May 11 20:51:16.500362 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.500342 2555 generic.go:358] "Generic (PLEG): container finished" podID="5a8d1588-0bb4-436d-88d7-4920b143287d" containerID="7ca9ea52beafba4e75a81fc55a5e9c696ed53f818ac61e7169565b16c1ba1781" exitCode=1 May 11 20:51:16.500482 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.500426 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" event={"ID":"5a8d1588-0bb4-436d-88d7-4920b143287d","Type":"ContainerStarted","Data":"5b0fc8393befe1ccf30f7cbb176f1fc31961c71af05e5e36d71a97097fef2614"} May 11 20:51:16.500482 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.500453 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" event={"ID":"5a8d1588-0bb4-436d-88d7-4920b143287d","Type":"ContainerDied","Data":"7ca9ea52beafba4e75a81fc55a5e9c696ed53f818ac61e7169565b16c1ba1781"} May 11 20:51:16.500482 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.500466 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" event={"ID":"5a8d1588-0bb4-436d-88d7-4920b143287d","Type":"ContainerStarted","Data":"7a62309039a7da8fca516dbedeeefc549c70afcc03b61df6a1c214faff3491d8"} May 11 20:51:16.501654 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.501635 2555 generic.go:358] "Generic (PLEG): container finished" podID="fa6aa941-a7a4-40a4-82c1-046fa5c671d1" containerID="56ce21ccbd986539d2bcb9d79426525a88532dc5014520938ad1680bf20e1a2e" exitCode=0 May 11 20:51:16.501738 
ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.501666 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mkbtt" event={"ID":"fa6aa941-a7a4-40a4-82c1-046fa5c671d1","Type":"ContainerDied","Data":"56ce21ccbd986539d2bcb9d79426525a88532dc5014520938ad1680bf20e1a2e"}
May 11 20:51:16.506581 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.506538 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-205.ec2.internal" podStartSLOduration=20.506528151 podStartE2EDuration="20.506528151s" podCreationTimestamp="2026-05-11 20:50:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:50:59.472582669 +0000 UTC m=+4.684525820" watchObservedRunningTime="2026-05-11 20:51:16.506528151 +0000 UTC m=+21.718471300"
May 11 20:51:16.507027 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.506995 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5nvxl" podStartSLOduration=3.6954993480000002 podStartE2EDuration="21.506987928s" podCreationTimestamp="2026-05-11 20:50:55 +0000 UTC" firstStartedPulling="2026-05-11 20:50:57.909124881 +0000 UTC m=+3.121068008" lastFinishedPulling="2026-05-11 20:51:15.720613447 +0000 UTC m=+20.932556588" observedRunningTime="2026-05-11 20:51:16.506582427 +0000 UTC m=+21.718525578" watchObservedRunningTime="2026-05-11 20:51:16.506987928 +0000 UTC m=+21.718931082"
May 11 20:51:16.527496 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.527425 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qvlmw" podStartSLOduration=3.743819725 podStartE2EDuration="21.527387709s" podCreationTimestamp="2026-05-11 20:50:55 +0000 UTC" firstStartedPulling="2026-05-11 20:50:57.916037602 +0000 UTC m=+3.127980729" lastFinishedPulling="2026-05-11 20:51:15.699605585 +0000 UTC m=+20.911548713" observedRunningTime="2026-05-11 20:51:16.526966758 +0000 UTC m=+21.738909919" watchObservedRunningTime="2026-05-11 20:51:16.527387709 +0000 UTC m=+21.739330858"
May 11 20:51:16.569458 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.567173 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4hgt9" podStartSLOduration=3.76340491 podStartE2EDuration="21.567154407s" podCreationTimestamp="2026-05-11 20:50:55 +0000 UTC" firstStartedPulling="2026-05-11 20:50:57.921846064 +0000 UTC m=+3.133789192" lastFinishedPulling="2026-05-11 20:51:15.725595561 +0000 UTC m=+20.937538689" observedRunningTime="2026-05-11 20:51:16.543567901 +0000 UTC m=+21.755511055" watchObservedRunningTime="2026-05-11 20:51:16.567154407 +0000 UTC m=+21.779097559"
May 11 20:51:16.581174 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.581136 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-sgkpq" podStartSLOduration=3.77622641 podStartE2EDuration="21.581121645s" podCreationTimestamp="2026-05-11 20:50:55 +0000 UTC" firstStartedPulling="2026-05-11 20:50:57.92031574 +0000 UTC m=+3.132258867" lastFinishedPulling="2026-05-11 20:51:15.725210971 +0000 UTC m=+20.937154102" observedRunningTime="2026-05-11 20:51:16.580817424 +0000 UTC m=+21.792760588" watchObservedRunningTime="2026-05-11 20:51:16.581121645 +0000 UTC m=+21.793064796"
May 11 20:51:16.595247 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:16.595211 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-5szqz" podStartSLOduration=3.806623046 podStartE2EDuration="21.595197994s" podCreationTimestamp="2026-05-11 20:50:55 +0000 UTC" firstStartedPulling="2026-05-11 20:50:57.910991689 +0000 UTC m=+3.122934816" lastFinishedPulling="2026-05-11 20:51:15.699566623 +0000 UTC m=+20.911509764" observedRunningTime="2026-05-11 20:51:16.595124963 +0000 UTC m=+21.807068113" watchObservedRunningTime="2026-05-11 20:51:16.595197994 +0000 UTC m=+21.807141143"
May 11 20:51:17.355387 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:17.355364 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6z6rl"
May 11 20:51:17.355569 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:17.355504 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6z6rl" podUID="ff9e0b72-a1d2-4476-a4d4-db6a3425a266"
May 11 20:51:17.495742 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:17.495551 2555 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
May 11 20:51:17.505281 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:17.505253 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz" event={"ID":"b4925a5a-de00-49a9-8175-8f69c30f6825","Type":"ContainerStarted","Data":"cd7db37faacb94bfdfffa0605ef20c97862f35c48cf86d32130f6cb4cf5fe333"}
May 11 20:51:17.508003 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:17.507983 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/0.log"
May 11 20:51:17.508366 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:17.508345 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" event={"ID":"5a8d1588-0bb4-436d-88d7-4920b143287d","Type":"ContainerStarted","Data":"fc2764800d4193a518b4e065dad7c361ee0206b195c853bf664f71ac40596d09"}
May 11 20:51:17.508495 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:17.508374 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" event={"ID":"5a8d1588-0bb4-436d-88d7-4920b143287d","Type":"ContainerStarted","Data":"04011d2cb04d84fc150fdea28fb010befe9f5a0b762956ef7c977afa13670a3d"}
May 11 20:51:17.508495 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:17.508384 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" event={"ID":"5a8d1588-0bb4-436d-88d7-4920b143287d","Type":"ContainerStarted","Data":"9c911148633a00c1d0e678a9fe231acfd46177ce1bfcc1a4b6fb63e0eeb66314"}
May 11 20:51:17.509764 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:17.509737 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gsb8f" event={"ID":"a81b49ee-ae3d-49d0-a312-73d9f1541c8d","Type":"ContainerStarted","Data":"79bccbd1c7d08836dc025ef6078a7ec2ab20b781fa6cb664cfb44383c3eeca8c"}
May 11 20:51:17.524932 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:17.524895 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-gsb8f" podStartSLOduration=4.746478151 podStartE2EDuration="22.524882186s" podCreationTimestamp="2026-05-11 20:50:55 +0000 UTC" firstStartedPulling="2026-05-11 20:50:57.921161323 +0000 UTC m=+3.133104464" lastFinishedPulling="2026-05-11 20:51:15.699565356 +0000 UTC m=+20.911508499" observedRunningTime="2026-05-11 20:51:17.524386469 +0000 UTC m=+22.736329618" watchObservedRunningTime="2026-05-11 20:51:17.524882186 +0000 UTC m=+22.736825336"
May 11 20:51:18.315158 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:18.315044 2555 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-05-11T20:51:17.495709911Z","UUID":"57631795-0bef-4acd-b0b2-ab394d04a917","Handler":null,"Name":"","Endpoint":""}
May 11 20:51:18.316729 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:18.316699 2555 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
May 11 20:51:18.316729 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:18.316734 2555 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
May 11 20:51:18.355332 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:18.355181 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx"
May 11 20:51:18.355332 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:18.355298 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fq6hx" podUID="692ffb95-b8bb-4e21-9e37-a9bad55c11be"
May 11 20:51:18.727178 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:18.727137 2555 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-5szqz"
May 11 20:51:18.727897 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:18.727872 2555 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-5szqz"
May 11 20:51:19.357841 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:19.357812 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6z6rl"
May 11 20:51:19.358003 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:19.357931 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6z6rl" podUID="ff9e0b72-a1d2-4476-a4d4-db6a3425a266"
May 11 20:51:19.515709 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:19.515675 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz" event={"ID":"b4925a5a-de00-49a9-8175-8f69c30f6825","Type":"ContainerStarted","Data":"d7af1b48ad550912546c10f13ed615d2c2c9a1a537a14a7b168469a458fa9d55"}
May 11 20:51:19.518657 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:19.518627 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/0.log"
May 11 20:51:19.519148 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:19.519128 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" event={"ID":"5a8d1588-0bb4-436d-88d7-4920b143287d","Type":"ContainerStarted","Data":"f5e9230d343bbc52a39901933cbd6ab3a421a8d9afb87adeda9be1663765dfe8"}
May 11 20:51:19.519468 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:19.519449 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-5szqz"
May 11 20:51:19.519832 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:19.519813 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-5szqz"
May 11 20:51:19.530892 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:19.530848 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9qvz" podStartSLOduration=3.935180833 podStartE2EDuration="24.530836734s" podCreationTimestamp="2026-05-11 20:50:55 +0000 UTC" firstStartedPulling="2026-05-11 20:50:57.915138232 +0000 UTC m=+3.127081360" lastFinishedPulling="2026-05-11 20:51:18.510794131 +0000 UTC m=+23.722737261" observedRunningTime="2026-05-11 20:51:19.530822701 +0000 UTC m=+24.742765850" watchObservedRunningTime="2026-05-11 20:51:19.530836734 +0000 UTC m=+24.742779883"
May 11 20:51:20.355425 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:20.355378 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx"
May 11 20:51:20.356010 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:20.355518 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fq6hx" podUID="692ffb95-b8bb-4e21-9e37-a9bad55c11be"
May 11 20:51:21.355444 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:21.355330 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6z6rl"
May 11 20:51:21.356027 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:21.355454 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6z6rl" podUID="ff9e0b72-a1d2-4476-a4d4-db6a3425a266"
May 11 20:51:21.525529 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:21.525494 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/0.log"
May 11 20:51:21.525882 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:21.525858 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" event={"ID":"5a8d1588-0bb4-436d-88d7-4920b143287d","Type":"ContainerStarted","Data":"a32f395fdf4a7b7407eb8ef67eb5f54e1bf2554634263d5c4bb344a3a608116c"}
May 11 20:51:21.526116 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:21.526092 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:51:21.526196 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:21.526118 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:51:21.526309 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:21.526291 2555 scope.go:117] "RemoveContainer" containerID="7ca9ea52beafba4e75a81fc55a5e9c696ed53f818ac61e7169565b16c1ba1781"
May 11 20:51:21.527563 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:21.527539 2555 generic.go:358] "Generic (PLEG): container finished" podID="fa6aa941-a7a4-40a4-82c1-046fa5c671d1" containerID="6668cbeafebef8cd4b586811aafc06ad6c6b6fee3eba3bd956861c8285c54335" exitCode=0
May 11 20:51:21.527659 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:21.527623 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mkbtt" event={"ID":"fa6aa941-a7a4-40a4-82c1-046fa5c671d1","Type":"ContainerDied","Data":"6668cbeafebef8cd4b586811aafc06ad6c6b6fee3eba3bd956861c8285c54335"}
May 11 20:51:21.544703 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:21.544681 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:51:22.355605 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:22.355573 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx"
May 11 20:51:22.356006 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:22.355683 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fq6hx" podUID="692ffb95-b8bb-4e21-9e37-a9bad55c11be"
May 11 20:51:22.532420 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:22.532385 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/0.log"
May 11 20:51:22.532746 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:22.532723 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" event={"ID":"5a8d1588-0bb4-436d-88d7-4920b143287d","Type":"ContainerStarted","Data":"bc3c56d7622004d8615a2175fef25b07b2589d98c4650849710f346fdd5de542"}
May 11 20:51:22.532982 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:22.532967 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:51:22.548623 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:22.548596 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:51:22.562763 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:22.562722 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7" podStartSLOduration=9.371637134 podStartE2EDuration="27.562708068s" podCreationTimestamp="2026-05-11 20:50:55 +0000 UTC" firstStartedPulling="2026-05-11 20:50:57.908887997 +0000 UTC m=+3.120831131" lastFinishedPulling="2026-05-11 20:51:16.099958921 +0000 UTC m=+21.311902065" observedRunningTime="2026-05-11 20:51:22.56102638 +0000 UTC m=+27.772969541" watchObservedRunningTime="2026-05-11 20:51:22.562708068 +0000 UTC m=+27.774651218"
May 11 20:51:23.355187 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:23.355146 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6z6rl"
May 11 20:51:23.355354 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:23.355249 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6z6rl" podUID="ff9e0b72-a1d2-4476-a4d4-db6a3425a266"
May 11 20:51:23.536491 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:23.536452 2555 generic.go:358] "Generic (PLEG): container finished" podID="fa6aa941-a7a4-40a4-82c1-046fa5c671d1" containerID="db47c1a1d1a684f1d0a4b9c01ebaccd3938cfc9727f75f00b2d9da439ee32a0e" exitCode=0
May 11 20:51:23.536868 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:23.536545 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mkbtt" event={"ID":"fa6aa941-a7a4-40a4-82c1-046fa5c671d1","Type":"ContainerDied","Data":"db47c1a1d1a684f1d0a4b9c01ebaccd3938cfc9727f75f00b2d9da439ee32a0e"}
May 11 20:51:24.355808 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:24.355303 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx"
May 11 20:51:24.355808 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:24.355470 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fq6hx" podUID="692ffb95-b8bb-4e21-9e37-a9bad55c11be"
May 11 20:51:25.305300 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:25.305121 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6z6rl"]
May 11 20:51:25.305849 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:25.305372 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6z6rl"
May 11 20:51:25.305849 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:25.305481 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6z6rl" podUID="ff9e0b72-a1d2-4476-a4d4-db6a3425a266"
May 11 20:51:25.309181 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:25.309150 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fq6hx"]
May 11 20:51:25.309312 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:25.309249 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx"
May 11 20:51:25.309380 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:25.309357 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fq6hx" podUID="692ffb95-b8bb-4e21-9e37-a9bad55c11be"
May 11 20:51:25.542831 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:25.542749 2555 generic.go:358] "Generic (PLEG): container finished" podID="fa6aa941-a7a4-40a4-82c1-046fa5c671d1" containerID="ad1d7eef887456e5fffb92f81e2332b77382e93dd89c2f83ac76f20e0b124be1" exitCode=0
May 11 20:51:25.542831 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:25.542796 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mkbtt" event={"ID":"fa6aa941-a7a4-40a4-82c1-046fa5c671d1","Type":"ContainerDied","Data":"ad1d7eef887456e5fffb92f81e2332b77382e93dd89c2f83ac76f20e0b124be1"}
May 11 20:51:27.355097 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:27.355059 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx"
May 11 20:51:27.355097 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:27.355088 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6z6rl"
May 11 20:51:27.355773 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:27.355207 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fq6hx" podUID="692ffb95-b8bb-4e21-9e37-a9bad55c11be"
May 11 20:51:27.355773 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:27.355608 2555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6z6rl" podUID="ff9e0b72-a1d2-4476-a4d4-db6a3425a266"
May 11 20:51:28.591967 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.591926 2555 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-205.ec2.internal" event="NodeReady"
May 11 20:51:28.592467 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.592102 2555 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
May 11 20:51:28.640785 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.640181 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-f79n6"]
May 11 20:51:28.661191 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.661161 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7lkng"]
May 11 20:51:28.661372 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.661323 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-f79n6"
May 11 20:51:28.664146 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.664123 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wsq67\""
May 11 20:51:28.664269 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.664241 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
May 11 20:51:28.664337 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.664263 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
May 11 20:51:28.679943 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.679919 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7lkng"]
May 11 20:51:28.679943 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.679949 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f79n6"]
May 11 20:51:28.680152 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.679962 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xkrjf"]
May 11 20:51:28.680152 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.680075 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7lkng"
May 11 20:51:28.682926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.682904 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qhtb5\""
May 11 20:51:28.683190 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.683174 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
May 11 20:51:28.683515 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.683499 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
May 11 20:51:28.683599 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.683551 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
May 11 20:51:28.696163 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.696144 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xkrjf"]
May 11 20:51:28.696299 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.696287 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xkrjf"
May 11 20:51:28.699090 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.698898 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-s999c\""
May 11 20:51:28.699090 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.698922 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
May 11 20:51:28.699090 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.698923 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
May 11 20:51:28.699090 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.698983 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
May 11 20:51:28.699090 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.698922 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
May 11 20:51:28.781957 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.781644 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a4119420-0b77-4749-8a7f-0a8814c65f64-data-volume\") pod \"insights-runtime-extractor-xkrjf\" (UID: \"a4119420-0b77-4749-8a7f-0a8814c65f64\") " pod="openshift-insights/insights-runtime-extractor-xkrjf"
May 11 20:51:28.781957 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.781699 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/903aee12-cd22-4c02-ba34-3a042f47b6b5-cert\") pod \"ingress-canary-7lkng\" (UID: \"903aee12-cd22-4c02-ba34-3a042f47b6b5\") " pod="openshift-ingress-canary/ingress-canary-7lkng"
May 11 20:51:28.781957 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.781727 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9eef6b6-ed64-4eb2-aba7-77a3f6213f02-metrics-tls\") pod \"dns-default-f79n6\" (UID: \"e9eef6b6-ed64-4eb2-aba7-77a3f6213f02\") " pod="openshift-dns/dns-default-f79n6"
May 11 20:51:28.781957 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.781742 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e9eef6b6-ed64-4eb2-aba7-77a3f6213f02-tmp-dir\") pod \"dns-default-f79n6\" (UID: \"e9eef6b6-ed64-4eb2-aba7-77a3f6213f02\") " pod="openshift-dns/dns-default-f79n6"
May 11 20:51:28.781957 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.781758 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkns9\" (UniqueName: \"kubernetes.io/projected/a4119420-0b77-4749-8a7f-0a8814c65f64-kube-api-access-xkns9\") pod \"insights-runtime-extractor-xkrjf\" (UID: \"a4119420-0b77-4749-8a7f-0a8814c65f64\") " pod="openshift-insights/insights-runtime-extractor-xkrjf"
May 11 20:51:28.781957 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.781802 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9eef6b6-ed64-4eb2-aba7-77a3f6213f02-config-volume\") pod \"dns-default-f79n6\" (UID: \"e9eef6b6-ed64-4eb2-aba7-77a3f6213f02\") " pod="openshift-dns/dns-default-f79n6"
May 11 20:51:28.781957 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.781842 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a4119420-0b77-4749-8a7f-0a8814c65f64-crio-socket\") pod \"insights-runtime-extractor-xkrjf\" (UID: \"a4119420-0b77-4749-8a7f-0a8814c65f64\") " pod="openshift-insights/insights-runtime-extractor-xkrjf"
May 11 20:51:28.781957 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.781876 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a4119420-0b77-4749-8a7f-0a8814c65f64-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xkrjf\" (UID: \"a4119420-0b77-4749-8a7f-0a8814c65f64\") " pod="openshift-insights/insights-runtime-extractor-xkrjf"
May 11 20:51:28.781957 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.781905 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a4119420-0b77-4749-8a7f-0a8814c65f64-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xkrjf\" (UID: \"a4119420-0b77-4749-8a7f-0a8814c65f64\") " pod="openshift-insights/insights-runtime-extractor-xkrjf"
May 11 20:51:28.781957 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.781931 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g525l\" (UniqueName: \"kubernetes.io/projected/903aee12-cd22-4c02-ba34-3a042f47b6b5-kube-api-access-g525l\") pod \"ingress-canary-7lkng\" (UID: \"903aee12-cd22-4c02-ba34-3a042f47b6b5\") " pod="openshift-ingress-canary/ingress-canary-7lkng"
May 11 20:51:28.782595 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.781980 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzclm\" (UniqueName: \"kubernetes.io/projected/e9eef6b6-ed64-4eb2-aba7-77a3f6213f02-kube-api-access-rzclm\") pod \"dns-default-f79n6\" (UID: \"e9eef6b6-ed64-4eb2-aba7-77a3f6213f02\") " pod="openshift-dns/dns-default-f79n6"
May 11 20:51:28.882524 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.882459 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/903aee12-cd22-4c02-ba34-3a042f47b6b5-cert\") pod \"ingress-canary-7lkng\" (UID: \"903aee12-cd22-4c02-ba34-3a042f47b6b5\") " pod="openshift-ingress-canary/ingress-canary-7lkng"
May 11 20:51:28.882717 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.882529 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9eef6b6-ed64-4eb2-aba7-77a3f6213f02-metrics-tls\") pod \"dns-default-f79n6\" (UID: \"e9eef6b6-ed64-4eb2-aba7-77a3f6213f02\") " pod="openshift-dns/dns-default-f79n6"
May 11 20:51:28.882717 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.882560 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e9eef6b6-ed64-4eb2-aba7-77a3f6213f02-tmp-dir\") pod \"dns-default-f79n6\" (UID: \"e9eef6b6-ed64-4eb2-aba7-77a3f6213f02\") " pod="openshift-dns/dns-default-f79n6"
May 11 20:51:28.882717 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.882583 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkns9\" (UniqueName: \"kubernetes.io/projected/a4119420-0b77-4749-8a7f-0a8814c65f64-kube-api-access-xkns9\") pod \"insights-runtime-extractor-xkrjf\" (UID: \"a4119420-0b77-4749-8a7f-0a8814c65f64\") " pod="openshift-insights/insights-runtime-extractor-xkrjf"
May 11 20:51:28.882717 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.882636 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9eef6b6-ed64-4eb2-aba7-77a3f6213f02-config-volume\") pod \"dns-default-f79n6\" (UID: \"e9eef6b6-ed64-4eb2-aba7-77a3f6213f02\") " pod="openshift-dns/dns-default-f79n6"
May 11 20:51:28.882717 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.882661 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a4119420-0b77-4749-8a7f-0a8814c65f64-crio-socket\") pod \"insights-runtime-extractor-xkrjf\" (UID: \"a4119420-0b77-4749-8a7f-0a8814c65f64\") " pod="openshift-insights/insights-runtime-extractor-xkrjf"
May 11 20:51:28.882717 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.882681 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a4119420-0b77-4749-8a7f-0a8814c65f64-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xkrjf\" (UID: \"a4119420-0b77-4749-8a7f-0a8814c65f64\") " pod="openshift-insights/insights-runtime-extractor-xkrjf"
May 11 20:51:28.882717 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.882700 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a4119420-0b77-4749-8a7f-0a8814c65f64-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xkrjf\" (UID: \"a4119420-0b77-4749-8a7f-0a8814c65f64\") " pod="openshift-insights/insights-runtime-extractor-xkrjf"
May 11 20:51:28.883023 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.882726 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g525l\" (UniqueName: \"kubernetes.io/projected/903aee12-cd22-4c02-ba34-3a042f47b6b5-kube-api-access-g525l\") pod \"ingress-canary-7lkng\" (UID: \"903aee12-cd22-4c02-ba34-3a042f47b6b5\") " pod="openshift-ingress-canary/ingress-canary-7lkng"
May 11 20:51:28.883023 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.882760 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzclm\" (UniqueName: \"kubernetes.io/projected/e9eef6b6-ed64-4eb2-aba7-77a3f6213f02-kube-api-access-rzclm\") pod \"dns-default-f79n6\" (UID: \"e9eef6b6-ed64-4eb2-aba7-77a3f6213f02\") " pod="openshift-dns/dns-default-f79n6"
May 11 20:51:28.883242 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.883217 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a4119420-0b77-4749-8a7f-0a8814c65f64-data-volume\") pod \"insights-runtime-extractor-xkrjf\" (UID: \"a4119420-0b77-4749-8a7f-0a8814c65f64\") " pod="openshift-insights/insights-runtime-extractor-xkrjf"
May 11 20:51:28.883362 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.883343 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a4119420-0b77-4749-8a7f-0a8814c65f64-crio-socket\") pod \"insights-runtime-extractor-xkrjf\" (UID: \"a4119420-0b77-4749-8a7f-0a8814c65f64\") " pod="openshift-insights/insights-runtime-extractor-xkrjf"
May 11 20:51:28.883466 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.883446 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e9eef6b6-ed64-4eb2-aba7-77a3f6213f02-tmp-dir\") pod \"dns-default-f79n6\" (UID: \"e9eef6b6-ed64-4eb2-aba7-77a3f6213f02\") " pod="openshift-dns/dns-default-f79n6"
May 11 20:51:28.883612 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.883581 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a4119420-0b77-4749-8a7f-0a8814c65f64-data-volume\") pod \"insights-runtime-extractor-xkrjf\" (UID: \"a4119420-0b77-4749-8a7f-0a8814c65f64\") " pod="openshift-insights/insights-runtime-extractor-xkrjf"
May 11 20:51:28.884021 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.884002 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a4119420-0b77-4749-8a7f-0a8814c65f64-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xkrjf\" (UID: \"a4119420-0b77-4749-8a7f-0a8814c65f64\") " pod="openshift-insights/insights-runtime-extractor-xkrjf"
May 11 20:51:28.884184 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.884163 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9eef6b6-ed64-4eb2-aba7-77a3f6213f02-config-volume\") pod \"dns-default-f79n6\" (UID: \"e9eef6b6-ed64-4eb2-aba7-77a3f6213f02\") " pod="openshift-dns/dns-default-f79n6"
May 11 20:51:28.887001 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.886980 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a4119420-0b77-4749-8a7f-0a8814c65f64-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xkrjf\" (UID: \"a4119420-0b77-4749-8a7f-0a8814c65f64\") " pod="openshift-insights/insights-runtime-extractor-xkrjf"
May 11 20:51:28.887088 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.887061 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9eef6b6-ed64-4eb2-aba7-77a3f6213f02-metrics-tls\") pod \"dns-default-f79n6\" (UID: \"e9eef6b6-ed64-4eb2-aba7-77a3f6213f02\") " pod="openshift-dns/dns-default-f79n6"
May 11 20:51:28.888010 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.887984 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/903aee12-cd22-4c02-ba34-3a042f47b6b5-cert\") pod \"ingress-canary-7lkng\" (UID: \"903aee12-cd22-4c02-ba34-3a042f47b6b5\") " pod="openshift-ingress-canary/ingress-canary-7lkng"
May 11 20:51:28.891319 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.891297 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"kube-api-access-g525l\" (UniqueName: \"kubernetes.io/projected/903aee12-cd22-4c02-ba34-3a042f47b6b5-kube-api-access-g525l\") pod \"ingress-canary-7lkng\" (UID: \"903aee12-cd22-4c02-ba34-3a042f47b6b5\") " pod="openshift-ingress-canary/ingress-canary-7lkng" May 11 20:51:28.891520 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.891498 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkns9\" (UniqueName: \"kubernetes.io/projected/a4119420-0b77-4749-8a7f-0a8814c65f64-kube-api-access-xkns9\") pod \"insights-runtime-extractor-xkrjf\" (UID: \"a4119420-0b77-4749-8a7f-0a8814c65f64\") " pod="openshift-insights/insights-runtime-extractor-xkrjf" May 11 20:51:28.902245 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.902221 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzclm\" (UniqueName: \"kubernetes.io/projected/e9eef6b6-ed64-4eb2-aba7-77a3f6213f02-kube-api-access-rzclm\") pod \"dns-default-f79n6\" (UID: \"e9eef6b6-ed64-4eb2-aba7-77a3f6213f02\") " pod="openshift-dns/dns-default-f79n6" May 11 20:51:28.973038 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.972855 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-f79n6" May 11 20:51:28.984082 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.984057 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs\") pod \"network-metrics-daemon-fq6hx\" (UID: \"692ffb95-b8bb-4e21-9e37-a9bad55c11be\") " pod="openshift-multus/network-metrics-daemon-fq6hx" May 11 20:51:28.984217 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:28.984196 2555 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:51:28.984283 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:28.984274 2555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs podName:692ffb95-b8bb-4e21-9e37-a9bad55c11be nodeName:}" failed. No retries permitted until 2026-05-11 20:52:00.98425888 +0000 UTC m=+66.196202013 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs") pod "network-metrics-daemon-fq6hx" (UID: "692ffb95-b8bb-4e21-9e37-a9bad55c11be") : object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:51:28.990023 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:28.989998 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7lkng" May 11 20:51:29.006881 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:29.006852 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xkrjf" May 11 20:51:29.085130 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:29.084997 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhs75\" (UniqueName: \"kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75\") pod \"network-check-target-6z6rl\" (UID: \"ff9e0b72-a1d2-4476-a4d4-db6a3425a266\") " pod="openshift-network-diagnostics/network-check-target-6z6rl" May 11 20:51:29.085296 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:29.085153 2555 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 11 20:51:29.085296 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:29.085170 2555 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 11 20:51:29.085296 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:29.085179 2555 projected.go:194] Error preparing data for projected volume kube-api-access-qhs75 for pod openshift-network-diagnostics/network-check-target-6z6rl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:51:29.085296 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:29.085243 2555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75 podName:ff9e0b72-a1d2-4476-a4d4-db6a3425a266 nodeName:}" failed. No retries permitted until 2026-05-11 20:52:01.085225667 +0000 UTC m=+66.297168813 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qhs75" (UniqueName: "kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75") pod "network-check-target-6z6rl" (UID: "ff9e0b72-a1d2-4476-a4d4-db6a3425a266") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:51:29.355691 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:29.355603 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6z6rl" May 11 20:51:29.355691 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:29.355653 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx" May 11 20:51:29.358540 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:29.358513 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" May 11 20:51:29.358691 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:29.358604 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" May 11 20:51:29.358691 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:29.358626 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" May 11 20:51:29.358803 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:29.358736 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kqkzb\"" May 11 20:51:29.359598 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:29.359576 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mdvls\"" May 11 20:51:31.002613 ip-10-0-133-205 kubenswrapper[2555]: I0511 
20:51:31.002575 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-s5hhh"] May 11 20:51:31.007755 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.007712 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-s5hhh" May 11 20:51:31.010839 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.010633 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" May 11 20:51:31.011427 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.011383 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-2jpvm\"" May 11 20:51:31.012539 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.012516 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-s5hhh"] May 11 20:51:31.099554 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.099330 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5c38a6fa-759b-4f50-a5d6-74ebbfbd6439-tls-certificates\") pod \"prometheus-operator-admission-webhook-64b84d7657-s5hhh\" (UID: \"5c38a6fa-759b-4f50-a5d6-74ebbfbd6439\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-s5hhh" May 11 20:51:31.200708 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.200024 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5c38a6fa-759b-4f50-a5d6-74ebbfbd6439-tls-certificates\") pod \"prometheus-operator-admission-webhook-64b84d7657-s5hhh\" (UID: \"5c38a6fa-759b-4f50-a5d6-74ebbfbd6439\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-s5hhh" May 11 20:51:31.209171 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.209060 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5c38a6fa-759b-4f50-a5d6-74ebbfbd6439-tls-certificates\") pod \"prometheus-operator-admission-webhook-64b84d7657-s5hhh\" (UID: \"5c38a6fa-759b-4f50-a5d6-74ebbfbd6439\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-s5hhh" May 11 20:51:31.261219 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.261190 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7lkng"] May 11 20:51:31.264050 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.264026 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xkrjf"] May 11 20:51:31.275492 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.275469 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f79n6"] May 11 20:51:31.278608 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:51:31.278579 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod903aee12_cd22_4c02_ba34_3a042f47b6b5.slice/crio-8dcf9357f04a10b5bc7f56ea733c1cb1c6ea4a24fb820414f8391c4c7aea011f WatchSource:0}: Error finding container 8dcf9357f04a10b5bc7f56ea733c1cb1c6ea4a24fb820414f8391c4c7aea011f: Status 404 returned error can't find the container with id 8dcf9357f04a10b5bc7f56ea733c1cb1c6ea4a24fb820414f8391c4c7aea011f May 11 20:51:31.278875 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:51:31.278855 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4119420_0b77_4749_8a7f_0a8814c65f64.slice/crio-0b3fed650504f8ed15f537225431507a4359b01964234b13faf4ef922ba853e8 
WatchSource:0}: Error finding container 0b3fed650504f8ed15f537225431507a4359b01964234b13faf4ef922ba853e8: Status 404 returned error can't find the container with id 0b3fed650504f8ed15f537225431507a4359b01964234b13faf4ef922ba853e8 May 11 20:51:31.279321 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:51:31.279291 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9eef6b6_ed64_4eb2_aba7_77a3f6213f02.slice/crio-7a8af8f395214d783dc00c5c197f3204a07db068b2e000c39365d9ddc929265a WatchSource:0}: Error finding container 7a8af8f395214d783dc00c5c197f3204a07db068b2e000c39365d9ddc929265a: Status 404 returned error can't find the container with id 7a8af8f395214d783dc00c5c197f3204a07db068b2e000c39365d9ddc929265a May 11 20:51:31.322319 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.322161 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-s5hhh" May 11 20:51:31.458546 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.458517 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-s5hhh"] May 11 20:51:31.472319 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:51:31.472285 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c38a6fa_759b_4f50_a5d6_74ebbfbd6439.slice/crio-438b59e76158c77d2d0c5d0deb9529faf90544c3e0baa64b74bf0414730917f6 WatchSource:0}: Error finding container 438b59e76158c77d2d0c5d0deb9529faf90544c3e0baa64b74bf0414730917f6: Status 404 returned error can't find the container with id 438b59e76158c77d2d0c5d0deb9529faf90544c3e0baa64b74bf0414730917f6 May 11 20:51:31.557465 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.557351 2555 generic.go:358] "Generic (PLEG): container finished" podID="fa6aa941-a7a4-40a4-82c1-046fa5c671d1" 
containerID="d7e6e67c4bec3fec1eb66fd63b283f946c29a4bf4bec0b25cfe60018bc5b56b3" exitCode=0 May 11 20:51:31.557465 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.557442 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mkbtt" event={"ID":"fa6aa941-a7a4-40a4-82c1-046fa5c671d1","Type":"ContainerDied","Data":"d7e6e67c4bec3fec1eb66fd63b283f946c29a4bf4bec0b25cfe60018bc5b56b3"} May 11 20:51:31.558459 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.558438 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7lkng" event={"ID":"903aee12-cd22-4c02-ba34-3a042f47b6b5","Type":"ContainerStarted","Data":"8dcf9357f04a10b5bc7f56ea733c1cb1c6ea4a24fb820414f8391c4c7aea011f"} May 11 20:51:31.559758 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.559734 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xkrjf" event={"ID":"a4119420-0b77-4749-8a7f-0a8814c65f64","Type":"ContainerStarted","Data":"5ab22feec946feff7fa0e71d6afba63758e2d53da21aa99666d600efa5777de5"} May 11 20:51:31.559873 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.559765 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xkrjf" event={"ID":"a4119420-0b77-4749-8a7f-0a8814c65f64","Type":"ContainerStarted","Data":"0b3fed650504f8ed15f537225431507a4359b01964234b13faf4ef922ba853e8"} May 11 20:51:31.560688 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.560667 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f79n6" event={"ID":"e9eef6b6-ed64-4eb2-aba7-77a3f6213f02","Type":"ContainerStarted","Data":"7a8af8f395214d783dc00c5c197f3204a07db068b2e000c39365d9ddc929265a"} May 11 20:51:31.561692 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:31.561669 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-s5hhh" event={"ID":"5c38a6fa-759b-4f50-a5d6-74ebbfbd6439","Type":"ContainerStarted","Data":"438b59e76158c77d2d0c5d0deb9529faf90544c3e0baa64b74bf0414730917f6"} May 11 20:51:32.566532 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:32.566498 2555 generic.go:358] "Generic (PLEG): container finished" podID="fa6aa941-a7a4-40a4-82c1-046fa5c671d1" containerID="d7da33daed63638f851a28458818d42e6e1eb31e0cd19b5ecb731b188bd9724e" exitCode=0 May 11 20:51:32.567266 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:32.566583 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mkbtt" event={"ID":"fa6aa941-a7a4-40a4-82c1-046fa5c671d1","Type":"ContainerDied","Data":"d7da33daed63638f851a28458818d42e6e1eb31e0cd19b5ecb731b188bd9724e"} May 11 20:51:32.568613 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:32.568581 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xkrjf" event={"ID":"a4119420-0b77-4749-8a7f-0a8814c65f64","Type":"ContainerStarted","Data":"0f10d43b7876ab1c2b70b20449a63b3ca28afd3044a1b15dc322ab6b1974f60b"} May 11 20:51:34.574652 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:34.574610 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-s5hhh" event={"ID":"5c38a6fa-759b-4f50-a5d6-74ebbfbd6439","Type":"ContainerStarted","Data":"8ce51f69ae9028363e672617d33e38b792d527db2d74a42b3e05fa1b5d90f050"} May 11 20:51:34.575135 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:34.574807 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-s5hhh" May 11 20:51:34.578258 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:34.578231 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-mkbtt" event={"ID":"fa6aa941-a7a4-40a4-82c1-046fa5c671d1","Type":"ContainerStarted","Data":"fcecdbf6960b5bb11c9d471b86914f57881748a6622c84c7043c55343cf9eed7"} May 11 20:51:34.579719 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:34.579694 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7lkng" event={"ID":"903aee12-cd22-4c02-ba34-3a042f47b6b5","Type":"ContainerStarted","Data":"c9a0a6ef49d9129147c78a32ac6967d1ed7d87c2a3609a360eeb008049f9fc70"} May 11 20:51:34.581019 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:34.580989 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-s5hhh" May 11 20:51:34.582041 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:34.582019 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xkrjf" event={"ID":"a4119420-0b77-4749-8a7f-0a8814c65f64","Type":"ContainerStarted","Data":"096e9a5de4f2bcfe005dad2dc2bd0930a3e9af548d928a6bd0fe62126c530ead"} May 11 20:51:34.583450 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:34.583428 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f79n6" event={"ID":"e9eef6b6-ed64-4eb2-aba7-77a3f6213f02","Type":"ContainerStarted","Data":"acac66e9c895448afd787ef764437f46927174106a8b966860fc1c8461a98274"} May 11 20:51:34.583532 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:34.583457 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f79n6" event={"ID":"e9eef6b6-ed64-4eb2-aba7-77a3f6213f02","Type":"ContainerStarted","Data":"1943983c99a65a1540707c0ae3869f433eb57ba3257f5933af8c5c2b0bdc2025"} May 11 20:51:34.583575 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:34.583566 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-f79n6" May 11 
20:51:34.593879 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:34.593841 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-s5hhh" podStartSLOduration=2.054695953 podStartE2EDuration="4.593829912s" podCreationTimestamp="2026-05-11 20:51:30 +0000 UTC" firstStartedPulling="2026-05-11 20:51:31.474293307 +0000 UTC m=+36.686236439" lastFinishedPulling="2026-05-11 20:51:34.013427269 +0000 UTC m=+39.225370398" observedRunningTime="2026-05-11 20:51:34.592947303 +0000 UTC m=+39.804890452" watchObservedRunningTime="2026-05-11 20:51:34.593829912 +0000 UTC m=+39.805773098" May 11 20:51:34.608340 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:34.608171 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-f79n6" podStartSLOduration=3.876761784 podStartE2EDuration="6.608158356s" podCreationTimestamp="2026-05-11 20:51:28 +0000 UTC" firstStartedPulling="2026-05-11 20:51:31.282026982 +0000 UTC m=+36.493970125" lastFinishedPulling="2026-05-11 20:51:34.013423565 +0000 UTC m=+39.225366697" observedRunningTime="2026-05-11 20:51:34.607882662 +0000 UTC m=+39.819825812" watchObservedRunningTime="2026-05-11 20:51:34.608158356 +0000 UTC m=+39.820101505" May 11 20:51:34.624072 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:34.624028 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xkrjf" podStartSLOduration=3.926003859 podStartE2EDuration="6.624014109s" podCreationTimestamp="2026-05-11 20:51:28 +0000 UTC" firstStartedPulling="2026-05-11 20:51:31.355670229 +0000 UTC m=+36.567613357" lastFinishedPulling="2026-05-11 20:51:34.053680466 +0000 UTC m=+39.265623607" observedRunningTime="2026-05-11 20:51:34.623342489 +0000 UTC m=+39.835285651" watchObservedRunningTime="2026-05-11 20:51:34.624014109 +0000 UTC m=+39.835957258" May 11 20:51:34.659344 ip-10-0-133-205 
kubenswrapper[2555]: I0511 20:51:34.659289 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mkbtt" podStartSLOduration=6.414319512 podStartE2EDuration="39.659275259s" podCreationTimestamp="2026-05-11 20:50:55 +0000 UTC" firstStartedPulling="2026-05-11 20:50:57.920151184 +0000 UTC m=+3.132094318" lastFinishedPulling="2026-05-11 20:51:31.165106933 +0000 UTC m=+36.377050065" observedRunningTime="2026-05-11 20:51:34.657817468 +0000 UTC m=+39.869760620" watchObservedRunningTime="2026-05-11 20:51:34.659275259 +0000 UTC m=+39.871218408" May 11 20:51:34.680317 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:34.680266 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7lkng" podStartSLOduration=3.947760829 podStartE2EDuration="6.680250298s" podCreationTimestamp="2026-05-11 20:51:28 +0000 UTC" firstStartedPulling="2026-05-11 20:51:31.280935971 +0000 UTC m=+36.492879113" lastFinishedPulling="2026-05-11 20:51:34.013425453 +0000 UTC m=+39.225368582" observedRunningTime="2026-05-11 20:51:34.679597506 +0000 UTC m=+39.891540667" watchObservedRunningTime="2026-05-11 20:51:34.680250298 +0000 UTC m=+39.892193441" May 11 20:51:35.064583 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.064548 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-94789f4d5-885tt"] May 11 20:51:35.066947 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.066926 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" May 11 20:51:35.069696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.069664 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" May 11 20:51:35.069820 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.069723 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" May 11 20:51:35.069820 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.069766 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" May 11 20:51:35.069820 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.069803 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-zrlpg\"" May 11 20:51:35.070482 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.070464 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" May 11 20:51:35.070565 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.070472 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" May 11 20:51:35.074840 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.074815 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-94789f4d5-885tt"] May 11 20:51:35.237771 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.237732 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b1663b6e-7a8e-43d2-92c1-04f006ccbe74-prometheus-operator-tls\") pod \"prometheus-operator-94789f4d5-885tt\" (UID: 
\"b1663b6e-7a8e-43d2-92c1-04f006ccbe74\") " pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" May 11 20:51:35.237771 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.237773 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1663b6e-7a8e-43d2-92c1-04f006ccbe74-metrics-client-ca\") pod \"prometheus-operator-94789f4d5-885tt\" (UID: \"b1663b6e-7a8e-43d2-92c1-04f006ccbe74\") " pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" May 11 20:51:35.237976 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.237834 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b1663b6e-7a8e-43d2-92c1-04f006ccbe74-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-94789f4d5-885tt\" (UID: \"b1663b6e-7a8e-43d2-92c1-04f006ccbe74\") " pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" May 11 20:51:35.237976 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.237903 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbqjn\" (UniqueName: \"kubernetes.io/projected/b1663b6e-7a8e-43d2-92c1-04f006ccbe74-kube-api-access-jbqjn\") pod \"prometheus-operator-94789f4d5-885tt\" (UID: \"b1663b6e-7a8e-43d2-92c1-04f006ccbe74\") " pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" May 11 20:51:35.338477 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.338343 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b1663b6e-7a8e-43d2-92c1-04f006ccbe74-prometheus-operator-tls\") pod \"prometheus-operator-94789f4d5-885tt\" (UID: \"b1663b6e-7a8e-43d2-92c1-04f006ccbe74\") " pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" May 11 
20:51:35.338477 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.338385 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1663b6e-7a8e-43d2-92c1-04f006ccbe74-metrics-client-ca\") pod \"prometheus-operator-94789f4d5-885tt\" (UID: \"b1663b6e-7a8e-43d2-92c1-04f006ccbe74\") " pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" May 11 20:51:35.338477 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.338466 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b1663b6e-7a8e-43d2-92c1-04f006ccbe74-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-94789f4d5-885tt\" (UID: \"b1663b6e-7a8e-43d2-92c1-04f006ccbe74\") " pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" May 11 20:51:35.338770 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.338501 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbqjn\" (UniqueName: \"kubernetes.io/projected/b1663b6e-7a8e-43d2-92c1-04f006ccbe74-kube-api-access-jbqjn\") pod \"prometheus-operator-94789f4d5-885tt\" (UID: \"b1663b6e-7a8e-43d2-92c1-04f006ccbe74\") " pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" May 11 20:51:35.338770 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:35.338555 2555 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found May 11 20:51:35.338770 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:51:35.338636 2555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1663b6e-7a8e-43d2-92c1-04f006ccbe74-prometheus-operator-tls podName:b1663b6e-7a8e-43d2-92c1-04f006ccbe74 nodeName:}" failed. No retries permitted until 2026-05-11 20:51:35.838615344 +0000 UTC m=+41.050558473 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/b1663b6e-7a8e-43d2-92c1-04f006ccbe74-prometheus-operator-tls") pod "prometheus-operator-94789f4d5-885tt" (UID: "b1663b6e-7a8e-43d2-92c1-04f006ccbe74") : secret "prometheus-operator-tls" not found May 11 20:51:35.339181 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.339160 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1663b6e-7a8e-43d2-92c1-04f006ccbe74-metrics-client-ca\") pod \"prometheus-operator-94789f4d5-885tt\" (UID: \"b1663b6e-7a8e-43d2-92c1-04f006ccbe74\") " pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" May 11 20:51:35.342246 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.342216 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b1663b6e-7a8e-43d2-92c1-04f006ccbe74-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-94789f4d5-885tt\" (UID: \"b1663b6e-7a8e-43d2-92c1-04f006ccbe74\") " pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" May 11 20:51:35.349851 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.349826 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbqjn\" (UniqueName: \"kubernetes.io/projected/b1663b6e-7a8e-43d2-92c1-04f006ccbe74-kube-api-access-jbqjn\") pod \"prometheus-operator-94789f4d5-885tt\" (UID: \"b1663b6e-7a8e-43d2-92c1-04f006ccbe74\") " pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" May 11 20:51:35.842806 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.842766 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b1663b6e-7a8e-43d2-92c1-04f006ccbe74-prometheus-operator-tls\") pod \"prometheus-operator-94789f4d5-885tt\" (UID: 
\"b1663b6e-7a8e-43d2-92c1-04f006ccbe74\") " pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" May 11 20:51:35.845261 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.845238 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b1663b6e-7a8e-43d2-92c1-04f006ccbe74-prometheus-operator-tls\") pod \"prometheus-operator-94789f4d5-885tt\" (UID: \"b1663b6e-7a8e-43d2-92c1-04f006ccbe74\") " pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" May 11 20:51:35.976777 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:35.976739 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" May 11 20:51:36.104202 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:36.104127 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-94789f4d5-885tt"] May 11 20:51:36.106921 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:51:36.106898 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1663b6e_7a8e_43d2_92c1_04f006ccbe74.slice/crio-111257797f513ef9f2dbace38a82be6d006d512d5924e1c4513987675204fcbf WatchSource:0}: Error finding container 111257797f513ef9f2dbace38a82be6d006d512d5924e1c4513987675204fcbf: Status 404 returned error can't find the container with id 111257797f513ef9f2dbace38a82be6d006d512d5924e1c4513987675204fcbf May 11 20:51:36.589915 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:36.589879 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" event={"ID":"b1663b6e-7a8e-43d2-92c1-04f006ccbe74","Type":"ContainerStarted","Data":"111257797f513ef9f2dbace38a82be6d006d512d5924e1c4513987675204fcbf"} May 11 20:51:38.598180 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:38.598148 2555 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" event={"ID":"b1663b6e-7a8e-43d2-92c1-04f006ccbe74","Type":"ContainerStarted","Data":"9748edc5ddd203ede82ea32216acabe7925ae2d13096770adb1bf112933cb4c1"} May 11 20:51:38.598180 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:38.598183 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" event={"ID":"b1663b6e-7a8e-43d2-92c1-04f006ccbe74","Type":"ContainerStarted","Data":"81b82c944774f5ddb167aebce18f685b2532454ad6ee46f09bf98d0aac9670d0"} May 11 20:51:38.615154 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:38.615104 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-94789f4d5-885tt" podStartSLOduration=2.046147956 podStartE2EDuration="3.615090431s" podCreationTimestamp="2026-05-11 20:51:35 +0000 UTC" firstStartedPulling="2026-05-11 20:51:36.108846195 +0000 UTC m=+41.320789322" lastFinishedPulling="2026-05-11 20:51:37.677788661 +0000 UTC m=+42.889731797" observedRunningTime="2026-05-11 20:51:38.613886905 +0000 UTC m=+43.825830080" watchObservedRunningTime="2026-05-11 20:51:38.615090431 +0000 UTC m=+43.827033579" May 11 20:51:40.422972 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.422936 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6"] May 11 20:51:40.424937 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.424920 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6" May 11 20:51:40.427661 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.427639 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-pkqmd\"" May 11 20:51:40.427807 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.427788 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" May 11 20:51:40.427998 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.427983 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" May 11 20:51:40.436085 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.436062 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8"] May 11 20:51:40.437854 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.437840 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" May 11 20:51:40.440587 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.440453 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" May 11 20:51:40.441214 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.441193 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-4k88x\"" May 11 20:51:40.441299 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.441260 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" May 11 20:51:40.441477 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.441197 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" May 11 20:51:40.443020 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.442477 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6"] May 11 20:51:40.459014 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.458979 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-rk44g"] May 11 20:51:40.461624 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.461597 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.461869 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.461845 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8"] May 11 20:51:40.464043 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.464021 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" May 11 20:51:40.464156 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.464074 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" May 11 20:51:40.464216 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.464160 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" May 11 20:51:40.464216 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.464176 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-h27ck\"" May 11 20:51:40.580051 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580010 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4926c338-371e-490e-b158-6a4f75127e87-kube-state-metrics-tls\") pod \"kube-state-metrics-7764dcf94f-tp4l8\" (UID: \"4926c338-371e-490e-b158-6a4f75127e87\") " pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" May 11 20:51:40.580051 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580051 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/85d9b11d-f50f-47a9-9ac6-8338abbe2824-sys\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " 
pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.580255 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580116 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4926c338-371e-490e-b158-6a4f75127e87-volume-directive-shadow\") pod \"kube-state-metrics-7764dcf94f-tp4l8\" (UID: \"4926c338-371e-490e-b158-6a4f75127e87\") " pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" May 11 20:51:40.580255 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580150 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/85d9b11d-f50f-47a9-9ac6-8338abbe2824-node-exporter-tls\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.580255 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580171 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/85d9b11d-f50f-47a9-9ac6-8338abbe2824-node-exporter-accelerators-collector-config\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.580255 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580195 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq6fk\" (UniqueName: \"kubernetes.io/projected/4926c338-371e-490e-b158-6a4f75127e87-kube-api-access-cq6fk\") pod \"kube-state-metrics-7764dcf94f-tp4l8\" (UID: \"4926c338-371e-490e-b158-6a4f75127e87\") " pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" May 11 20:51:40.580255 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580237 2555 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/85d9b11d-f50f-47a9-9ac6-8338abbe2824-node-exporter-wtmp\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.580395 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580275 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw8qv\" (UniqueName: \"kubernetes.io/projected/85d9b11d-f50f-47a9-9ac6-8338abbe2824-kube-api-access-tw8qv\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.580395 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580298 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/85d9b11d-f50f-47a9-9ac6-8338abbe2824-root\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.580395 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580360 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4926c338-371e-490e-b158-6a4f75127e87-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7764dcf94f-tp4l8\" (UID: \"4926c338-371e-490e-b158-6a4f75127e87\") " pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" May 11 20:51:40.580395 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580382 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7a50aad-97d1-425c-818e-7e0b02d396b7-metrics-client-ca\") pod 
\"openshift-state-metrics-5cc99f7c99-w5lm6\" (UID: \"c7a50aad-97d1-425c-818e-7e0b02d396b7\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6" May 11 20:51:40.580535 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580398 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/85d9b11d-f50f-47a9-9ac6-8338abbe2824-metrics-client-ca\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.580535 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580435 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/85d9b11d-f50f-47a9-9ac6-8338abbe2824-node-exporter-textfile\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.580535 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580496 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7a50aad-97d1-425c-818e-7e0b02d396b7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5cc99f7c99-w5lm6\" (UID: \"c7a50aad-97d1-425c-818e-7e0b02d396b7\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6" May 11 20:51:40.580616 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580553 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4926c338-371e-490e-b158-6a4f75127e87-metrics-client-ca\") pod \"kube-state-metrics-7764dcf94f-tp4l8\" (UID: \"4926c338-371e-490e-b158-6a4f75127e87\") " pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" May 11 20:51:40.580616 
ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580580 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7a50aad-97d1-425c-818e-7e0b02d396b7-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5cc99f7c99-w5lm6\" (UID: \"c7a50aad-97d1-425c-818e-7e0b02d396b7\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6" May 11 20:51:40.580616 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580599 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flts2\" (UniqueName: \"kubernetes.io/projected/c7a50aad-97d1-425c-818e-7e0b02d396b7-kube-api-access-flts2\") pod \"openshift-state-metrics-5cc99f7c99-w5lm6\" (UID: \"c7a50aad-97d1-425c-818e-7e0b02d396b7\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6" May 11 20:51:40.580704 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580628 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/85d9b11d-f50f-47a9-9ac6-8338abbe2824-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.580704 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.580660 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4926c338-371e-490e-b158-6a4f75127e87-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7764dcf94f-tp4l8\" (UID: \"4926c338-371e-490e-b158-6a4f75127e87\") " pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" May 11 20:51:40.682046 ip-10-0-133-205 
kubenswrapper[2555]: I0511 20:51:40.681961 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4926c338-371e-490e-b158-6a4f75127e87-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7764dcf94f-tp4l8\" (UID: \"4926c338-371e-490e-b158-6a4f75127e87\") " pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" May 11 20:51:40.682046 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682001 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7a50aad-97d1-425c-818e-7e0b02d396b7-metrics-client-ca\") pod \"openshift-state-metrics-5cc99f7c99-w5lm6\" (UID: \"c7a50aad-97d1-425c-818e-7e0b02d396b7\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6" May 11 20:51:40.682046 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682021 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/85d9b11d-f50f-47a9-9ac6-8338abbe2824-metrics-client-ca\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.682046 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682040 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/85d9b11d-f50f-47a9-9ac6-8338abbe2824-node-exporter-textfile\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.682363 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682180 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c7a50aad-97d1-425c-818e-7e0b02d396b7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5cc99f7c99-w5lm6\" (UID: \"c7a50aad-97d1-425c-818e-7e0b02d396b7\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6" May 11 20:51:40.682363 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682242 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4926c338-371e-490e-b158-6a4f75127e87-metrics-client-ca\") pod \"kube-state-metrics-7764dcf94f-tp4l8\" (UID: \"4926c338-371e-490e-b158-6a4f75127e87\") " pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" May 11 20:51:40.682363 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682286 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7a50aad-97d1-425c-818e-7e0b02d396b7-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5cc99f7c99-w5lm6\" (UID: \"c7a50aad-97d1-425c-818e-7e0b02d396b7\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6" May 11 20:51:40.682363 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682323 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flts2\" (UniqueName: \"kubernetes.io/projected/c7a50aad-97d1-425c-818e-7e0b02d396b7-kube-api-access-flts2\") pod \"openshift-state-metrics-5cc99f7c99-w5lm6\" (UID: \"c7a50aad-97d1-425c-818e-7e0b02d396b7\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6" May 11 20:51:40.682363 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682357 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/85d9b11d-f50f-47a9-9ac6-8338abbe2824-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.682629 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682396 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4926c338-371e-490e-b158-6a4f75127e87-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7764dcf94f-tp4l8\" (UID: \"4926c338-371e-490e-b158-6a4f75127e87\") " pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" May 11 20:51:40.682629 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682432 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/85d9b11d-f50f-47a9-9ac6-8338abbe2824-node-exporter-textfile\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.682629 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682453 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4926c338-371e-490e-b158-6a4f75127e87-kube-state-metrics-tls\") pod \"kube-state-metrics-7764dcf94f-tp4l8\" (UID: \"4926c338-371e-490e-b158-6a4f75127e87\") " pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" May 11 20:51:40.682629 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682479 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/85d9b11d-f50f-47a9-9ac6-8338abbe2824-sys\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.682629 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682523 2555 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4926c338-371e-490e-b158-6a4f75127e87-volume-directive-shadow\") pod \"kube-state-metrics-7764dcf94f-tp4l8\" (UID: \"4926c338-371e-490e-b158-6a4f75127e87\") " pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" May 11 20:51:40.682629 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682550 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/85d9b11d-f50f-47a9-9ac6-8338abbe2824-node-exporter-tls\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.682629 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682576 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/85d9b11d-f50f-47a9-9ac6-8338abbe2824-node-exporter-accelerators-collector-config\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.682629 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682619 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cq6fk\" (UniqueName: \"kubernetes.io/projected/4926c338-371e-490e-b158-6a4f75127e87-kube-api-access-cq6fk\") pod \"kube-state-metrics-7764dcf94f-tp4l8\" (UID: \"4926c338-371e-490e-b158-6a4f75127e87\") " pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" May 11 20:51:40.682986 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682643 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/85d9b11d-f50f-47a9-9ac6-8338abbe2824-node-exporter-wtmp\") pod \"node-exporter-rk44g\" (UID: 
\"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.682986 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682676 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tw8qv\" (UniqueName: \"kubernetes.io/projected/85d9b11d-f50f-47a9-9ac6-8338abbe2824-kube-api-access-tw8qv\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.682986 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682719 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/85d9b11d-f50f-47a9-9ac6-8338abbe2824-root\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.682986 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682814 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/85d9b11d-f50f-47a9-9ac6-8338abbe2824-metrics-client-ca\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.682986 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682835 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/85d9b11d-f50f-47a9-9ac6-8338abbe2824-root\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.682986 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.682865 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7a50aad-97d1-425c-818e-7e0b02d396b7-metrics-client-ca\") pod \"openshift-state-metrics-5cc99f7c99-w5lm6\" (UID: 
\"c7a50aad-97d1-425c-818e-7e0b02d396b7\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6" May 11 20:51:40.683349 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.683241 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4926c338-371e-490e-b158-6a4f75127e87-metrics-client-ca\") pod \"kube-state-metrics-7764dcf94f-tp4l8\" (UID: \"4926c338-371e-490e-b158-6a4f75127e87\") " pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" May 11 20:51:40.683510 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.683479 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4926c338-371e-490e-b158-6a4f75127e87-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7764dcf94f-tp4l8\" (UID: \"4926c338-371e-490e-b158-6a4f75127e87\") " pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" May 11 20:51:40.683589 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.683566 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/85d9b11d-f50f-47a9-9ac6-8338abbe2824-sys\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.683733 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.683709 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/85d9b11d-f50f-47a9-9ac6-8338abbe2824-node-exporter-wtmp\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g" May 11 20:51:40.683902 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.683877 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/85d9b11d-f50f-47a9-9ac6-8338abbe2824-node-exporter-accelerators-collector-config\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g"
May 11 20:51:40.684189 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.684162 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4926c338-371e-490e-b158-6a4f75127e87-volume-directive-shadow\") pod \"kube-state-metrics-7764dcf94f-tp4l8\" (UID: \"4926c338-371e-490e-b158-6a4f75127e87\") " pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8"
May 11 20:51:40.685242 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.685197 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/85d9b11d-f50f-47a9-9ac6-8338abbe2824-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g"
May 11 20:51:40.685422 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.685382 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4926c338-371e-490e-b158-6a4f75127e87-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7764dcf94f-tp4l8\" (UID: \"4926c338-371e-490e-b158-6a4f75127e87\") " pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8"
May 11 20:51:40.685490 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.685383 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/85d9b11d-f50f-47a9-9ac6-8338abbe2824-node-exporter-tls\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g"
May 11 20:51:40.685579 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.685564 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7a50aad-97d1-425c-818e-7e0b02d396b7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5cc99f7c99-w5lm6\" (UID: \"c7a50aad-97d1-425c-818e-7e0b02d396b7\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6"
May 11 20:51:40.686206 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.686185 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7a50aad-97d1-425c-818e-7e0b02d396b7-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5cc99f7c99-w5lm6\" (UID: \"c7a50aad-97d1-425c-818e-7e0b02d396b7\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6"
May 11 20:51:40.686298 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.686278 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4926c338-371e-490e-b158-6a4f75127e87-kube-state-metrics-tls\") pod \"kube-state-metrics-7764dcf94f-tp4l8\" (UID: \"4926c338-371e-490e-b158-6a4f75127e87\") " pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8"
May 11 20:51:40.691862 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.691830 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq6fk\" (UniqueName: \"kubernetes.io/projected/4926c338-371e-490e-b158-6a4f75127e87-kube-api-access-cq6fk\") pod \"kube-state-metrics-7764dcf94f-tp4l8\" (UID: \"4926c338-371e-490e-b158-6a4f75127e87\") " pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8"
May 11 20:51:40.692057 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.692036 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flts2\" (UniqueName: \"kubernetes.io/projected/c7a50aad-97d1-425c-818e-7e0b02d396b7-kube-api-access-flts2\") pod \"openshift-state-metrics-5cc99f7c99-w5lm6\" (UID: \"c7a50aad-97d1-425c-818e-7e0b02d396b7\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6"
May 11 20:51:40.692498 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.692477 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw8qv\" (UniqueName: \"kubernetes.io/projected/85d9b11d-f50f-47a9-9ac6-8338abbe2824-kube-api-access-tw8qv\") pod \"node-exporter-rk44g\" (UID: \"85d9b11d-f50f-47a9-9ac6-8338abbe2824\") " pod="openshift-monitoring/node-exporter-rk44g"
May 11 20:51:40.733696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.733653 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6"
May 11 20:51:40.749529 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.749496 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8"
May 11 20:51:40.770369 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.770331 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rk44g"
May 11 20:51:40.779545 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:51:40.779503 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85d9b11d_f50f_47a9_9ac6_8338abbe2824.slice/crio-e433b4e3baae21bce13e217b72bfa807539d0e894bc94ef3189b27f44e938f1c WatchSource:0}: Error finding container e433b4e3baae21bce13e217b72bfa807539d0e894bc94ef3189b27f44e938f1c: Status 404 returned error can't find the container with id e433b4e3baae21bce13e217b72bfa807539d0e894bc94ef3189b27f44e938f1c
May 11 20:51:40.871138 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.871109 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6"]
May 11 20:51:40.874745 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:51:40.874715 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7a50aad_97d1_425c_818e_7e0b02d396b7.slice/crio-8179ead4d9aa3fb2a6b637226549032ef497a80a9f5e4b334d98077b3553ec0f WatchSource:0}: Error finding container 8179ead4d9aa3fb2a6b637226549032ef497a80a9f5e4b334d98077b3553ec0f: Status 404 returned error can't find the container with id 8179ead4d9aa3fb2a6b637226549032ef497a80a9f5e4b334d98077b3553ec0f
May 11 20:51:40.889931 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:40.889905 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8"]
May 11 20:51:40.892972 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:51:40.892944 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4926c338_371e_490e_b158_6a4f75127e87.slice/crio-959f2a5bb7dbd386025d8b63c772ee9500820372066bfe9f658359aa2e881424 WatchSource:0}: Error finding container 959f2a5bb7dbd386025d8b63c772ee9500820372066bfe9f658359aa2e881424: Status 404 returned error can't find the container with id 959f2a5bb7dbd386025d8b63c772ee9500820372066bfe9f658359aa2e881424
May 11 20:51:41.505860 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.505825 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
May 11 20:51:41.509315 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.509288 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.512168 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.512145 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
May 11 20:51:41.512475 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.512451 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
May 11 20:51:41.512656 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.512639 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
May 11 20:51:41.513088 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.513073 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
May 11 20:51:41.513949 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.513457 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
May 11 20:51:41.513949 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.513528 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
May 11 20:51:41.513949 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.513709 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
May 11 20:51:41.513949 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.513903 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
May 11 20:51:41.514201 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.514092 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-h54qp\""
May 11 20:51:41.514381 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.514291 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
May 11 20:51:41.522433 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.522384 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
May 11 20:51:41.608763 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.608723 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" event={"ID":"4926c338-371e-490e-b158-6a4f75127e87","Type":"ContainerStarted","Data":"959f2a5bb7dbd386025d8b63c772ee9500820372066bfe9f658359aa2e881424"}
May 11 20:51:41.610190 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.610163 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rk44g" event={"ID":"85d9b11d-f50f-47a9-9ac6-8338abbe2824","Type":"ContainerStarted","Data":"e433b4e3baae21bce13e217b72bfa807539d0e894bc94ef3189b27f44e938f1c"}
May 11 20:51:41.613788 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.613766 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6" event={"ID":"c7a50aad-97d1-425c-818e-7e0b02d396b7","Type":"ContainerStarted","Data":"00d77c65560c41e7fa669fa37d6d5eeab004882b8c9ec0b7ffd2bd5f7d6411f4"}
May 11 20:51:41.613895 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.613791 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6" event={"ID":"c7a50aad-97d1-425c-818e-7e0b02d396b7","Type":"ContainerStarted","Data":"62194d86b038263397582899f7aab642660682274e00fae8b11dee69ba61969a"}
May 11 20:51:41.613895 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.613802 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6" event={"ID":"c7a50aad-97d1-425c-818e-7e0b02d396b7","Type":"ContainerStarted","Data":"8179ead4d9aa3fb2a6b637226549032ef497a80a9f5e4b334d98077b3553ec0f"}
May 11 20:51:41.691561 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.691525 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.691720 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.691584 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.691720 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.691611 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.691720 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.691636 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.691720 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.691681 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.691720 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.691715 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.691959 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.691738 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-config-volume\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.691959 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.691768 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-web-config\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.691959 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.691805 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.691959 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.691834 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-config-out\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.691959 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.691858 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.692125 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.691957 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.692125 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.691989 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmj86\" (UniqueName: \"kubernetes.io/projected/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-kube-api-access-wmj86\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.793076 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.792981 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-web-config\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.793076 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.793050 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.793307 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.793079 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-config-out\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.793601 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.793564 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.793725 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.793668 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.793725 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.793699 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmj86\" (UniqueName: \"kubernetes.io/projected/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-kube-api-access-wmj86\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.793854 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.793727 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.793854 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.793777 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.794510 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.793940 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.794510 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.794006 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.794510 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.794051 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.794510 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.794093 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.794510 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.794120 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-config-volume\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.794510 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.794198 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.794510 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.794198 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.796083 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.796059 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-config-out\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.796667 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.796643 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.797133 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.797106 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.797381 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.797357 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-web-config\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.797533 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.797512 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-config-volume\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.797682 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.797665 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.798180 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.798163 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.799213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.799189 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.799458 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.799436 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.799610 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.799585 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.803162 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.803120 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmj86\" (UniqueName: \"kubernetes.io/projected/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-kube-api-access-wmj86\") pod \"alertmanager-main-0\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:41.822174 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:41.822144 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
May 11 20:51:42.192717 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.192673 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
May 11 20:51:42.318641 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:51:42.318549 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb273bbf_f5bb_44c6_bcba_41e2cbf4db28.slice/crio-09f207a23f92f0ea89aac2f83680985fc412123af7bf9d0f724f67d1dda35c55 WatchSource:0}: Error finding container 09f207a23f92f0ea89aac2f83680985fc412123af7bf9d0f724f67d1dda35c55: Status 404 returned error can't find the container with id 09f207a23f92f0ea89aac2f83680985fc412123af7bf9d0f724f67d1dda35c55
May 11 20:51:42.391199 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.391167 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-79c5d8458-z59p7"]
May 11 20:51:42.397489 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.397466 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.400523 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.400483 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
May 11 20:51:42.400642 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.400525 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
May 11 20:51:42.400642 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.400629 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
May 11 20:51:42.400642 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.400635 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-7hd6b\""
May 11 20:51:42.400792 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.400651 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
May 11 20:51:42.400939 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.400904 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
May 11 20:51:42.400939 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.400922 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-dm78976euqopt\""
May 11 20:51:42.411474 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.411397 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-79c5d8458-z59p7"]
May 11 20:51:42.501907 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.501810 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5r54\" (UniqueName: \"kubernetes.io/projected/d3c9d508-0fdf-437d-b335-2fa5981af881-kube-api-access-s5r54\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.501907 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.501847 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d3c9d508-0fdf-437d-b335-2fa5981af881-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.502087 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.501917 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d3c9d508-0fdf-437d-b335-2fa5981af881-secret-grpc-tls\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.502087 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.501970 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d3c9d508-0fdf-437d-b335-2fa5981af881-secret-thanos-querier-tls\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.502087 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.502031 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d3c9d508-0fdf-437d-b335-2fa5981af881-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.502087 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.502073 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d3c9d508-0fdf-437d-b335-2fa5981af881-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.502325 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.502146 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d3c9d508-0fdf-437d-b335-2fa5981af881-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.502325 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.502202 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3c9d508-0fdf-437d-b335-2fa5981af881-metrics-client-ca\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.603189 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.603116 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d3c9d508-0fdf-437d-b335-2fa5981af881-secret-grpc-tls\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.603189 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.603162 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d3c9d508-0fdf-437d-b335-2fa5981af881-secret-thanos-querier-tls\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.603710 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.603198 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d3c9d508-0fdf-437d-b335-2fa5981af881-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.603710 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.603236 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d3c9d508-0fdf-437d-b335-2fa5981af881-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.603710 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.603282 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d3c9d508-0fdf-437d-b335-2fa5981af881-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.603710 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.603324 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3c9d508-0fdf-437d-b335-2fa5981af881-metrics-client-ca\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.603710 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.603376 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5r54\" (UniqueName: \"kubernetes.io/projected/d3c9d508-0fdf-437d-b335-2fa5981af881-kube-api-access-s5r54\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.603710 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.603431 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d3c9d508-0fdf-437d-b335-2fa5981af881-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.604159 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.604131 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3c9d508-0fdf-437d-b335-2fa5981af881-metrics-client-ca\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.606451 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.606324 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d3c9d508-0fdf-437d-b335-2fa5981af881-secret-grpc-tls\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.607679 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.607640 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d3c9d508-0fdf-437d-b335-2fa5981af881-secret-thanos-querier-tls\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.607795 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.607695 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d3c9d508-0fdf-437d-b335-2fa5981af881-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.608784 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.608764 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d3c9d508-0fdf-437d-b335-2fa5981af881-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7"
May 11 20:51:42.608869 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.608808 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d3c9d508-0fdf-437d-b335-2fa5981af881-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-79c5d8458-z59p7\" (UID:
\"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7" May 11 20:51:42.609898 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.609854 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d3c9d508-0fdf-437d-b335-2fa5981af881-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7" May 11 20:51:42.611737 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.611722 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5r54\" (UniqueName: \"kubernetes.io/projected/d3c9d508-0fdf-437d-b335-2fa5981af881-kube-api-access-s5r54\") pod \"thanos-querier-79c5d8458-z59p7\" (UID: \"d3c9d508-0fdf-437d-b335-2fa5981af881\") " pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7" May 11 20:51:42.617570 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.617547 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28","Type":"ContainerStarted","Data":"09f207a23f92f0ea89aac2f83680985fc412123af7bf9d0f724f67d1dda35c55"} May 11 20:51:42.619153 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.619131 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" event={"ID":"4926c338-371e-490e-b158-6a4f75127e87","Type":"ContainerStarted","Data":"8fffab275f89ba6adac202bc9a5ea9fdbcc845a618dec91c1a0e1a27ad3690b8"} May 11 20:51:42.619240 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.619159 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" 
event={"ID":"4926c338-371e-490e-b158-6a4f75127e87","Type":"ContainerStarted","Data":"9004f40d94e0c817c9b957234eee2af1d3651e29993e046f306b3884d6e9e3e5"} May 11 20:51:42.619240 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.619174 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" event={"ID":"4926c338-371e-490e-b158-6a4f75127e87","Type":"ContainerStarted","Data":"d872b491d6348098d5debeee0e9c0a3742be1ec7dc119c867372b737003644e2"} May 11 20:51:42.620647 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.620628 2555 generic.go:358] "Generic (PLEG): container finished" podID="85d9b11d-f50f-47a9-9ac6-8338abbe2824" containerID="33bd5809c3a4675c9d2c9522977b568451dda55495a11538d441e5925702b2ac" exitCode=0 May 11 20:51:42.620730 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.620687 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rk44g" event={"ID":"85d9b11d-f50f-47a9-9ac6-8338abbe2824","Type":"ContainerDied","Data":"33bd5809c3a4675c9d2c9522977b568451dda55495a11538d441e5925702b2ac"} May 11 20:51:42.622671 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.622649 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6" event={"ID":"c7a50aad-97d1-425c-818e-7e0b02d396b7","Type":"ContainerStarted","Data":"460cabc8cb4387ee07d5c40cd0adcb3ddcd20e67a9038920f9fdb2b4a1aa4b79"} May 11 20:51:42.637390 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.637350 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7764dcf94f-tp4l8" podStartSLOduration=1.480524068 podStartE2EDuration="2.637334292s" podCreationTimestamp="2026-05-11 20:51:40 +0000 UTC" firstStartedPulling="2026-05-11 20:51:40.894689335 +0000 UTC m=+46.106632462" lastFinishedPulling="2026-05-11 20:51:42.051499555 +0000 UTC m=+47.263442686" observedRunningTime="2026-05-11 
20:51:42.636174887 +0000 UTC m=+47.848118038" watchObservedRunningTime="2026-05-11 20:51:42.637334292 +0000 UTC m=+47.849277442" May 11 20:51:42.659189 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.659131 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-w5lm6" podStartSLOduration=1.239686021 podStartE2EDuration="2.659111318s" podCreationTimestamp="2026-05-11 20:51:40 +0000 UTC" firstStartedPulling="2026-05-11 20:51:41.001168902 +0000 UTC m=+46.213112042" lastFinishedPulling="2026-05-11 20:51:42.420594202 +0000 UTC m=+47.632537339" observedRunningTime="2026-05-11 20:51:42.65844933 +0000 UTC m=+47.870392483" watchObservedRunningTime="2026-05-11 20:51:42.659111318 +0000 UTC m=+47.871054468" May 11 20:51:42.707380 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.707353 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7" May 11 20:51:42.864317 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:42.864206 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-79c5d8458-z59p7"] May 11 20:51:42.870286 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:51:42.870251 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3c9d508_0fdf_437d_b335_2fa5981af881.slice/crio-5f9578577a6e80b87ddbea38cda6dff75130c1ee54ba8f46c1cdcefa359c7a58 WatchSource:0}: Error finding container 5f9578577a6e80b87ddbea38cda6dff75130c1ee54ba8f46c1cdcefa359c7a58: Status 404 returned error can't find the container with id 5f9578577a6e80b87ddbea38cda6dff75130c1ee54ba8f46c1cdcefa359c7a58 May 11 20:51:43.628116 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:43.628075 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rk44g" 
event={"ID":"85d9b11d-f50f-47a9-9ac6-8338abbe2824","Type":"ContainerStarted","Data":"799deeb6519e499aa4e73ba36917d066eee569625b9c4337e56ee49526e9b057"} May 11 20:51:43.628116 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:43.628119 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rk44g" event={"ID":"85d9b11d-f50f-47a9-9ac6-8338abbe2824","Type":"ContainerStarted","Data":"f2aaf7b786a4760fbe5bcb50bb2d781ca200a189ccc7da2d90879045a1de6eeb"} May 11 20:51:43.630684 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:43.630349 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7" event={"ID":"d3c9d508-0fdf-437d-b335-2fa5981af881","Type":"ContainerStarted","Data":"5f9578577a6e80b87ddbea38cda6dff75130c1ee54ba8f46c1cdcefa359c7a58"} May 11 20:51:43.649248 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:43.649202 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-rk44g" podStartSLOduration=2.387381145 podStartE2EDuration="3.649187583s" podCreationTimestamp="2026-05-11 20:51:40 +0000 UTC" firstStartedPulling="2026-05-11 20:51:40.781266719 +0000 UTC m=+45.993209846" lastFinishedPulling="2026-05-11 20:51:42.043073147 +0000 UTC m=+47.255016284" observedRunningTime="2026-05-11 20:51:43.648327401 +0000 UTC m=+48.860270551" watchObservedRunningTime="2026-05-11 20:51:43.649187583 +0000 UTC m=+48.861130732" May 11 20:51:44.588641 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.588608 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-f79n6" May 11 20:51:44.634654 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.634619 2555 generic.go:358] "Generic (PLEG): container finished" podID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerID="7c8f78e52bb34e7bc07cd92704c7f94f764fcb5ef407bba94f745dc1fb61b490" exitCode=0 May 11 20:51:44.635117 ip-10-0-133-205 
kubenswrapper[2555]: I0511 20:51:44.634710 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28","Type":"ContainerDied","Data":"7c8f78e52bb34e7bc07cd92704c7f94f764fcb5ef407bba94f745dc1fb61b490"} May 11 20:51:44.790485 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.790451 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-56cfb48457-79rdn"] May 11 20:51:44.814425 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.814368 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-56cfb48457-79rdn"] May 11 20:51:44.814593 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.814555 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.817645 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.817456 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" May 11 20:51:44.817645 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.817538 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-msq79\"" May 11 20:51:44.817645 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.817539 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" May 11 20:51:44.817645 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.817554 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" May 11 20:51:44.817905 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.817653 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-b3d8fm4or8rnn\"" May 11 
20:51:44.817905 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.817860 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" May 11 20:51:44.827304 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.827277 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-secret-metrics-server-client-certs\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.827440 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.827336 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-secret-metrics-server-tls\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.827440 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.827375 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.827576 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.827491 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j9m5\" (UniqueName: \"kubernetes.io/projected/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-kube-api-access-5j9m5\") pod \"metrics-server-56cfb48457-79rdn\" (UID: 
\"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.827576 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.827526 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-client-ca-bundle\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.827669 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.827583 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-audit-log\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.827669 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.827619 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-metrics-server-audit-profiles\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.928308 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.928231 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.928454 ip-10-0-133-205 kubenswrapper[2555]: I0511 
20:51:44.928335 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5j9m5\" (UniqueName: \"kubernetes.io/projected/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-kube-api-access-5j9m5\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.928454 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.928369 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-client-ca-bundle\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.928454 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.928399 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-audit-log\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.928454 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.928449 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-metrics-server-audit-profiles\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.928659 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.928493 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-secret-metrics-server-client-certs\") pod 
\"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.928659 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.928520 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-secret-metrics-server-tls\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.929088 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.929047 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-audit-log\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.929464 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.929435 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.931307 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.930210 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-metrics-server-audit-profiles\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.933967 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.931987 2555 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-client-ca-bundle\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.934337 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.934315 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-secret-metrics-server-client-certs\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.934439 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.934335 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-secret-metrics-server-tls\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:44.937101 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:44.937081 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j9m5\" (UniqueName: \"kubernetes.io/projected/e674bf6f-247c-4cf4-9b96-017b2d3afcd1-kube-api-access-5j9m5\") pod \"metrics-server-56cfb48457-79rdn\" (UID: \"e674bf6f-247c-4cf4-9b96-017b2d3afcd1\") " pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:45.126322 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:45.126279 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" May 11 20:51:45.204460 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:45.204379 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-655d88fc6c-6ncns"] May 11 20:51:45.219661 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:45.219631 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-655d88fc6c-6ncns"] May 11 20:51:45.219827 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:45.219688 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6ncns" May 11 20:51:45.222814 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:45.222787 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-fqtxg\"" May 11 20:51:45.222926 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:45.222834 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" May 11 20:51:45.263007 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:45.262978 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-56cfb48457-79rdn"] May 11 20:51:45.332128 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:45.332100 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e2575669-fb18-4ca8-88ff-2be6e01c5fa9-monitoring-plugin-cert\") pod \"monitoring-plugin-655d88fc6c-6ncns\" (UID: \"e2575669-fb18-4ca8-88ff-2be6e01c5fa9\") " pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6ncns" May 11 20:51:45.367582 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:51:45.367550 2555 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode674bf6f_247c_4cf4_9b96_017b2d3afcd1.slice/crio-77fd8d3fda9d169cb6bef9f3c74a43ac616fa4a866b46e8c88dae80398b06a21 WatchSource:0}: Error finding container 77fd8d3fda9d169cb6bef9f3c74a43ac616fa4a866b46e8c88dae80398b06a21: Status 404 returned error can't find the container with id 77fd8d3fda9d169cb6bef9f3c74a43ac616fa4a866b46e8c88dae80398b06a21 May 11 20:51:45.433229 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:45.433203 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e2575669-fb18-4ca8-88ff-2be6e01c5fa9-monitoring-plugin-cert\") pod \"monitoring-plugin-655d88fc6c-6ncns\" (UID: \"e2575669-fb18-4ca8-88ff-2be6e01c5fa9\") " pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6ncns" May 11 20:51:45.436046 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:45.436024 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e2575669-fb18-4ca8-88ff-2be6e01c5fa9-monitoring-plugin-cert\") pod \"monitoring-plugin-655d88fc6c-6ncns\" (UID: \"e2575669-fb18-4ca8-88ff-2be6e01c5fa9\") " pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6ncns" May 11 20:51:45.532093 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:45.532063 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6ncns" May 11 20:51:45.639933 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:45.639868 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7" event={"ID":"d3c9d508-0fdf-437d-b335-2fa5981af881","Type":"ContainerStarted","Data":"f3872dd8046e6c1fd0349e9e1e5dd0dcb6d6e2e8e58ffe810a3bb54c7c80462f"} May 11 20:51:45.647093 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:45.647031 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" event={"ID":"e674bf6f-247c-4cf4-9b96-017b2d3afcd1","Type":"ContainerStarted","Data":"77fd8d3fda9d169cb6bef9f3c74a43ac616fa4a866b46e8c88dae80398b06a21"} May 11 20:51:45.685358 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:45.685068 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-655d88fc6c-6ncns"] May 11 20:51:45.686965 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:51:45.686935 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2575669_fb18_4ca8_88ff_2be6e01c5fa9.slice/crio-0cc7f4513daaf992abb7f0ffa6c8b134189db01ac665a0404a0eefa0ee65dde3 WatchSource:0}: Error finding container 0cc7f4513daaf992abb7f0ffa6c8b134189db01ac665a0404a0eefa0ee65dde3: Status 404 returned error can't find the container with id 0cc7f4513daaf992abb7f0ffa6c8b134189db01ac665a0404a0eefa0ee65dde3 May 11 20:51:46.649426 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.644679 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] May 11 20:51:46.658961 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.658912 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7" 
event={"ID":"d3c9d508-0fdf-437d-b335-2fa5981af881","Type":"ContainerStarted","Data":"d57067486252e10294d9b9a9a98470c361912d26aa5f885edaecbf64b30fa3cc"} May 11 20:51:46.658961 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.658943 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7" event={"ID":"d3c9d508-0fdf-437d-b335-2fa5981af881","Type":"ContainerStarted","Data":"bc534497f7c1ce4adf9618e2049db2daeef0b1b990e0701135b00bc248b4caf3"} May 11 20:51:46.659196 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.659076 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.661282 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.660032 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6ncns" event={"ID":"e2575669-fb18-4ca8-88ff-2be6e01c5fa9","Type":"ContainerStarted","Data":"0cc7f4513daaf992abb7f0ffa6c8b134189db01ac665a0404a0eefa0ee65dde3"} May 11 20:51:46.662776 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.662751 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" May 11 20:51:46.663082 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.663037 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" May 11 20:51:46.663328 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.663276 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" May 11 20:51:46.663582 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.663564 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" May 11 20:51:46.663958 ip-10-0-133-205 kubenswrapper[2555]: 
I0511 20:51:46.663938 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" May 11 20:51:46.664146 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.663828 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" May 11 20:51:46.664534 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.664516 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" May 11 20:51:46.665352 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.665291 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" May 11 20:51:46.665571 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.665556 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-f0r19f6ett59e\"" May 11 20:51:46.665754 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.665738 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" May 11 20:51:46.666580 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.665798 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-4gcn8\"" May 11 20:51:46.669089 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.667454 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" May 11 20:51:46.669089 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.668645 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] May 11 20:51:46.674666 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.674639 2555 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" May 11 20:51:46.675237 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.675216 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" May 11 20:51:46.744906 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.744235 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.744906 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.744290 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.744906 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.744334 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-config\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.744906 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.744362 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " 
pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.744906 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.744387 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.744906 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.744441 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.744906 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.744481 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.744906 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.744506 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.744906 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.744534 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.744906 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.744561 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.744906 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.744601 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.744906 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.744627 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-web-config\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.744906 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.744664 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c41ae7e7-b224-431f-92c6-faef8aeb0669-config-out\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.744906 ip-10-0-133-205 
kubenswrapper[2555]: I0511 20:51:46.744684 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.744906 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.744705 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c41ae7e7-b224-431f-92c6-faef8aeb0669-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.744906 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.744726 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf688\" (UniqueName: \"kubernetes.io/projected/c41ae7e7-b224-431f-92c6-faef8aeb0669-kube-api-access-mf688\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.745678 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.744766 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c41ae7e7-b224-431f-92c6-faef8aeb0669-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.745678 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.744788 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-prometheus-k8s-thanos-sidecar-tls\") pod 
\"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.845696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.845659 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.845895 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.845713 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.845895 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.845752 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-config\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.845895 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.845778 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.845895 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.845805 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.845895 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.845834 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.845895 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.845875 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.845895 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.845893 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.846297 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.845915 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.846297 ip-10-0-133-205 kubenswrapper[2555]: I0511 
20:51:46.845944 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.846297 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.845991 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.846297 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.846018 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-web-config\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.846297 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.846045 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c41ae7e7-b224-431f-92c6-faef8aeb0669-config-out\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.846297 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.846060 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.846297 ip-10-0-133-205 kubenswrapper[2555]: 
I0511 20:51:46.846074 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c41ae7e7-b224-431f-92c6-faef8aeb0669-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.846297 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.846088 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf688\" (UniqueName: \"kubernetes.io/projected/c41ae7e7-b224-431f-92c6-faef8aeb0669-kube-api-access-mf688\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.846297 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.846116 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c41ae7e7-b224-431f-92c6-faef8aeb0669-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.846297 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.846132 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.847979 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.847130 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 
11 20:51:46.847979 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.847702 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.849152 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.849124 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.850843 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.850531 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.850843 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.850805 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c41ae7e7-b224-431f-92c6-faef8aeb0669-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.851232 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.851207 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-config\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " 
pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.853902 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.853882 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.854258 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.854234 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-web-config\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.855120 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.855103 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c41ae7e7-b224-431f-92c6-faef8aeb0669-config-out\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.855300 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.855279 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c41ae7e7-b224-431f-92c6-faef8aeb0669-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.855473 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.855439 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 
11 20:51:46.855604 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.855575 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.855873 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.855616 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.855873 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.855723 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.855873 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.855757 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.856961 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.856934 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-thanos-prometheus-http-client-file\") pod 
\"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.858424 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.858377 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.858734 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.858714 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf688\" (UniqueName: \"kubernetes.io/projected/c41ae7e7-b224-431f-92c6-faef8aeb0669-kube-api-access-mf688\") pod \"prometheus-k8s-0\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:46.978951 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:46.978919 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" May 11 20:51:47.620139 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:47.620110 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] May 11 20:51:47.623116 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:51:47.623084 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc41ae7e7_b224_431f_92c6_faef8aeb0669.slice/crio-3c245c7a19899544ffab58f485e5be824e379fa8210ef470aa040065f386c764 WatchSource:0}: Error finding container 3c245c7a19899544ffab58f485e5be824e379fa8210ef470aa040065f386c764: Status 404 returned error can't find the container with id 3c245c7a19899544ffab58f485e5be824e379fa8210ef470aa040065f386c764 May 11 20:51:47.672008 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:47.671975 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" event={"ID":"e674bf6f-247c-4cf4-9b96-017b2d3afcd1","Type":"ContainerStarted","Data":"ca8d3e7db2e8eaca0b613f540af2241d661be4192e3386caf488900d7c8bcc54"} May 11 20:51:47.673851 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:47.673815 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28","Type":"ContainerStarted","Data":"67a28476e2f797a4a4dd3c11ac62d2b140efa3faddb633fc60f9fd54444d584c"} May 11 20:51:47.674783 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:47.674768 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c41ae7e7-b224-431f-92c6-faef8aeb0669","Type":"ContainerStarted","Data":"3c245c7a19899544ffab58f485e5be824e379fa8210ef470aa040065f386c764"} May 11 20:51:47.712228 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:47.712178 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/metrics-server-56cfb48457-79rdn" podStartSLOduration=1.649913711 podStartE2EDuration="3.712163471s" podCreationTimestamp="2026-05-11 20:51:44 +0000 UTC" firstStartedPulling="2026-05-11 20:51:45.387956139 +0000 UTC m=+50.599899271" lastFinishedPulling="2026-05-11 20:51:47.4502059 +0000 UTC m=+52.662149031" observedRunningTime="2026-05-11 20:51:47.711288583 +0000 UTC m=+52.923231734" watchObservedRunningTime="2026-05-11 20:51:47.712163471 +0000 UTC m=+52.924106621" May 11 20:51:48.683494 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:48.683253 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7" event={"ID":"d3c9d508-0fdf-437d-b335-2fa5981af881","Type":"ContainerStarted","Data":"2aebc52b8afb058b03824c7bb527db48bcabf4b6714507932512b4f73525769e"} May 11 20:51:48.683494 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:48.683288 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7" event={"ID":"d3c9d508-0fdf-437d-b335-2fa5981af881","Type":"ContainerStarted","Data":"b443dc55ee5994cbdb54ff3e647c534973ab183363b72e75abee9455120cd529"} May 11 20:51:48.686083 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:48.686039 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28","Type":"ContainerStarted","Data":"9e95209df0c47fdaca2c4da9b614ce04123507d030b6f2ce2c472a81eb19f9a9"} May 11 20:51:48.686083 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:48.686067 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28","Type":"ContainerStarted","Data":"2dcd924ec3170aa9df5a1f712dc5a51f75c327fa07c7500fc345756ee84bab23"} May 11 20:51:48.687463 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:48.687437 2555 generic.go:358] "Generic (PLEG): container finished" 
podID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerID="43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46" exitCode=0 May 11 20:51:48.687588 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:48.687563 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c41ae7e7-b224-431f-92c6-faef8aeb0669","Type":"ContainerDied","Data":"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46"} May 11 20:51:49.697719 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:49.697679 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7" event={"ID":"d3c9d508-0fdf-437d-b335-2fa5981af881","Type":"ContainerStarted","Data":"5b7eb68a6781aa305c7d34a2f84429853017d2fcccec9caa43fcdab7ddc5312b"} May 11 20:51:49.698182 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:49.697918 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7" May 11 20:51:49.701339 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:49.701307 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28","Type":"ContainerStarted","Data":"df6f3a9ee459c1240705709dffce0fbfc54c670cf7487e5cb551249fb4678b5a"} May 11 20:51:49.701484 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:49.701347 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28","Type":"ContainerStarted","Data":"3dcb3150575535632ef8ede1fa9b83aac254b65323de2272e85ee263e54c7651"} May 11 20:51:49.701484 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:49.701361 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28","Type":"ContainerStarted","Data":"6ecd10318588e40919af87cf22a5c509c9008dee217e9a781474d9ed7c24404e"} May 11 20:51:49.703187 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:49.703152 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6ncns" event={"ID":"e2575669-fb18-4ca8-88ff-2be6e01c5fa9","Type":"ContainerStarted","Data":"110fac4336f9d18fb15eab3ebd9558319903729fc80a16d4cb5ec29e336302d3"} May 11 20:51:49.703715 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:49.703688 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6ncns" May 11 20:51:49.709253 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:49.709232 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6ncns" May 11 20:51:49.722449 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:49.722366 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7" podStartSLOduration=2.301980412 podStartE2EDuration="7.722350508s" podCreationTimestamp="2026-05-11 20:51:42 +0000 UTC" firstStartedPulling="2026-05-11 20:51:42.873275121 +0000 UTC m=+48.085218252" lastFinishedPulling="2026-05-11 20:51:48.293645205 +0000 UTC m=+53.505588348" observedRunningTime="2026-05-11 20:51:49.720399618 +0000 UTC m=+54.932342771" watchObservedRunningTime="2026-05-11 20:51:49.722350508 +0000 UTC m=+54.934293660" May 11 20:51:49.749814 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:49.749766 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.620255151 podStartE2EDuration="8.749744443s" podCreationTimestamp="2026-05-11 20:51:41 +0000 UTC" firstStartedPulling="2026-05-11 20:51:42.320717993 +0000 UTC m=+47.532661121" 
lastFinishedPulling="2026-05-11 20:51:47.450207281 +0000 UTC m=+52.662150413" observedRunningTime="2026-05-11 20:51:49.747887503 +0000 UTC m=+54.959830690" watchObservedRunningTime="2026-05-11 20:51:49.749744443 +0000 UTC m=+54.961687593" May 11 20:51:49.764471 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:49.764395 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6ncns" podStartSLOduration=1.9297443680000002 podStartE2EDuration="4.764374824s" podCreationTimestamp="2026-05-11 20:51:45 +0000 UTC" firstStartedPulling="2026-05-11 20:51:45.689535932 +0000 UTC m=+50.901479068" lastFinishedPulling="2026-05-11 20:51:48.52416638 +0000 UTC m=+53.736109524" observedRunningTime="2026-05-11 20:51:49.763686328 +0000 UTC m=+54.975629480" watchObservedRunningTime="2026-05-11 20:51:49.764374824 +0000 UTC m=+54.976317976" May 11 20:51:50.713388 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:50.713361 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-79c5d8458-z59p7" May 11 20:51:52.716251 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:52.716164 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c41ae7e7-b224-431f-92c6-faef8aeb0669","Type":"ContainerStarted","Data":"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2"} May 11 20:51:52.716251 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:52.716207 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c41ae7e7-b224-431f-92c6-faef8aeb0669","Type":"ContainerStarted","Data":"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45"} May 11 20:51:52.716251 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:52.716221 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"c41ae7e7-b224-431f-92c6-faef8aeb0669","Type":"ContainerStarted","Data":"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7"}
May 11 20:51:52.716251 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:52.716233 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c41ae7e7-b224-431f-92c6-faef8aeb0669","Type":"ContainerStarted","Data":"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17"}
May 11 20:51:52.716251 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:52.716246 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c41ae7e7-b224-431f-92c6-faef8aeb0669","Type":"ContainerStarted","Data":"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339"}
May 11 20:51:52.716763 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:52.716258 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c41ae7e7-b224-431f-92c6-faef8aeb0669","Type":"ContainerStarted","Data":"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea"}
May 11 20:51:54.555996 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:54.555965 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gkzk7"
May 11 20:51:54.587004 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:54.586935 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=5.4442641290000005 podStartE2EDuration="8.586915583s" podCreationTimestamp="2026-05-11 20:51:46 +0000 UTC" firstStartedPulling="2026-05-11 20:51:48.688739034 +0000 UTC m=+53.900682175" lastFinishedPulling="2026-05-11 20:51:51.831390494 +0000 UTC m=+57.043333629" observedRunningTime="2026-05-11 20:51:52.765241109 +0000 UTC m=+57.977184284" watchObservedRunningTime="2026-05-11 20:51:54.586915583 +0000 UTC m=+59.798858751"
May 11 20:51:56.979031 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:51:56.978998 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
May 11 20:52:01.078220 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:01.078173 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs\") pod \"network-metrics-daemon-fq6hx\" (UID: \"692ffb95-b8bb-4e21-9e37-a9bad55c11be\") " pod="openshift-multus/network-metrics-daemon-fq6hx"
May 11 20:52:01.081181 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:01.081162 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
May 11 20:52:01.091766 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:01.091733 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/692ffb95-b8bb-4e21-9e37-a9bad55c11be-metrics-certs\") pod \"network-metrics-daemon-fq6hx\" (UID: \"692ffb95-b8bb-4e21-9e37-a9bad55c11be\") " pod="openshift-multus/network-metrics-daemon-fq6hx"
May 11 20:52:01.177967 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:01.177932 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mdvls\""
May 11 20:52:01.179200 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:01.179178 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhs75\" (UniqueName: \"kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75\") pod \"network-check-target-6z6rl\" (UID: \"ff9e0b72-a1d2-4476-a4d4-db6a3425a266\") " pod="openshift-network-diagnostics/network-check-target-6z6rl"
May 11 20:52:01.181471 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:01.181453 2555 reflector.go:430] "Caches
populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
May 11 20:52:01.185503 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:01.185482 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fq6hx"
May 11 20:52:01.192478 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:01.192454 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
May 11 20:52:01.203279 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:01.203255 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhs75\" (UniqueName: \"kubernetes.io/projected/ff9e0b72-a1d2-4476-a4d4-db6a3425a266-kube-api-access-qhs75\") pod \"network-check-target-6z6rl\" (UID: \"ff9e0b72-a1d2-4476-a4d4-db6a3425a266\") " pod="openshift-network-diagnostics/network-check-target-6z6rl"
May 11 20:52:01.314942 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:01.314908 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fq6hx"]
May 11 20:52:01.318459 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:52:01.318423 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod692ffb95_b8bb_4e21_9e37_a9bad55c11be.slice/crio-05f75e62cb105772f7505a03bd605028d06a1e33cb7e49259a670a89b9430c4a WatchSource:0}: Error finding container 05f75e62cb105772f7505a03bd605028d06a1e33cb7e49259a670a89b9430c4a: Status 404 returned error can't find the container with id 05f75e62cb105772f7505a03bd605028d06a1e33cb7e49259a670a89b9430c4a
May 11 20:52:01.471872 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:01.471787 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kqkzb\""
May 11 20:52:01.480036 ip-10-0-133-205
kubenswrapper[2555]: I0511 20:52:01.480000 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6z6rl"
May 11 20:52:01.607732 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:01.607701 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6z6rl"]
May 11 20:52:01.610602 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:52:01.610578 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff9e0b72_a1d2_4476_a4d4_db6a3425a266.slice/crio-10fc2e51bb127970a5e9a1c13a672c2838eb6d5ac84266130882594d2e4fd999 WatchSource:0}: Error finding container 10fc2e51bb127970a5e9a1c13a672c2838eb6d5ac84266130882594d2e4fd999: Status 404 returned error can't find the container with id 10fc2e51bb127970a5e9a1c13a672c2838eb6d5ac84266130882594d2e4fd999
May 11 20:52:01.746937 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:01.746901 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6z6rl" event={"ID":"ff9e0b72-a1d2-4476-a4d4-db6a3425a266","Type":"ContainerStarted","Data":"10fc2e51bb127970a5e9a1c13a672c2838eb6d5ac84266130882594d2e4fd999"}
May 11 20:52:01.748144 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:01.748121 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fq6hx" event={"ID":"692ffb95-b8bb-4e21-9e37-a9bad55c11be","Type":"ContainerStarted","Data":"05f75e62cb105772f7505a03bd605028d06a1e33cb7e49259a670a89b9430c4a"}
May 11 20:52:02.753311 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:02.753268 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fq6hx" event={"ID":"692ffb95-b8bb-4e21-9e37-a9bad55c11be","Type":"ContainerStarted","Data":"cb007374a08cf61fbfb234ff1718a1e65c37fe5631989c0cdbbe63ca16249673"}
May 11 20:52:02.753311
ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:02.753310 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fq6hx" event={"ID":"692ffb95-b8bb-4e21-9e37-a9bad55c11be","Type":"ContainerStarted","Data":"9fe89813c1d1bce72e2a23f522c49cb97a915aeaeaa5b154dcddc91cefacc9d1"}
May 11 20:52:02.770342 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:02.770279 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fq6hx" podStartSLOduration=66.724256319 podStartE2EDuration="1m7.770258532s" podCreationTimestamp="2026-05-11 20:50:55 +0000 UTC" firstStartedPulling="2026-05-11 20:52:01.320491142 +0000 UTC m=+66.532434273" lastFinishedPulling="2026-05-11 20:52:02.366493354 +0000 UTC m=+67.578436486" observedRunningTime="2026-05-11 20:52:02.768689437 +0000 UTC m=+67.980632618" watchObservedRunningTime="2026-05-11 20:52:02.770258532 +0000 UTC m=+67.982201683"
May 11 20:52:04.762166 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:04.762129 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6z6rl" event={"ID":"ff9e0b72-a1d2-4476-a4d4-db6a3425a266","Type":"ContainerStarted","Data":"fc2196e0dbcca74ce4c2c947ab0320ac8a4a270486f79220fac7a3891718604a"}
May 11 20:52:04.762596 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:04.762265 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-6z6rl"
May 11 20:52:04.777742 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:04.777688 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-6z6rl" podStartSLOduration=66.969956007 podStartE2EDuration="1m9.777671319s" podCreationTimestamp="2026-05-11 20:50:55 +0000 UTC" firstStartedPulling="2026-05-11 20:52:01.612470378 +0000 UTC m=+66.824413506" lastFinishedPulling="2026-05-11
20:52:04.420185687 +0000 UTC m=+69.632128818" observedRunningTime="2026-05-11 20:52:04.776193632 +0000 UTC m=+69.988136809" watchObservedRunningTime="2026-05-11 20:52:04.777671319 +0000 UTC m=+69.989614468"
May 11 20:52:05.126985 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:05.126899 2555 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-56cfb48457-79rdn"
May 11 20:52:05.126985 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:05.126949 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-56cfb48457-79rdn"
May 11 20:52:25.131998 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:25.131967 2555 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-56cfb48457-79rdn"
May 11 20:52:25.135880 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:25.135848 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-56cfb48457-79rdn"
May 11 20:52:35.769572 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:35.769539 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-6z6rl"
May 11 20:52:52.192221 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:52.192180 2555 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
May 11 20:52:52.211762 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:52.211736 2555 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
May 11 20:52:52.928123 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:52:52.928097 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
May 11 20:53:00.689547 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:00.689510 2555 kubelet.go:2553]
"SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
May 11 20:53:00.689971 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:00.689942 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="alertmanager" containerID="cri-o://67a28476e2f797a4a4dd3c11ac62d2b140efa3faddb633fc60f9fd54444d584c" gracePeriod=120
May 11 20:53:00.690041 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:00.690001 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="kube-rbac-proxy-metric" containerID="cri-o://3dcb3150575535632ef8ede1fa9b83aac254b65323de2272e85ee263e54c7651" gracePeriod=120
May 11 20:53:00.690099 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:00.690071 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="prom-label-proxy" containerID="cri-o://df6f3a9ee459c1240705709dffce0fbfc54c670cf7487e5cb551249fb4678b5a" gracePeriod=120
May 11 20:53:00.690149 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:00.690080 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="kube-rbac-proxy-web" containerID="cri-o://9e95209df0c47fdaca2c4da9b614ce04123507d030b6f2ce2c472a81eb19f9a9" gracePeriod=120
May 11 20:53:00.690149 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:00.690113 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="kube-rbac-proxy" containerID="cri-o://6ecd10318588e40919af87cf22a5c509c9008dee217e9a781474d9ed7c24404e"
gracePeriod=120
May 11 20:53:00.690239 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:00.690127 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="config-reloader" containerID="cri-o://2dcd924ec3170aa9df5a1f712dc5a51f75c327fa07c7500fc345756ee84bab23" gracePeriod=120
May 11 20:53:00.938193 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:00.938156 2555 generic.go:358] "Generic (PLEG): container finished" podID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerID="df6f3a9ee459c1240705709dffce0fbfc54c670cf7487e5cb551249fb4678b5a" exitCode=0
May 11 20:53:00.938193 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:00.938182 2555 generic.go:358] "Generic (PLEG): container finished" podID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerID="6ecd10318588e40919af87cf22a5c509c9008dee217e9a781474d9ed7c24404e" exitCode=0
May 11 20:53:00.938193 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:00.938188 2555 generic.go:358] "Generic (PLEG): container finished" podID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerID="2dcd924ec3170aa9df5a1f712dc5a51f75c327fa07c7500fc345756ee84bab23" exitCode=0
May 11 20:53:00.938193 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:00.938193 2555 generic.go:358] "Generic (PLEG): container finished" podID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerID="67a28476e2f797a4a4dd3c11ac62d2b140efa3faddb633fc60f9fd54444d584c" exitCode=0
May 11 20:53:00.938488 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:00.938229 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28","Type":"ContainerDied","Data":"df6f3a9ee459c1240705709dffce0fbfc54c670cf7487e5cb551249fb4678b5a"}
May 11 20:53:00.938488 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:00.938271 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28","Type":"ContainerDied","Data":"6ecd10318588e40919af87cf22a5c509c9008dee217e9a781474d9ed7c24404e"}
May 11 20:53:00.938488 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:00.938285 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28","Type":"ContainerDied","Data":"2dcd924ec3170aa9df5a1f712dc5a51f75c327fa07c7500fc345756ee84bab23"}
May 11 20:53:00.938488 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:00.938297 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28","Type":"ContainerDied","Data":"67a28476e2f797a4a4dd3c11ac62d2b140efa3faddb633fc60f9fd54444d584c"}
May 11 20:53:01.933592 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:01.933566 2555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:01.943532 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:01.943504 2555 generic.go:358] "Generic (PLEG): container finished" podID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerID="3dcb3150575535632ef8ede1fa9b83aac254b65323de2272e85ee263e54c7651" exitCode=0
May 11 20:53:01.943532 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:01.943530 2555 generic.go:358] "Generic (PLEG): container finished" podID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerID="9e95209df0c47fdaca2c4da9b614ce04123507d030b6f2ce2c472a81eb19f9a9" exitCode=0
May 11 20:53:01.943727 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:01.943596 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28","Type":"ContainerDied","Data":"3dcb3150575535632ef8ede1fa9b83aac254b65323de2272e85ee263e54c7651"}
May 11 20:53:01.943727 ip-10-0-133-205
kubenswrapper[2555]: I0511 20:53:01.943608 2555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:01.943727 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:01.943638 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28","Type":"ContainerDied","Data":"9e95209df0c47fdaca2c4da9b614ce04123507d030b6f2ce2c472a81eb19f9a9"}
May 11 20:53:01.943727 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:01.943652 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28","Type":"ContainerDied","Data":"09f207a23f92f0ea89aac2f83680985fc412123af7bf9d0f724f67d1dda35c55"}
May 11 20:53:01.943727 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:01.943667 2555 scope.go:117] "RemoveContainer" containerID="df6f3a9ee459c1240705709dffce0fbfc54c670cf7487e5cb551249fb4678b5a"
May 11 20:53:01.951535 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:01.951517 2555 scope.go:117] "RemoveContainer" containerID="3dcb3150575535632ef8ede1fa9b83aac254b65323de2272e85ee263e54c7651"
May 11 20:53:01.959315 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:01.959293 2555 scope.go:117] "RemoveContainer" containerID="6ecd10318588e40919af87cf22a5c509c9008dee217e9a781474d9ed7c24404e"
May 11 20:53:01.967960 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:01.967366 2555 scope.go:117] "RemoveContainer" containerID="9e95209df0c47fdaca2c4da9b614ce04123507d030b6f2ce2c472a81eb19f9a9"
May 11 20:53:01.975670 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:01.975650 2555 scope.go:117] "RemoveContainer" containerID="2dcd924ec3170aa9df5a1f712dc5a51f75c327fa07c7500fc345756ee84bab23"
May 11 20:53:01.984541 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:01.984515 2555 scope.go:117] "RemoveContainer"
containerID="67a28476e2f797a4a4dd3c11ac62d2b140efa3faddb633fc60f9fd54444d584c"
May 11 20:53:01.991894 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:01.991872 2555 scope.go:117] "RemoveContainer" containerID="7c8f78e52bb34e7bc07cd92704c7f94f764fcb5ef407bba94f745dc1fb61b490"
May 11 20:53:02.004266 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.004245 2555 scope.go:117] "RemoveContainer" containerID="df6f3a9ee459c1240705709dffce0fbfc54c670cf7487e5cb551249fb4678b5a"
May 11 20:53:02.004601 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:53:02.004581 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df6f3a9ee459c1240705709dffce0fbfc54c670cf7487e5cb551249fb4678b5a\": container with ID starting with df6f3a9ee459c1240705709dffce0fbfc54c670cf7487e5cb551249fb4678b5a not found: ID does not exist" containerID="df6f3a9ee459c1240705709dffce0fbfc54c670cf7487e5cb551249fb4678b5a"
May 11 20:53:02.004695 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.004613 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df6f3a9ee459c1240705709dffce0fbfc54c670cf7487e5cb551249fb4678b5a"} err="failed to get container status \"df6f3a9ee459c1240705709dffce0fbfc54c670cf7487e5cb551249fb4678b5a\": rpc error: code = NotFound desc = could not find container \"df6f3a9ee459c1240705709dffce0fbfc54c670cf7487e5cb551249fb4678b5a\": container with ID starting with df6f3a9ee459c1240705709dffce0fbfc54c670cf7487e5cb551249fb4678b5a not found: ID does not exist"
May 11 20:53:02.004695 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.004659 2555 scope.go:117] "RemoveContainer" containerID="3dcb3150575535632ef8ede1fa9b83aac254b65323de2272e85ee263e54c7651"
May 11 20:53:02.004935 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:53:02.004918 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container
\"3dcb3150575535632ef8ede1fa9b83aac254b65323de2272e85ee263e54c7651\": container with ID starting with 3dcb3150575535632ef8ede1fa9b83aac254b65323de2272e85ee263e54c7651 not found: ID does not exist" containerID="3dcb3150575535632ef8ede1fa9b83aac254b65323de2272e85ee263e54c7651"
May 11 20:53:02.004981 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.004944 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dcb3150575535632ef8ede1fa9b83aac254b65323de2272e85ee263e54c7651"} err="failed to get container status \"3dcb3150575535632ef8ede1fa9b83aac254b65323de2272e85ee263e54c7651\": rpc error: code = NotFound desc = could not find container \"3dcb3150575535632ef8ede1fa9b83aac254b65323de2272e85ee263e54c7651\": container with ID starting with 3dcb3150575535632ef8ede1fa9b83aac254b65323de2272e85ee263e54c7651 not found: ID does not exist"
May 11 20:53:02.004981 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.004965 2555 scope.go:117] "RemoveContainer" containerID="6ecd10318588e40919af87cf22a5c509c9008dee217e9a781474d9ed7c24404e"
May 11 20:53:02.005194 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:53:02.005177 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ecd10318588e40919af87cf22a5c509c9008dee217e9a781474d9ed7c24404e\": container with ID starting with 6ecd10318588e40919af87cf22a5c509c9008dee217e9a781474d9ed7c24404e not found: ID does not exist" containerID="6ecd10318588e40919af87cf22a5c509c9008dee217e9a781474d9ed7c24404e"
May 11 20:53:02.005235 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.005197 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ecd10318588e40919af87cf22a5c509c9008dee217e9a781474d9ed7c24404e"} err="failed to get container status \"6ecd10318588e40919af87cf22a5c509c9008dee217e9a781474d9ed7c24404e\": rpc error: code = NotFound desc = could not find container
\"6ecd10318588e40919af87cf22a5c509c9008dee217e9a781474d9ed7c24404e\": container with ID starting with 6ecd10318588e40919af87cf22a5c509c9008dee217e9a781474d9ed7c24404e not found: ID does not exist"
May 11 20:53:02.005235 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.005208 2555 scope.go:117] "RemoveContainer" containerID="9e95209df0c47fdaca2c4da9b614ce04123507d030b6f2ce2c472a81eb19f9a9"
May 11 20:53:02.005429 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:53:02.005388 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e95209df0c47fdaca2c4da9b614ce04123507d030b6f2ce2c472a81eb19f9a9\": container with ID starting with 9e95209df0c47fdaca2c4da9b614ce04123507d030b6f2ce2c472a81eb19f9a9 not found: ID does not exist" containerID="9e95209df0c47fdaca2c4da9b614ce04123507d030b6f2ce2c472a81eb19f9a9"
May 11 20:53:02.005519 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.005475 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e95209df0c47fdaca2c4da9b614ce04123507d030b6f2ce2c472a81eb19f9a9"} err="failed to get container status \"9e95209df0c47fdaca2c4da9b614ce04123507d030b6f2ce2c472a81eb19f9a9\": rpc error: code = NotFound desc = could not find container \"9e95209df0c47fdaca2c4da9b614ce04123507d030b6f2ce2c472a81eb19f9a9\": container with ID starting with 9e95209df0c47fdaca2c4da9b614ce04123507d030b6f2ce2c472a81eb19f9a9 not found: ID does not exist"
May 11 20:53:02.005519 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.005499 2555 scope.go:117] "RemoveContainer" containerID="2dcd924ec3170aa9df5a1f712dc5a51f75c327fa07c7500fc345756ee84bab23"
May 11 20:53:02.005769 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:53:02.005744 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dcd924ec3170aa9df5a1f712dc5a51f75c327fa07c7500fc345756ee84bab23\": container with ID starting with
2dcd924ec3170aa9df5a1f712dc5a51f75c327fa07c7500fc345756ee84bab23 not found: ID does not exist" containerID="2dcd924ec3170aa9df5a1f712dc5a51f75c327fa07c7500fc345756ee84bab23"
May 11 20:53:02.005815 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.005776 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dcd924ec3170aa9df5a1f712dc5a51f75c327fa07c7500fc345756ee84bab23"} err="failed to get container status \"2dcd924ec3170aa9df5a1f712dc5a51f75c327fa07c7500fc345756ee84bab23\": rpc error: code = NotFound desc = could not find container \"2dcd924ec3170aa9df5a1f712dc5a51f75c327fa07c7500fc345756ee84bab23\": container with ID starting with 2dcd924ec3170aa9df5a1f712dc5a51f75c327fa07c7500fc345756ee84bab23 not found: ID does not exist"
May 11 20:53:02.005815 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.005793 2555 scope.go:117] "RemoveContainer" containerID="67a28476e2f797a4a4dd3c11ac62d2b140efa3faddb633fc60f9fd54444d584c"
May 11 20:53:02.006017 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:53:02.006002 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67a28476e2f797a4a4dd3c11ac62d2b140efa3faddb633fc60f9fd54444d584c\": container with ID starting with 67a28476e2f797a4a4dd3c11ac62d2b140efa3faddb633fc60f9fd54444d584c not found: ID does not exist" containerID="67a28476e2f797a4a4dd3c11ac62d2b140efa3faddb633fc60f9fd54444d584c"
May 11 20:53:02.006058 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.006023 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a28476e2f797a4a4dd3c11ac62d2b140efa3faddb633fc60f9fd54444d584c"} err="failed to get container status \"67a28476e2f797a4a4dd3c11ac62d2b140efa3faddb633fc60f9fd54444d584c\": rpc error: code = NotFound desc = could not find container \"67a28476e2f797a4a4dd3c11ac62d2b140efa3faddb633fc60f9fd54444d584c\": container with ID starting with
67a28476e2f797a4a4dd3c11ac62d2b140efa3faddb633fc60f9fd54444d584c not found: ID does not exist"
May 11 20:53:02.006058 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.006036 2555 scope.go:117] "RemoveContainer" containerID="7c8f78e52bb34e7bc07cd92704c7f94f764fcb5ef407bba94f745dc1fb61b490"
May 11 20:53:02.006252 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:53:02.006235 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c8f78e52bb34e7bc07cd92704c7f94f764fcb5ef407bba94f745dc1fb61b490\": container with ID starting with 7c8f78e52bb34e7bc07cd92704c7f94f764fcb5ef407bba94f745dc1fb61b490 not found: ID does not exist" containerID="7c8f78e52bb34e7bc07cd92704c7f94f764fcb5ef407bba94f745dc1fb61b490"
May 11 20:53:02.006324 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.006258 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c8f78e52bb34e7bc07cd92704c7f94f764fcb5ef407bba94f745dc1fb61b490"} err="failed to get container status \"7c8f78e52bb34e7bc07cd92704c7f94f764fcb5ef407bba94f745dc1fb61b490\": rpc error: code = NotFound desc = could not find container \"7c8f78e52bb34e7bc07cd92704c7f94f764fcb5ef407bba94f745dc1fb61b490\": container with ID starting with 7c8f78e52bb34e7bc07cd92704c7f94f764fcb5ef407bba94f745dc1fb61b490 not found: ID does not exist"
May 11 20:53:02.006324 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.006276 2555 scope.go:117] "RemoveContainer" containerID="df6f3a9ee459c1240705709dffce0fbfc54c670cf7487e5cb551249fb4678b5a"
May 11 20:53:02.006545 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.006529 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df6f3a9ee459c1240705709dffce0fbfc54c670cf7487e5cb551249fb4678b5a"} err="failed to get container status \"df6f3a9ee459c1240705709dffce0fbfc54c670cf7487e5cb551249fb4678b5a\": rpc error: code = NotFound desc = could not find
container \"df6f3a9ee459c1240705709dffce0fbfc54c670cf7487e5cb551249fb4678b5a\": container with ID starting with df6f3a9ee459c1240705709dffce0fbfc54c670cf7487e5cb551249fb4678b5a not found: ID does not exist"
May 11 20:53:02.006613 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.006546 2555 scope.go:117] "RemoveContainer" containerID="3dcb3150575535632ef8ede1fa9b83aac254b65323de2272e85ee263e54c7651"
May 11 20:53:02.006734 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.006718 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dcb3150575535632ef8ede1fa9b83aac254b65323de2272e85ee263e54c7651"} err="failed to get container status \"3dcb3150575535632ef8ede1fa9b83aac254b65323de2272e85ee263e54c7651\": rpc error: code = NotFound desc = could not find container \"3dcb3150575535632ef8ede1fa9b83aac254b65323de2272e85ee263e54c7651\": container with ID starting with 3dcb3150575535632ef8ede1fa9b83aac254b65323de2272e85ee263e54c7651 not found: ID does not exist"
May 11 20:53:02.006787 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.006734 2555 scope.go:117] "RemoveContainer" containerID="6ecd10318588e40919af87cf22a5c509c9008dee217e9a781474d9ed7c24404e"
May 11 20:53:02.006935 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.006921 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ecd10318588e40919af87cf22a5c509c9008dee217e9a781474d9ed7c24404e"} err="failed to get container status \"6ecd10318588e40919af87cf22a5c509c9008dee217e9a781474d9ed7c24404e\": rpc error: code = NotFound desc = could not find container \"6ecd10318588e40919af87cf22a5c509c9008dee217e9a781474d9ed7c24404e\": container with ID starting with 6ecd10318588e40919af87cf22a5c509c9008dee217e9a781474d9ed7c24404e not found: ID does not exist"
May 11 20:53:02.006979 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.006935 2555 scope.go:117] "RemoveContainer"
containerID="9e95209df0c47fdaca2c4da9b614ce04123507d030b6f2ce2c472a81eb19f9a9"
May 11 20:53:02.007166 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.007150 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e95209df0c47fdaca2c4da9b614ce04123507d030b6f2ce2c472a81eb19f9a9"} err="failed to get container status \"9e95209df0c47fdaca2c4da9b614ce04123507d030b6f2ce2c472a81eb19f9a9\": rpc error: code = NotFound desc = could not find container \"9e95209df0c47fdaca2c4da9b614ce04123507d030b6f2ce2c472a81eb19f9a9\": container with ID starting with 9e95209df0c47fdaca2c4da9b614ce04123507d030b6f2ce2c472a81eb19f9a9 not found: ID does not exist"
May 11 20:53:02.007166 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.007166 2555 scope.go:117] "RemoveContainer" containerID="2dcd924ec3170aa9df5a1f712dc5a51f75c327fa07c7500fc345756ee84bab23"
May 11 20:53:02.007358 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.007343 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dcd924ec3170aa9df5a1f712dc5a51f75c327fa07c7500fc345756ee84bab23"} err="failed to get container status \"2dcd924ec3170aa9df5a1f712dc5a51f75c327fa07c7500fc345756ee84bab23\": rpc error: code = NotFound desc = could not find container \"2dcd924ec3170aa9df5a1f712dc5a51f75c327fa07c7500fc345756ee84bab23\": container with ID starting with 2dcd924ec3170aa9df5a1f712dc5a51f75c327fa07c7500fc345756ee84bab23 not found: ID does not exist"
May 11 20:53:02.007398 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.007358 2555 scope.go:117] "RemoveContainer" containerID="67a28476e2f797a4a4dd3c11ac62d2b140efa3faddb633fc60f9fd54444d584c"
May 11 20:53:02.007579 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.007559 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a28476e2f797a4a4dd3c11ac62d2b140efa3faddb633fc60f9fd54444d584c"} err="failed to get container status
\"67a28476e2f797a4a4dd3c11ac62d2b140efa3faddb633fc60f9fd54444d584c\": rpc error: code = NotFound desc = could not find container \"67a28476e2f797a4a4dd3c11ac62d2b140efa3faddb633fc60f9fd54444d584c\": container with ID starting with 67a28476e2f797a4a4dd3c11ac62d2b140efa3faddb633fc60f9fd54444d584c not found: ID does not exist" May 11 20:53:02.007646 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.007582 2555 scope.go:117] "RemoveContainer" containerID="7c8f78e52bb34e7bc07cd92704c7f94f764fcb5ef407bba94f745dc1fb61b490" May 11 20:53:02.007774 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.007757 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c8f78e52bb34e7bc07cd92704c7f94f764fcb5ef407bba94f745dc1fb61b490"} err="failed to get container status \"7c8f78e52bb34e7bc07cd92704c7f94f764fcb5ef407bba94f745dc1fb61b490\": rpc error: code = NotFound desc = could not find container \"7c8f78e52bb34e7bc07cd92704c7f94f764fcb5ef407bba94f745dc1fb61b490\": container with ID starting with 7c8f78e52bb34e7bc07cd92704c7f94f764fcb5ef407bba94f745dc1fb61b490 not found: ID does not exist" May 11 20:53:02.028088 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.028063 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-kube-rbac-proxy-metric\") pod \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " May 11 20:53:02.028192 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.028099 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-alertmanager-main-db\") pod \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " May 11 20:53:02.028192 ip-10-0-133-205 
kubenswrapper[2555]: I0511 20:53:02.028125 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-tls-assets\") pod \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " May 11 20:53:02.028192 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.028145 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-kube-rbac-proxy\") pod \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " May 11 20:53:02.028192 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.028172 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-kube-rbac-proxy-web\") pod \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " May 11 20:53:02.028384 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.028210 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-metrics-client-ca\") pod \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " May 11 20:53:02.028384 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.028229 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmj86\" (UniqueName: \"kubernetes.io/projected/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-kube-api-access-wmj86\") pod \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " May 11 20:53:02.028384 ip-10-0-133-205 kubenswrapper[2555]: I0511 
20:53:02.028247 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-config-out\") pod \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " May 11 20:53:02.028384 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.028283 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-alertmanager-trusted-ca-bundle\") pod \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " May 11 20:53:02.028384 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.028315 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-web-config\") pod \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " May 11 20:53:02.028384 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.028367 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-main-tls\") pod \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " May 11 20:53:02.028751 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.028420 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-config-volume\") pod \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " May 11 20:53:02.028751 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.028447 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-cluster-tls-config\") pod \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\" (UID: \"fb273bbf-f5bb-44c6-bcba-41e2cbf4db28\") " May 11 20:53:02.028751 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.028470 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" (UID: "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:53:02.028751 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.028566 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" (UID: "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:53:02.029128 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.028745 2555 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-metrics-client-ca\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:02.029128 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.028820 2555 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-alertmanager-main-db\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:02.029613 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.029565 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" (UID: "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:53:02.032106 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.032041 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" (UID: "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:53:02.032106 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.032073 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" (UID: "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:53:02.032106 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.032078 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" (UID: "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:53:02.032998 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.032944 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" (UID: "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:53:02.032998 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.032960 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-config-out" (OuterVolumeSpecName: "config-out") pod "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" (UID: "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:53:02.033120 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.033020 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" (UID: "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:53:02.033813 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.033793 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-kube-api-access-wmj86" (OuterVolumeSpecName: "kube-api-access-wmj86") pod "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" (UID: "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28"). InnerVolumeSpecName "kube-api-access-wmj86". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:53:02.034533 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.034501 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-config-volume" (OuterVolumeSpecName: "config-volume") pod "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" (UID: "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:53:02.036049 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.036023 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" (UID: "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:53:02.043164 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.043134 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-web-config" (OuterVolumeSpecName: "web-config") pod "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" (UID: "fb273bbf-f5bb-44c6-bcba-41e2cbf4db28"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:53:02.129772 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.129717 2555 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-config-volume\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:02.129772 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.129766 2555 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-cluster-tls-config\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:02.129772 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.129780 2555 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:02.130027 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.129795 2555 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-tls-assets\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:02.130027 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.129808 2555 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:02.130027 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.129821 2555 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:02.130027 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.129833 2555 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wmj86\" (UniqueName: \"kubernetes.io/projected/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-kube-api-access-wmj86\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:02.130027 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.129845 2555 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-config-out\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:02.130027 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.129857 2555 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:02.130027 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.129868 2555 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-web-config\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:02.130027 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.129880 2555 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28-secret-alertmanager-main-tls\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:02.267019 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.266942 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] May 11 20:53:02.270631 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.270595 2555 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] May 11 20:53:02.299581 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299139 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] May 11 20:53:02.299771 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299590 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="kube-rbac-proxy-web" May 11 20:53:02.299771 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299608 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="kube-rbac-proxy-web" May 11 20:53:02.299771 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299620 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="kube-rbac-proxy-metric" May 11 20:53:02.299771 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299628 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="kube-rbac-proxy-metric" May 11 20:53:02.299771 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299644 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="kube-rbac-proxy" May 11 20:53:02.299771 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299652 2555 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="kube-rbac-proxy" May 11 20:53:02.299771 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299667 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="alertmanager" May 11 20:53:02.299771 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299675 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="alertmanager" May 11 20:53:02.299771 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299689 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="init-config-reloader" May 11 20:53:02.299771 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299697 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="init-config-reloader" May 11 20:53:02.299771 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299707 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="prom-label-proxy" May 11 20:53:02.299771 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299715 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="prom-label-proxy" May 11 20:53:02.299771 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299725 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="config-reloader" May 11 20:53:02.299771 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299732 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="config-reloader" May 11 20:53:02.300394 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299802 2555 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="alertmanager" May 11 20:53:02.300394 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299816 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="config-reloader" May 11 20:53:02.300394 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299825 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="kube-rbac-proxy-metric" May 11 20:53:02.300394 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299836 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="prom-label-proxy" May 11 20:53:02.300394 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299848 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="kube-rbac-proxy-web" May 11 20:53:02.300394 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.299857 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" containerName="kube-rbac-proxy" May 11 20:53:02.305535 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.305512 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" May 11 20:53:02.308206 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.308181 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-h54qp\"" May 11 20:53:02.308322 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.308205 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" May 11 20:53:02.308470 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.308454 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" May 11 20:53:02.308530 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.308483 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" May 11 20:53:02.308795 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.308774 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" May 11 20:53:02.308795 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.308787 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" May 11 20:53:02.308955 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.308830 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" May 11 20:53:02.308955 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.308842 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" May 11 20:53:02.308955 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.308845 2555 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" May 11 20:53:02.314273 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.314255 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" May 11 20:53:02.317334 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.317312 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] May 11 20:53:02.432792 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.432741 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dfxp\" (UniqueName: \"kubernetes.io/projected/caa21e61-935d-4cc5-890f-e3ef50fadd2a-kube-api-access-6dfxp\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0" May 11 20:53:02.432955 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.432861 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-config-volume\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0" May 11 20:53:02.432955 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.432887 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0" May 11 20:53:02.432955 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.432908 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-web-config\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0" May 11 20:53:02.432955 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.432937 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/caa21e61-935d-4cc5-890f-e3ef50fadd2a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0" May 11 20:53:02.433143 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.433011 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0" May 11 20:53:02.433143 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.433045 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/caa21e61-935d-4cc5-890f-e3ef50fadd2a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0" May 11 20:53:02.433143 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.433079 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/caa21e61-935d-4cc5-890f-e3ef50fadd2a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0" May 11 20:53:02.433143 ip-10-0-133-205 
kubenswrapper[2555]: I0511 20:53:02.433095 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0" May 11 20:53:02.433334 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.433154 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/caa21e61-935d-4cc5-890f-e3ef50fadd2a-config-out\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0" May 11 20:53:02.433334 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.433210 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0" May 11 20:53:02.433334 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.433242 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0" May 11 20:53:02.433334 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.433267 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/caa21e61-935d-4cc5-890f-e3ef50fadd2a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.534273 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.534173 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dfxp\" (UniqueName: \"kubernetes.io/projected/caa21e61-935d-4cc5-890f-e3ef50fadd2a-kube-api-access-6dfxp\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.534469 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.534267 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-config-volume\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.534469 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.534300 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.534469 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.534325 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-web-config\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.534469 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.534354 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/caa21e61-935d-4cc5-890f-e3ef50fadd2a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.534469 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.534396 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.534469 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.534453 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/caa21e61-935d-4cc5-890f-e3ef50fadd2a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.534795 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.534526 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/caa21e61-935d-4cc5-890f-e3ef50fadd2a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.534795 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.534554 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.534795 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.534593 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/caa21e61-935d-4cc5-890f-e3ef50fadd2a-config-out\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.534795 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.534649 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.534795 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.534680 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.534795 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.534702 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/caa21e61-935d-4cc5-890f-e3ef50fadd2a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.535286 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.535258 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/caa21e61-935d-4cc5-890f-e3ef50fadd2a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.536136 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.535580 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/caa21e61-935d-4cc5-890f-e3ef50fadd2a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.536136 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.535988 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/caa21e61-935d-4cc5-890f-e3ef50fadd2a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.537763 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.537668 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-config-volume\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.537763 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.537723 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-web-config\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.538055 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.538034 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.538304 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.538280 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/caa21e61-935d-4cc5-890f-e3ef50fadd2a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.538433 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.538393 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.538913 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.538887 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.539164 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.539143 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.539829 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.539808 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/caa21e61-935d-4cc5-890f-e3ef50fadd2a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.540067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.540043 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/caa21e61-935d-4cc5-890f-e3ef50fadd2a-config-out\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.547382 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.547359 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dfxp\" (UniqueName: \"kubernetes.io/projected/caa21e61-935d-4cc5-890f-e3ef50fadd2a-kube-api-access-6dfxp\") pod \"alertmanager-main-0\" (UID: \"caa21e61-935d-4cc5-890f-e3ef50fadd2a\") " pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.615579 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.615542 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
May 11 20:53:02.748399 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.748374 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
May 11 20:53:02.750776 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:53:02.750744 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaa21e61_935d_4cc5_890f_e3ef50fadd2a.slice/crio-334e0dfc13a8c398d9e40ff8ad23b11428dec2085aa1fa335cfc42922c255e37 WatchSource:0}: Error finding container 334e0dfc13a8c398d9e40ff8ad23b11428dec2085aa1fa335cfc42922c255e37: Status 404 returned error can't find the container with id 334e0dfc13a8c398d9e40ff8ad23b11428dec2085aa1fa335cfc42922c255e37
May 11 20:53:02.954255 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.954218 2555 generic.go:358] "Generic (PLEG): container finished" podID="caa21e61-935d-4cc5-890f-e3ef50fadd2a" containerID="ccc060ec6df0d37e55dcca6ba138abe0bb64b49af8b5b7b0e12870fbc7afe8fa" exitCode=0
May 11 20:53:02.954640 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.954291 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"caa21e61-935d-4cc5-890f-e3ef50fadd2a","Type":"ContainerDied","Data":"ccc060ec6df0d37e55dcca6ba138abe0bb64b49af8b5b7b0e12870fbc7afe8fa"}
May 11 20:53:02.954640 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:02.954312 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"caa21e61-935d-4cc5-890f-e3ef50fadd2a","Type":"ContainerStarted","Data":"334e0dfc13a8c398d9e40ff8ad23b11428dec2085aa1fa335cfc42922c255e37"}
May 11 20:53:03.359133 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:03.359054 2555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb273bbf-f5bb-44c6-bcba-41e2cbf4db28" path="/var/lib/kubelet/pods/fb273bbf-f5bb-44c6-bcba-41e2cbf4db28/volumes"
May 11 20:53:03.960288 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:03.960252 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"caa21e61-935d-4cc5-890f-e3ef50fadd2a","Type":"ContainerStarted","Data":"60766b8818cc308f909650113481de5c070fc260ee0bfc04504900489ecd3ef2"}
May 11 20:53:03.960288 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:03.960289 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"caa21e61-935d-4cc5-890f-e3ef50fadd2a","Type":"ContainerStarted","Data":"31a4bacf4c914efbe71b57fc53c21745dcd2bab26c3eed5e93c0a9a1d43107de"}
May 11 20:53:03.960697 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:03.960304 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"caa21e61-935d-4cc5-890f-e3ef50fadd2a","Type":"ContainerStarted","Data":"5c0829c1330b9426d1a72a00671c9845ceb863176682468dfc82c2cdfba43cdc"}
May 11 20:53:03.960697 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:03.960313 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"caa21e61-935d-4cc5-890f-e3ef50fadd2a","Type":"ContainerStarted","Data":"70db4b1f5ad6779c5e12bc52e1ac4e065d70042b5aea7fbec10874e5718aecbb"}
May 11 20:53:03.960697 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:03.960322 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"caa21e61-935d-4cc5-890f-e3ef50fadd2a","Type":"ContainerStarted","Data":"d6834813ec6c9ecc9363bb0ce81c10f6b131783d73d57e094ad0949399e08f62"}
May 11 20:53:03.960697 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:03.960329 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"caa21e61-935d-4cc5-890f-e3ef50fadd2a","Type":"ContainerStarted","Data":"bf2ee76417bd74fc779c4b5466a59ef58ce72aa36b41fe22f02a949ee95feefd"}
May 11 20:53:03.990723 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:03.990676 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.990660272 podStartE2EDuration="1.990660272s" podCreationTimestamp="2026-05-11 20:53:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:53:03.988660825 +0000 UTC m=+129.200603986" watchObservedRunningTime="2026-05-11 20:53:03.990660272 +0000 UTC m=+129.202603421"
May 11 20:53:04.717947 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.717913 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"]
May 11 20:53:04.722041 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.722021 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.724830 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.724805 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
May 11 20:53:04.725145 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.725122 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
May 11 20:53:04.725145 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.725129 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
May 11 20:53:04.725293 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.725202 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-7rbcc\""
May 11 20:53:04.725293 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.725128 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
May 11 20:53:04.725293 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.725129 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
May 11 20:53:04.730820 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.730561 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
May 11 20:53:04.734049 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.734025 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"]
May 11 20:53:04.853776 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.853737 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/23302639-969b-4a5a-bd5b-614af2afac30-telemeter-client-tls\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.853933 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.853785 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23302639-969b-4a5a-bd5b-614af2afac30-metrics-client-ca\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.853933 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.853813 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23302639-969b-4a5a-bd5b-614af2afac30-serving-certs-ca-bundle\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.853933 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.853836 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjt6q\" (UniqueName: \"kubernetes.io/projected/23302639-969b-4a5a-bd5b-614af2afac30-kube-api-access-tjt6q\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.853933 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.853865 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/23302639-969b-4a5a-bd5b-614af2afac30-federate-client-tls\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.854088 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.854025 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23302639-969b-4a5a-bd5b-614af2afac30-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.854088 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.854072 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/23302639-969b-4a5a-bd5b-614af2afac30-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.854161 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.854105 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/23302639-969b-4a5a-bd5b-614af2afac30-secret-telemeter-client\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.954950 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.954918 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23302639-969b-4a5a-bd5b-614af2afac30-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.955102 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.954957 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/23302639-969b-4a5a-bd5b-614af2afac30-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.955102 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.954989 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/23302639-969b-4a5a-bd5b-614af2afac30-secret-telemeter-client\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.955102 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.955045 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/23302639-969b-4a5a-bd5b-614af2afac30-telemeter-client-tls\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.955102 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.955088 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23302639-969b-4a5a-bd5b-614af2afac30-metrics-client-ca\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.955316 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.955118 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23302639-969b-4a5a-bd5b-614af2afac30-serving-certs-ca-bundle\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.955316 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.955144 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjt6q\" (UniqueName: \"kubernetes.io/projected/23302639-969b-4a5a-bd5b-614af2afac30-kube-api-access-tjt6q\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.955316 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.955168 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/23302639-969b-4a5a-bd5b-614af2afac30-federate-client-tls\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.956066 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.956040 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23302639-969b-4a5a-bd5b-614af2afac30-metrics-client-ca\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.956198 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.956174 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23302639-969b-4a5a-bd5b-614af2afac30-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.956238 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.956213 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23302639-969b-4a5a-bd5b-614af2afac30-serving-certs-ca-bundle\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.957860 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.957833 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/23302639-969b-4a5a-bd5b-614af2afac30-secret-telemeter-client\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.957956 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.957878 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/23302639-969b-4a5a-bd5b-614af2afac30-federate-client-tls\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.957956 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.957938 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/23302639-969b-4a5a-bd5b-614af2afac30-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.958138 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.958116 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/23302639-969b-4a5a-bd5b-614af2afac30-telemeter-client-tls\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.964194 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.964166 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjt6q\" (UniqueName: \"kubernetes.io/projected/23302639-969b-4a5a-bd5b-614af2afac30-kube-api-access-tjt6q\") pod \"telemeter-client-7b57cbb8cd-gjvmq\" (UID: \"23302639-969b-4a5a-bd5b-614af2afac30\") " pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:04.988208 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.988172 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
May 11 20:53:04.988986 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.988781 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="prometheus" containerID="cri-o://5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea" gracePeriod=600
May 11 20:53:04.988986 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.988805 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="kube-rbac-proxy" containerID="cri-o://6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45" gracePeriod=600
May 11 20:53:04.988986 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.988810 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="thanos-sidecar" containerID="cri-o://4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17" gracePeriod=600
May 11 20:53:04.988986 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.988841 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="kube-rbac-proxy-web" containerID="cri-o://2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7" gracePeriod=600
May 11 20:53:04.988986 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.988934 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="kube-rbac-proxy-thanos" containerID="cri-o://9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2" gracePeriod=600
May 11 20:53:04.989325 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:04.988975 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="config-reloader" containerID="cri-o://3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339" gracePeriod=600
May 11 20:53:05.034360 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.034331 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"
May 11 20:53:05.223083 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.223027 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq"]
May 11 20:53:05.226799 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:53:05.226773 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23302639_969b_4a5a_bd5b_614af2afac30.slice/crio-7059db5dad1f3a3f1142e584dabe9e44af35b85e2f1718b9dc9efbe240404b21 WatchSource:0}: Error finding container 7059db5dad1f3a3f1142e584dabe9e44af35b85e2f1718b9dc9efbe240404b21: Status 404 returned error can't find the container with id 7059db5dad1f3a3f1142e584dabe9e44af35b85e2f1718b9dc9efbe240404b21
May 11 20:53:05.268997 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.268976 2555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
May 11 20:53:05.360551 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.360519 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf688\" (UniqueName: \"kubernetes.io/projected/c41ae7e7-b224-431f-92c6-faef8aeb0669-kube-api-access-mf688\") pod \"c41ae7e7-b224-431f-92c6-faef8aeb0669\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") "
May 11 20:53:05.360725 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.360557 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c41ae7e7-b224-431f-92c6-faef8aeb0669-prometheus-k8s-db\") pod \"c41ae7e7-b224-431f-92c6-faef8aeb0669\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") "
May 11 20:53:05.360725 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.360586 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"c41ae7e7-b224-431f-92c6-faef8aeb0669\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") "
May 11 20:53:05.360725 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.360632 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"c41ae7e7-b224-431f-92c6-faef8aeb0669\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") "
May 11 20:53:05.360725 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.360664 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-web-config\") pod \"c41ae7e7-b224-431f-92c6-faef8aeb0669\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") "
May 11 20:53:05.360725 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.360691 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-configmap-metrics-client-ca\") pod \"c41ae7e7-b224-431f-92c6-faef8aeb0669\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") "
May 11 20:53:05.360965 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.360729 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-prometheus-trusted-ca-bundle\") pod \"c41ae7e7-b224-431f-92c6-faef8aeb0669\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") "
May 11 20:53:05.360965 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.360758 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-config\") pod \"c41ae7e7-b224-431f-92c6-faef8aeb0669\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") "
May 11 20:53:05.360965 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.360806 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-configmap-kubelet-serving-ca-bundle\") pod \"c41ae7e7-b224-431f-92c6-faef8aeb0669\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") "
May 11 20:53:05.360965 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.360839 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-metrics-client-certs\") pod \"c41ae7e7-b224-431f-92c6-faef8aeb0669\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") "
May 11 20:53:05.360965 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.360873 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-thanos-prometheus-http-client-file\") pod \"c41ae7e7-b224-431f-92c6-faef8aeb0669\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") "
May 11 20:53:05.360965 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.360906 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-grpc-tls\") pod \"c41ae7e7-b224-431f-92c6-faef8aeb0669\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") "
May 11 20:53:05.360965 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.360936 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c41ae7e7-b224-431f-92c6-faef8aeb0669-tls-assets\") pod \"c41ae7e7-b224-431f-92c6-faef8aeb0669\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") "
May 11 20:53:05.361848 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.360971 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-kube-rbac-proxy\") pod \"c41ae7e7-b224-431f-92c6-faef8aeb0669\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") "
May 11 20:53:05.361848 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.361005 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c41ae7e7-b224-431f-92c6-faef8aeb0669-config-out\") pod \"c41ae7e7-b224-431f-92c6-faef8aeb0669\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") "
May 11 20:53:05.361848 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.361034 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-prometheus-k8s-tls\") pod \"c41ae7e7-b224-431f-92c6-faef8aeb0669\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") "
May 11 20:53:05.361848 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.361066 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-prometheus-k8s-rulefiles-0\") pod \"c41ae7e7-b224-431f-92c6-faef8aeb0669\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") "
May 11 20:53:05.361848 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.361090 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-configmap-serving-certs-ca-bundle\") pod \"c41ae7e7-b224-431f-92c6-faef8aeb0669\" (UID: \"c41ae7e7-b224-431f-92c6-faef8aeb0669\") "
May 11 20:53:05.361848 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.361273 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "c41ae7e7-b224-431f-92c6-faef8aeb0669" (UID: "c41ae7e7-b224-431f-92c6-faef8aeb0669"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
May 11 20:53:05.361848 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.361830 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c41ae7e7-b224-431f-92c6-faef8aeb0669-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "c41ae7e7-b224-431f-92c6-faef8aeb0669" (UID: "c41ae7e7-b224-431f-92c6-faef8aeb0669"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
May 11 20:53:05.362209 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.361873 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "c41ae7e7-b224-431f-92c6-faef8aeb0669" (UID: "c41ae7e7-b224-431f-92c6-faef8aeb0669"). InnerVolumeSpecName "configmap-metrics-client-ca".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:53:05.362209 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.362143 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "c41ae7e7-b224-431f-92c6-faef8aeb0669" (UID: "c41ae7e7-b224-431f-92c6-faef8aeb0669"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:53:05.363090 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.362768 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "c41ae7e7-b224-431f-92c6-faef8aeb0669" (UID: "c41ae7e7-b224-431f-92c6-faef8aeb0669"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:53:05.363424 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.363363 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c41ae7e7-b224-431f-92c6-faef8aeb0669-kube-api-access-mf688" (OuterVolumeSpecName: "kube-api-access-mf688") pod "c41ae7e7-b224-431f-92c6-faef8aeb0669" (UID: "c41ae7e7-b224-431f-92c6-faef8aeb0669"). InnerVolumeSpecName "kube-api-access-mf688". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:53:05.364368 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.364115 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "c41ae7e7-b224-431f-92c6-faef8aeb0669" (UID: "c41ae7e7-b224-431f-92c6-faef8aeb0669"). 
InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:53:05.364368 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.364274 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-config" (OuterVolumeSpecName: "config") pod "c41ae7e7-b224-431f-92c6-faef8aeb0669" (UID: "c41ae7e7-b224-431f-92c6-faef8aeb0669"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:53:05.364618 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.364586 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "c41ae7e7-b224-431f-92c6-faef8aeb0669" (UID: "c41ae7e7-b224-431f-92c6-faef8aeb0669"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:53:05.365430 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.365165 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c41ae7e7-b224-431f-92c6-faef8aeb0669-config-out" (OuterVolumeSpecName: "config-out") pod "c41ae7e7-b224-431f-92c6-faef8aeb0669" (UID: "c41ae7e7-b224-431f-92c6-faef8aeb0669"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:53:05.365430 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.365315 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c41ae7e7-b224-431f-92c6-faef8aeb0669-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c41ae7e7-b224-431f-92c6-faef8aeb0669" (UID: "c41ae7e7-b224-431f-92c6-faef8aeb0669"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:53:05.365430 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.365315 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "c41ae7e7-b224-431f-92c6-faef8aeb0669" (UID: "c41ae7e7-b224-431f-92c6-faef8aeb0669"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:53:05.366023 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.365980 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "c41ae7e7-b224-431f-92c6-faef8aeb0669" (UID: "c41ae7e7-b224-431f-92c6-faef8aeb0669"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:53:05.366584 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.366556 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "c41ae7e7-b224-431f-92c6-faef8aeb0669" (UID: "c41ae7e7-b224-431f-92c6-faef8aeb0669"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:53:05.366584 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.366571 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "c41ae7e7-b224-431f-92c6-faef8aeb0669" (UID: "c41ae7e7-b224-431f-92c6-faef8aeb0669"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:53:05.366782 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.366761 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "c41ae7e7-b224-431f-92c6-faef8aeb0669" (UID: "c41ae7e7-b224-431f-92c6-faef8aeb0669"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:53:05.367334 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.367295 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "c41ae7e7-b224-431f-92c6-faef8aeb0669" (UID: "c41ae7e7-b224-431f-92c6-faef8aeb0669"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:53:05.374961 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.374931 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-web-config" (OuterVolumeSpecName: "web-config") pod "c41ae7e7-b224-431f-92c6-faef8aeb0669" (UID: "c41ae7e7-b224-431f-92c6-faef8aeb0669"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:53:05.462439 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.462371 2555 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:05.462439 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.462398 2555 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-web-config\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:05.462439 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.462437 2555 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-configmap-metrics-client-ca\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:05.462439 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.462447 2555 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-prometheus-trusted-ca-bundle\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:05.462696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.462457 2555 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-config\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:05.462696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.462467 2555 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-configmap-kubelet-serving-ca-bundle\") on node 
\"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:05.462696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.462477 2555 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-metrics-client-certs\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:05.462696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.462487 2555 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-thanos-prometheus-http-client-file\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:05.462696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.462496 2555 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-grpc-tls\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:05.462696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.462505 2555 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c41ae7e7-b224-431f-92c6-faef8aeb0669-tls-assets\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:05.462696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.462523 2555 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-kube-rbac-proxy\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:05.462696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.462531 2555 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c41ae7e7-b224-431f-92c6-faef8aeb0669-config-out\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 
20:53:05.462696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.462542 2555 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-prometheus-k8s-tls\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:05.462696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.462550 2555 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:05.462696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.462561 2555 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41ae7e7-b224-431f-92c6-faef8aeb0669-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:05.462696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.462569 2555 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mf688\" (UniqueName: \"kubernetes.io/projected/c41ae7e7-b224-431f-92c6-faef8aeb0669-kube-api-access-mf688\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:05.462696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.462579 2555 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c41ae7e7-b224-431f-92c6-faef8aeb0669-prometheus-k8s-db\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:53:05.462696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.462587 2555 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c41ae7e7-b224-431f-92c6-faef8aeb0669-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-133-205.ec2.internal\" 
DevicePath \"\"" May 11 20:53:05.969650 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.969605 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq" event={"ID":"23302639-969b-4a5a-bd5b-614af2afac30","Type":"ContainerStarted","Data":"7059db5dad1f3a3f1142e584dabe9e44af35b85e2f1718b9dc9efbe240404b21"} May 11 20:53:05.972980 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.972954 2555 generic.go:358] "Generic (PLEG): container finished" podID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerID="9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2" exitCode=0 May 11 20:53:05.972980 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.972980 2555 generic.go:358] "Generic (PLEG): container finished" podID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerID="6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45" exitCode=0 May 11 20:53:05.973163 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.972988 2555 generic.go:358] "Generic (PLEG): container finished" podID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerID="2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7" exitCode=0 May 11 20:53:05.973163 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.972996 2555 generic.go:358] "Generic (PLEG): container finished" podID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerID="4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17" exitCode=0 May 11 20:53:05.973163 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.973003 2555 generic.go:358] "Generic (PLEG): container finished" podID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerID="3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339" exitCode=0 May 11 20:53:05.973163 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.973014 2555 generic.go:358] "Generic (PLEG): container finished" podID="c41ae7e7-b224-431f-92c6-faef8aeb0669" 
containerID="5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea" exitCode=0 May 11 20:53:05.973163 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.973043 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c41ae7e7-b224-431f-92c6-faef8aeb0669","Type":"ContainerDied","Data":"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2"} May 11 20:53:05.973163 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.973076 2555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:05.973163 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.973096 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c41ae7e7-b224-431f-92c6-faef8aeb0669","Type":"ContainerDied","Data":"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45"} May 11 20:53:05.973163 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.973115 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c41ae7e7-b224-431f-92c6-faef8aeb0669","Type":"ContainerDied","Data":"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7"} May 11 20:53:05.973163 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.973130 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c41ae7e7-b224-431f-92c6-faef8aeb0669","Type":"ContainerDied","Data":"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17"} May 11 20:53:05.973163 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.973145 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c41ae7e7-b224-431f-92c6-faef8aeb0669","Type":"ContainerDied","Data":"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339"} May 11 20:53:05.973163 ip-10-0-133-205 
kubenswrapper[2555]: I0511 20:53:05.973159 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c41ae7e7-b224-431f-92c6-faef8aeb0669","Type":"ContainerDied","Data":"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea"} May 11 20:53:05.973555 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.973173 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c41ae7e7-b224-431f-92c6-faef8aeb0669","Type":"ContainerDied","Data":"3c245c7a19899544ffab58f485e5be824e379fa8210ef470aa040065f386c764"} May 11 20:53:05.973555 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.973180 2555 scope.go:117] "RemoveContainer" containerID="9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2" May 11 20:53:05.984465 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.984443 2555 scope.go:117] "RemoveContainer" containerID="6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45" May 11 20:53:05.993584 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:05.993399 2555 scope.go:117] "RemoveContainer" containerID="2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7" May 11 20:53:06.000070 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.000048 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] May 11 20:53:06.002445 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.002224 2555 scope.go:117] "RemoveContainer" containerID="4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17" May 11 20:53:06.004741 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.004720 2555 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] May 11 20:53:06.010896 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.010873 2555 scope.go:117] "RemoveContainer" containerID="3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339" May 11 20:53:06.018333 
ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.018316 2555 scope.go:117] "RemoveContainer" containerID="5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea" May 11 20:53:06.026883 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.026852 2555 scope.go:117] "RemoveContainer" containerID="43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46" May 11 20:53:06.030716 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.030694 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] May 11 20:53:06.031152 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.031136 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="init-config-reloader" May 11 20:53:06.031247 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.031154 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="init-config-reloader" May 11 20:53:06.031247 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.031170 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="kube-rbac-proxy" May 11 20:53:06.031247 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.031179 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="kube-rbac-proxy" May 11 20:53:06.031247 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.031195 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="kube-rbac-proxy-thanos" May 11 20:53:06.031247 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.031204 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="kube-rbac-proxy-thanos" May 11 20:53:06.031247 ip-10-0-133-205 kubenswrapper[2555]: I0511 
20:53:06.031247 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="prometheus" May 11 20:53:06.031694 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.031257 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="prometheus" May 11 20:53:06.031694 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.031269 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="thanos-sidecar" May 11 20:53:06.031694 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.031276 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="thanos-sidecar" May 11 20:53:06.031694 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.031294 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="config-reloader" May 11 20:53:06.031694 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.031301 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="config-reloader" May 11 20:53:06.031694 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.031312 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="kube-rbac-proxy-web" May 11 20:53:06.031694 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.031320 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="kube-rbac-proxy-web" May 11 20:53:06.031694 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.031441 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="prometheus" May 11 20:53:06.031694 ip-10-0-133-205 
kubenswrapper[2555]: I0511 20:53:06.031458 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="kube-rbac-proxy-thanos" May 11 20:53:06.031694 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.031474 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="thanos-sidecar" May 11 20:53:06.031694 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.031486 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="config-reloader" May 11 20:53:06.031694 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.031527 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="kube-rbac-proxy" May 11 20:53:06.031694 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.031536 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" containerName="kube-rbac-proxy-web" May 11 20:53:06.036374 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.036358 2555 scope.go:117] "RemoveContainer" containerID="9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2" May 11 20:53:06.036739 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:53:06.036715 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2\": container with ID starting with 9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2 not found: ID does not exist" containerID="9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2" May 11 20:53:06.036822 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.036751 2555 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2"} err="failed to get container status \"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2\": rpc error: code = NotFound desc = could not find container \"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2\": container with ID starting with 9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2 not found: ID does not exist" May 11 20:53:06.036822 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.036776 2555 scope.go:117] "RemoveContainer" containerID="6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45" May 11 20:53:06.037082 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:53:06.037048 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45\": container with ID starting with 6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45 not found: ID does not exist" containerID="6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45" May 11 20:53:06.037150 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.037080 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.037150 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.037086 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45"} err="failed to get container status \"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45\": rpc error: code = NotFound desc = could not find container \"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45\": container with ID starting with 6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45 not found: ID does not exist" May 11 20:53:06.037150 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.037100 2555 scope.go:117] "RemoveContainer" containerID="2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7" May 11 20:53:06.042460 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:53:06.042145 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7\": container with ID starting with 2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7 not found: ID does not exist" containerID="2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7" May 11 20:53:06.042460 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.042192 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7"} err="failed to get container status \"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7\": rpc error: code = NotFound desc = could not find container \"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7\": container with ID starting with 2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7 not found: ID does not exist" May 11 20:53:06.042460 
ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.042215 2555 scope.go:117] "RemoveContainer" containerID="4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:53:06.042499 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17\": container with ID starting with 4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17 not found: ID does not exist" containerID="4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.042499 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.042536 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.042538 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17"} err="failed to get container status \"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17\": rpc error: code = NotFound desc = could not find container \"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17\": container with ID starting with 4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17 not found: ID does not exist" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.042561 2555 scope.go:117] "RemoveContainer" containerID="3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.042593 
2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-f0r19f6ett59e\"" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.042624 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.042694 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.042696 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.042493 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.042694 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.042836 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.042926 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:53:06.043016 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339\": container with ID starting with 
3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339 not found: ID does not exist" containerID="3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.043055 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339"} err="failed to get container status \"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339\": rpc error: code = NotFound desc = could not find container \"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339\": container with ID starting with 3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339 not found: ID does not exist" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.043077 2555 scope.go:117] "RemoveContainer" containerID="5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.043265 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-4gcn8\"" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:53:06.043393 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea\": container with ID starting with 5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea not found: ID does not exist" containerID="5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.043515 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea"} err="failed to get container status 
\"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea\": rpc error: code = NotFound desc = could not find container \"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea\": container with ID starting with 5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea not found: ID does not exist" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.043540 2555 scope.go:117] "RemoveContainer" containerID="43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.043702 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" May 11 20:53:06.044712 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:53:06.044359 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46\": container with ID starting with 43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46 not found: ID does not exist" containerID="43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46" May 11 20:53:06.046389 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.044389 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46"} err="failed to get container status \"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46\": rpc error: code = NotFound desc = could not find container \"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46\": container with ID starting with 43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46 not found: ID does not exist" May 11 20:53:06.046389 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.044466 2555 scope.go:117] "RemoveContainer" 
containerID="9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2" May 11 20:53:06.046389 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.045144 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2"} err="failed to get container status \"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2\": rpc error: code = NotFound desc = could not find container \"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2\": container with ID starting with 9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2 not found: ID does not exist" May 11 20:53:06.046389 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.045170 2555 scope.go:117] "RemoveContainer" containerID="6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45" May 11 20:53:06.046389 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.045958 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45"} err="failed to get container status \"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45\": rpc error: code = NotFound desc = could not find container \"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45\": container with ID starting with 6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45 not found: ID does not exist" May 11 20:53:06.046389 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.045982 2555 scope.go:117] "RemoveContainer" containerID="2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7" May 11 20:53:06.047056 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.047031 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" May 11 20:53:06.047655 ip-10-0-133-205 kubenswrapper[2555]: I0511 
20:53:06.047632 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7"} err="failed to get container status \"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7\": rpc error: code = NotFound desc = could not find container \"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7\": container with ID starting with 2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7 not found: ID does not exist" May 11 20:53:06.047655 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.047653 2555 scope.go:117] "RemoveContainer" containerID="4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17" May 11 20:53:06.047965 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.047946 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17"} err="failed to get container status \"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17\": rpc error: code = NotFound desc = could not find container \"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17\": container with ID starting with 4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17 not found: ID does not exist" May 11 20:53:06.047965 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.047965 2555 scope.go:117] "RemoveContainer" containerID="3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339" May 11 20:53:06.048205 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.048186 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339"} err="failed to get container status \"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339\": rpc error: code = NotFound desc = could not find container 
\"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339\": container with ID starting with 3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339 not found: ID does not exist" May 11 20:53:06.048205 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.048205 2555 scope.go:117] "RemoveContainer" containerID="5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea" May 11 20:53:06.049070 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.048475 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea"} err="failed to get container status \"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea\": rpc error: code = NotFound desc = could not find container \"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea\": container with ID starting with 5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea not found: ID does not exist" May 11 20:53:06.049070 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.048498 2555 scope.go:117] "RemoveContainer" containerID="43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46" May 11 20:53:06.049070 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.048796 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46"} err="failed to get container status \"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46\": rpc error: code = NotFound desc = could not find container \"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46\": container with ID starting with 43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46 not found: ID does not exist" May 11 20:53:06.049070 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.048819 2555 scope.go:117] "RemoveContainer" 
containerID="9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2" May 11 20:53:06.049070 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.048839 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] May 11 20:53:06.049070 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.049067 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2"} err="failed to get container status \"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2\": rpc error: code = NotFound desc = could not find container \"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2\": container with ID starting with 9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2 not found: ID does not exist" May 11 20:53:06.049447 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.049111 2555 scope.go:117] "RemoveContainer" containerID="6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45" May 11 20:53:06.049630 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.049581 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45"} err="failed to get container status \"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45\": rpc error: code = NotFound desc = could not find container \"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45\": container with ID starting with 6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45 not found: ID does not exist" May 11 20:53:06.049748 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.049735 2555 scope.go:117] "RemoveContainer" containerID="2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7" May 11 20:53:06.049873 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.049852 2555 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" May 11 20:53:06.050253 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.050226 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7"} err="failed to get container status \"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7\": rpc error: code = NotFound desc = could not find container \"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7\": container with ID starting with 2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7 not found: ID does not exist" May 11 20:53:06.050345 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.050253 2555 scope.go:117] "RemoveContainer" containerID="4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17" May 11 20:53:06.050755 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.050727 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17"} err="failed to get container status \"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17\": rpc error: code = NotFound desc = could not find container \"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17\": container with ID starting with 4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17 not found: ID does not exist" May 11 20:53:06.050854 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.050756 2555 scope.go:117] "RemoveContainer" containerID="3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339" May 11 20:53:06.051543 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.051517 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339"} err="failed to get 
container status \"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339\": rpc error: code = NotFound desc = could not find container \"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339\": container with ID starting with 3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339 not found: ID does not exist" May 11 20:53:06.051626 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.051545 2555 scope.go:117] "RemoveContainer" containerID="5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea" May 11 20:53:06.051806 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.051784 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea"} err="failed to get container status \"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea\": rpc error: code = NotFound desc = could not find container \"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea\": container with ID starting with 5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea not found: ID does not exist" May 11 20:53:06.051806 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.051806 2555 scope.go:117] "RemoveContainer" containerID="43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46" May 11 20:53:06.052061 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.052042 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46"} err="failed to get container status \"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46\": rpc error: code = NotFound desc = could not find container \"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46\": container with ID starting with 43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46 not found: ID does not exist" May 11 20:53:06.052142 
ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.052062 2555 scope.go:117] "RemoveContainer" containerID="9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2" May 11 20:53:06.052353 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.052331 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2"} err="failed to get container status \"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2\": rpc error: code = NotFound desc = could not find container \"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2\": container with ID starting with 9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2 not found: ID does not exist" May 11 20:53:06.052503 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.052491 2555 scope.go:117] "RemoveContainer" containerID="6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45" May 11 20:53:06.052980 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.052790 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45"} err="failed to get container status \"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45\": rpc error: code = NotFound desc = could not find container \"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45\": container with ID starting with 6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45 not found: ID does not exist" May 11 20:53:06.052980 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.052984 2555 scope.go:117] "RemoveContainer" containerID="2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7" May 11 20:53:06.053311 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.053232 2555 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7"} err="failed to get container status \"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7\": rpc error: code = NotFound desc = could not find container \"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7\": container with ID starting with 2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7 not found: ID does not exist" May 11 20:53:06.053311 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.053256 2555 scope.go:117] "RemoveContainer" containerID="4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17" May 11 20:53:06.053696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.053665 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17"} err="failed to get container status \"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17\": rpc error: code = NotFound desc = could not find container \"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17\": container with ID starting with 4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17 not found: ID does not exist" May 11 20:53:06.053696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.053698 2555 scope.go:117] "RemoveContainer" containerID="3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339" May 11 20:53:06.053923 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.053904 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339"} err="failed to get container status \"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339\": rpc error: code = NotFound desc = could not find container \"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339\": container with ID starting with 
3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339 not found: ID does not exist" May 11 20:53:06.054003 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.053925 2555 scope.go:117] "RemoveContainer" containerID="5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea" May 11 20:53:06.054192 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.054154 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea"} err="failed to get container status \"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea\": rpc error: code = NotFound desc = could not find container \"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea\": container with ID starting with 5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea not found: ID does not exist" May 11 20:53:06.054192 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.054174 2555 scope.go:117] "RemoveContainer" containerID="43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46" May 11 20:53:06.054444 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.054392 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46"} err="failed to get container status \"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46\": rpc error: code = NotFound desc = could not find container \"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46\": container with ID starting with 43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46 not found: ID does not exist" May 11 20:53:06.054444 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.054444 2555 scope.go:117] "RemoveContainer" containerID="9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2" May 11 20:53:06.054788 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.054766 2555 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2"} err="failed to get container status \"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2\": rpc error: code = NotFound desc = could not find container \"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2\": container with ID starting with 9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2 not found: ID does not exist" May 11 20:53:06.054788 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.054788 2555 scope.go:117] "RemoveContainer" containerID="6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45" May 11 20:53:06.055051 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.055031 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45"} err="failed to get container status \"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45\": rpc error: code = NotFound desc = could not find container \"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45\": container with ID starting with 6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45 not found: ID does not exist" May 11 20:53:06.055051 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.055050 2555 scope.go:117] "RemoveContainer" containerID="2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7" May 11 20:53:06.055347 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.055322 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7"} err="failed to get container status \"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7\": rpc error: code = NotFound desc = could not find container 
\"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7\": container with ID starting with 2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7 not found: ID does not exist" May 11 20:53:06.055440 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.055348 2555 scope.go:117] "RemoveContainer" containerID="4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17" May 11 20:53:06.055606 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.055583 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17"} err="failed to get container status \"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17\": rpc error: code = NotFound desc = could not find container \"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17\": container with ID starting with 4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17 not found: ID does not exist" May 11 20:53:06.055670 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.055610 2555 scope.go:117] "RemoveContainer" containerID="3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339" May 11 20:53:06.055989 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.055967 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339"} err="failed to get container status \"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339\": rpc error: code = NotFound desc = could not find container \"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339\": container with ID starting with 3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339 not found: ID does not exist" May 11 20:53:06.056056 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.055991 2555 scope.go:117] "RemoveContainer" 
containerID="5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea" May 11 20:53:06.056250 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.056224 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea"} err="failed to get container status \"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea\": rpc error: code = NotFound desc = could not find container \"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea\": container with ID starting with 5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea not found: ID does not exist" May 11 20:53:06.056325 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.056251 2555 scope.go:117] "RemoveContainer" containerID="43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46" May 11 20:53:06.056541 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.056513 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46"} err="failed to get container status \"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46\": rpc error: code = NotFound desc = could not find container \"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46\": container with ID starting with 43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46 not found: ID does not exist" May 11 20:53:06.056647 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.056542 2555 scope.go:117] "RemoveContainer" containerID="9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2" May 11 20:53:06.056824 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.056801 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2"} err="failed to get container status 
\"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2\": rpc error: code = NotFound desc = could not find container \"9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2\": container with ID starting with 9dcfc13c05205280217aa639640697b1d9c975ba38153e8bb01c74f1e1a08cc2 not found: ID does not exist" May 11 20:53:06.056902 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.056827 2555 scope.go:117] "RemoveContainer" containerID="6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45" May 11 20:53:06.057051 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.057028 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45"} err="failed to get container status \"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45\": rpc error: code = NotFound desc = could not find container \"6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45\": container with ID starting with 6d518304fb1185bb10b3ee79eb3fc7d210ab09534b63173a24495982f272df45 not found: ID does not exist" May 11 20:53:06.057109 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.057053 2555 scope.go:117] "RemoveContainer" containerID="2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7" May 11 20:53:06.057309 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.057290 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7"} err="failed to get container status \"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7\": rpc error: code = NotFound desc = could not find container \"2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7\": container with ID starting with 2bc5c6843eeafc0d22a14e913467370d5cb5abf44b79694d57983b1df6cc19c7 not found: ID does not exist" May 11 20:53:06.057378 ip-10-0-133-205 
kubenswrapper[2555]: I0511 20:53:06.057310 2555 scope.go:117] "RemoveContainer" containerID="4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17" May 11 20:53:06.057602 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.057585 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17"} err="failed to get container status \"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17\": rpc error: code = NotFound desc = could not find container \"4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17\": container with ID starting with 4ec1e86787fa7f4801f243da5bc063bb221c2ce429876b7cfdd29aa362c53c17 not found: ID does not exist" May 11 20:53:06.057666 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.057604 2555 scope.go:117] "RemoveContainer" containerID="3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339" May 11 20:53:06.057850 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.057830 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339"} err="failed to get container status \"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339\": rpc error: code = NotFound desc = could not find container \"3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339\": container with ID starting with 3422eba3a595d211e985053f40903dc9eddd9b233ac3607db0c7d2b3c2c20339 not found: ID does not exist" May 11 20:53:06.057918 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.057853 2555 scope.go:117] "RemoveContainer" containerID="5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea" May 11 20:53:06.058094 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.058075 2555 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea"} err="failed to get container status \"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea\": rpc error: code = NotFound desc = could not find container \"5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea\": container with ID starting with 5c85f51bd1e4fda6167f1bb08c2d2f2c9fffc9c85ef1c99776423cdb0932b6ea not found: ID does not exist" May 11 20:53:06.058168 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.058095 2555 scope.go:117] "RemoveContainer" containerID="43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46" May 11 20:53:06.058388 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.058366 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46"} err="failed to get container status \"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46\": rpc error: code = NotFound desc = could not find container \"43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46\": container with ID starting with 43d2b881dae87a9b3eaeca2382659176172d72efa546253afeb38a6a7d2b7b46 not found: ID does not exist" May 11 20:53:06.168647 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.168600 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-config\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.168647 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.168646 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.168860 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.168669 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.168860 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.168761 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9dd9a319-76ec-4631-b200-211f0174ad87-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.168860 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.168805 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9dd9a319-76ec-4631-b200-211f0174ad87-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.168860 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.168844 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.169012 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.168871 2555 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.169012 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.168915 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9dd9a319-76ec-4631-b200-211f0174ad87-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.169012 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.168948 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.169012 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.168999 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.169172 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.169015 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9dd9a319-76ec-4631-b200-211f0174ad87-configmap-serving-certs-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.169172 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.169036 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.169172 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.169061 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9dd9a319-76ec-4631-b200-211f0174ad87-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.169172 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.169118 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjwxg\" (UniqueName: \"kubernetes.io/projected/9dd9a319-76ec-4631-b200-211f0174ad87-kube-api-access-kjwxg\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.169344 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.169219 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9dd9a319-76ec-4631-b200-211f0174ad87-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.169344 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.169250 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9dd9a319-76ec-4631-b200-211f0174ad87-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.169344 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.169298 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.169469 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.169353 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9dd9a319-76ec-4631-b200-211f0174ad87-config-out\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.270637 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.270589 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9dd9a319-76ec-4631-b200-211f0174ad87-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.270637 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.270647 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.270851 ip-10-0-133-205 
kubenswrapper[2555]: I0511 20:53:06.270680 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.270851 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.270695 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9dd9a319-76ec-4631-b200-211f0174ad87-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.270851 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.270715 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.270851 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.270733 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9dd9a319-76ec-4631-b200-211f0174ad87-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.271079 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.271051 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjwxg\" (UniqueName: \"kubernetes.io/projected/9dd9a319-76ec-4631-b200-211f0174ad87-kube-api-access-kjwxg\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " 
pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.271163 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.271121 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9dd9a319-76ec-4631-b200-211f0174ad87-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.271163 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.271149 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9dd9a319-76ec-4631-b200-211f0174ad87-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.271276 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.271207 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.271276 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.271234 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9dd9a319-76ec-4631-b200-211f0174ad87-config-out\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.271276 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.271265 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-config\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") 
" pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.271436 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.271291 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-web-config\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.271436 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.271317 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.271436 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.271355 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9dd9a319-76ec-4631-b200-211f0174ad87-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.271436 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.271378 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9dd9a319-76ec-4631-b200-211f0174ad87-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.271808 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.271447 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.271808 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.271479 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.271808 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.271710 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9dd9a319-76ec-4631-b200-211f0174ad87-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.272984 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.272228 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9dd9a319-76ec-4631-b200-211f0174ad87-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.272984 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.272647 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9dd9a319-76ec-4631-b200-211f0174ad87-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.273551 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.273239 2555 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9dd9a319-76ec-4631-b200-211f0174ad87-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.274216 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.274190 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.274216 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.274206 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.274571 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.274367 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.274571 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.274439 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 
20:53:06.274571 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.274456 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-web-config\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.274571 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.271235 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9dd9a319-76ec-4631-b200-211f0174ad87-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.275816 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.275787 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.275977 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.275948 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9dd9a319-76ec-4631-b200-211f0174ad87-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.276230 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.276205 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9dd9a319-76ec-4631-b200-211f0174ad87-config-out\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.276731 
ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.276714 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.277051 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.277029 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.277122 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.277025 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9dd9a319-76ec-4631-b200-211f0174ad87-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.277457 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.277442 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9dd9a319-76ec-4631-b200-211f0174ad87-config\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.281351 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.281331 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjwxg\" (UniqueName: \"kubernetes.io/projected/9dd9a319-76ec-4631-b200-211f0174ad87-kube-api-access-kjwxg\") pod \"prometheus-k8s-0\" (UID: \"9dd9a319-76ec-4631-b200-211f0174ad87\") " pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.355086 ip-10-0-133-205 
kubenswrapper[2555]: I0511 20:53:06.355038 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" May 11 20:53:06.504514 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.504395 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] May 11 20:53:06.928775 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:53:06.928726 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dd9a319_76ec_4631_b200_211f0174ad87.slice/crio-00ead3e933023f5d1baa14209f69201502d91ade1029da7a500adb40d41b2177 WatchSource:0}: Error finding container 00ead3e933023f5d1baa14209f69201502d91ade1029da7a500adb40d41b2177: Status 404 returned error can't find the container with id 00ead3e933023f5d1baa14209f69201502d91ade1029da7a500adb40d41b2177 May 11 20:53:06.977809 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:06.977780 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9dd9a319-76ec-4631-b200-211f0174ad87","Type":"ContainerStarted","Data":"00ead3e933023f5d1baa14209f69201502d91ade1029da7a500adb40d41b2177"} May 11 20:53:07.360049 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:07.360015 2555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c41ae7e7-b224-431f-92c6-faef8aeb0669" path="/var/lib/kubelet/pods/c41ae7e7-b224-431f-92c6-faef8aeb0669/volumes" May 11 20:53:07.983160 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:07.983119 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq" event={"ID":"23302639-969b-4a5a-bd5b-614af2afac30","Type":"ContainerStarted","Data":"adce8bc74147810ab9ee07b2480b38a574075668be8cdce4a5812549dc440349"} May 11 20:53:07.983160 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:07.983166 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq" event={"ID":"23302639-969b-4a5a-bd5b-614af2afac30","Type":"ContainerStarted","Data":"3556cfa244c4c3846197b892610e2771feff3896f9eead3a1ef6fb2618f64298"} May 11 20:53:07.983789 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:07.983181 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq" event={"ID":"23302639-969b-4a5a-bd5b-614af2afac30","Type":"ContainerStarted","Data":"5699dba2e0a5b81d626247ee4a71f4b9d8491abe6015a7403de0a035bd99354a"} May 11 20:53:07.984460 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:07.984438 2555 generic.go:358] "Generic (PLEG): container finished" podID="9dd9a319-76ec-4631-b200-211f0174ad87" containerID="01f5f56802cac4f3a09fb2707cf7d366b817ce99f7ad31ae08d315d5f98b5f7d" exitCode=0 May 11 20:53:07.984522 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:07.984502 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9dd9a319-76ec-4631-b200-211f0174ad87","Type":"ContainerDied","Data":"01f5f56802cac4f3a09fb2707cf7d366b817ce99f7ad31ae08d315d5f98b5f7d"} May 11 20:53:08.006374 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:08.006290 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-7b57cbb8cd-gjvmq" podStartSLOduration=2.255635955 podStartE2EDuration="4.006271731s" podCreationTimestamp="2026-05-11 20:53:04 +0000 UTC" firstStartedPulling="2026-05-11 20:53:05.229179731 +0000 UTC m=+130.441122865" lastFinishedPulling="2026-05-11 20:53:06.979815508 +0000 UTC m=+132.191758641" observedRunningTime="2026-05-11 20:53:08.005070515 +0000 UTC m=+133.217013700" watchObservedRunningTime="2026-05-11 20:53:08.006271731 +0000 UTC m=+133.218214882" May 11 20:53:08.992213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:08.992093 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9dd9a319-76ec-4631-b200-211f0174ad87","Type":"ContainerStarted","Data":"43d7b9843151e8070f676e35cbbc620ae89d43342ee7fed9975659c0ab31f855"} May 11 20:53:08.992213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:08.992140 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9dd9a319-76ec-4631-b200-211f0174ad87","Type":"ContainerStarted","Data":"b7942eec88d531a3c4fc0d5f2dbe94f2651457a12ccdf8da7d508033399a257b"} May 11 20:53:08.992213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:08.992154 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9dd9a319-76ec-4631-b200-211f0174ad87","Type":"ContainerStarted","Data":"056cd3cd97c166fb1dc345706530a3d7bbe5621e48542ae5312957a2fefb5c46"} May 11 20:53:08.992213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:08.992166 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9dd9a319-76ec-4631-b200-211f0174ad87","Type":"ContainerStarted","Data":"fba3916496dbf9fa0d2548569962235d15045547a124d9bf6ca3fa2afc8333ef"} May 11 20:53:08.992213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:08.992180 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9dd9a319-76ec-4631-b200-211f0174ad87","Type":"ContainerStarted","Data":"a98d3034fd726109eb3a9c2f07a73a500ad930a7a697e4e33f74ce0457481dde"} May 11 20:53:08.992213 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:08.992192 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9dd9a319-76ec-4631-b200-211f0174ad87","Type":"ContainerStarted","Data":"bef6aa351b65d41bb9c847fdf3b6405a181b2831454c0e89216312fbc2721309"} May 11 20:53:09.019835 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:09.019775 2555 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.0197575739999998 podStartE2EDuration="3.019757574s" podCreationTimestamp="2026-05-11 20:53:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:53:09.017186864 +0000 UTC m=+134.229130030" watchObservedRunningTime="2026-05-11 20:53:09.019757574 +0000 UTC m=+134.231700757"
May 11 20:53:11.355627 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:53:11.355584 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
May 11 20:54:06.355314 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:06.355273 2555 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
May 11 20:54:06.372728 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:06.372702 2555 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
May 11 20:54:07.175824 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:07.175796 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
May 11 20:54:42.182399 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:42.182368 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-kdvpw"]
May 11 20:54:42.185687 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:42.185666 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdvpw"
May 11 20:54:42.188390 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:42.188366 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
May 11 20:54:42.201098 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:42.201065 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kdvpw"]
May 11 20:54:42.297611 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:42.297573 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d88f86a6-6037-4ce7-badc-84d3ac37a7ce-original-pull-secret\") pod \"global-pull-secret-syncer-kdvpw\" (UID: \"d88f86a6-6037-4ce7-badc-84d3ac37a7ce\") " pod="kube-system/global-pull-secret-syncer-kdvpw"
May 11 20:54:42.297611 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:42.297612 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d88f86a6-6037-4ce7-badc-84d3ac37a7ce-dbus\") pod \"global-pull-secret-syncer-kdvpw\" (UID: \"d88f86a6-6037-4ce7-badc-84d3ac37a7ce\") " pod="kube-system/global-pull-secret-syncer-kdvpw"
May 11 20:54:42.297831 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:42.297640 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d88f86a6-6037-4ce7-badc-84d3ac37a7ce-kubelet-config\") pod \"global-pull-secret-syncer-kdvpw\" (UID: \"d88f86a6-6037-4ce7-badc-84d3ac37a7ce\") " pod="kube-system/global-pull-secret-syncer-kdvpw"
May 11 20:54:42.399083 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:42.399046 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d88f86a6-6037-4ce7-badc-84d3ac37a7ce-original-pull-secret\") pod \"global-pull-secret-syncer-kdvpw\" (UID: \"d88f86a6-6037-4ce7-badc-84d3ac37a7ce\") " pod="kube-system/global-pull-secret-syncer-kdvpw"
May 11 20:54:42.399083 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:42.399083 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d88f86a6-6037-4ce7-badc-84d3ac37a7ce-dbus\") pod \"global-pull-secret-syncer-kdvpw\" (UID: \"d88f86a6-6037-4ce7-badc-84d3ac37a7ce\") " pod="kube-system/global-pull-secret-syncer-kdvpw"
May 11 20:54:42.399291 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:42.399104 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d88f86a6-6037-4ce7-badc-84d3ac37a7ce-kubelet-config\") pod \"global-pull-secret-syncer-kdvpw\" (UID: \"d88f86a6-6037-4ce7-badc-84d3ac37a7ce\") " pod="kube-system/global-pull-secret-syncer-kdvpw"
May 11 20:54:42.399291 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:42.399229 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d88f86a6-6037-4ce7-badc-84d3ac37a7ce-kubelet-config\") pod \"global-pull-secret-syncer-kdvpw\" (UID: \"d88f86a6-6037-4ce7-badc-84d3ac37a7ce\") " pod="kube-system/global-pull-secret-syncer-kdvpw"
May 11 20:54:42.399291 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:42.399273 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d88f86a6-6037-4ce7-badc-84d3ac37a7ce-dbus\") pod \"global-pull-secret-syncer-kdvpw\" (UID: \"d88f86a6-6037-4ce7-badc-84d3ac37a7ce\") " pod="kube-system/global-pull-secret-syncer-kdvpw"
May 11 20:54:42.401733 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:42.401709 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d88f86a6-6037-4ce7-badc-84d3ac37a7ce-original-pull-secret\") pod \"global-pull-secret-syncer-kdvpw\" (UID: \"d88f86a6-6037-4ce7-badc-84d3ac37a7ce\") " pod="kube-system/global-pull-secret-syncer-kdvpw"
May 11 20:54:42.494301 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:42.494270 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdvpw"
May 11 20:54:42.621208 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:42.621118 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kdvpw"]
May 11 20:54:42.624035 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:54:42.624005 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd88f86a6_6037_4ce7_badc_84d3ac37a7ce.slice/crio-7da8d4ce882dc0df001ec65b8c4887b3a7f4dec55ae3fba85fff92a270f09d37 WatchSource:0}: Error finding container 7da8d4ce882dc0df001ec65b8c4887b3a7f4dec55ae3fba85fff92a270f09d37: Status 404 returned error can't find the container with id 7da8d4ce882dc0df001ec65b8c4887b3a7f4dec55ae3fba85fff92a270f09d37
May 11 20:54:43.262958 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:43.262918 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kdvpw" event={"ID":"d88f86a6-6037-4ce7-badc-84d3ac37a7ce","Type":"ContainerStarted","Data":"7da8d4ce882dc0df001ec65b8c4887b3a7f4dec55ae3fba85fff92a270f09d37"}
May 11 20:54:47.279937 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:47.279894 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kdvpw" event={"ID":"d88f86a6-6037-4ce7-badc-84d3ac37a7ce","Type":"ContainerStarted","Data":"f3088fb5131e333a15aa5fdc608cbda3b102fc8ad6dc958fde0f436ee2527b4e"}
May 11 20:54:47.295145 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:54:47.295097 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-kdvpw" podStartSLOduration=1.543828432 podStartE2EDuration="5.29508244s" podCreationTimestamp="2026-05-11 20:54:42 +0000 UTC" firstStartedPulling="2026-05-11 20:54:42.625816887 +0000 UTC m=+227.837760015" lastFinishedPulling="2026-05-11 20:54:46.377070885 +0000 UTC m=+231.589014023" observedRunningTime="2026-05-11 20:54:47.293797529 +0000 UTC m=+232.505740683" watchObservedRunningTime="2026-05-11 20:54:47.29508244 +0000 UTC m=+232.507025635"
May 11 20:55:47.911150 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:47.911110 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-854jx"]
May 11 20:55:47.913486 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:47.913464 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-854jx"
May 11 20:55:47.916226 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:47.916208 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
May 11 20:55:47.917255 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:47.917237 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
May 11 20:55:47.917364 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:47.917261 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-jg5wj\""
May 11 20:55:47.923098 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:47.923074 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-854jx"]
May 11 20:55:47.948397 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:47.948368 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af9cfff9-3f45-422f-86ef-1b89fbaaafed-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-854jx\" (UID: \"af9cfff9-3f45-422f-86ef-1b89fbaaafed\") " pod="cert-manager/cert-manager-webhook-587ccfb98-854jx"
May 11 20:55:47.948536 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:47.948429 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2msk4\" (UniqueName: \"kubernetes.io/projected/af9cfff9-3f45-422f-86ef-1b89fbaaafed-kube-api-access-2msk4\") pod \"cert-manager-webhook-587ccfb98-854jx\" (UID: \"af9cfff9-3f45-422f-86ef-1b89fbaaafed\") " pod="cert-manager/cert-manager-webhook-587ccfb98-854jx"
May 11 20:55:48.049483 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:48.049454 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af9cfff9-3f45-422f-86ef-1b89fbaaafed-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-854jx\" (UID: \"af9cfff9-3f45-422f-86ef-1b89fbaaafed\") " pod="cert-manager/cert-manager-webhook-587ccfb98-854jx"
May 11 20:55:48.049663 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:48.049497 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2msk4\" (UniqueName: \"kubernetes.io/projected/af9cfff9-3f45-422f-86ef-1b89fbaaafed-kube-api-access-2msk4\") pod \"cert-manager-webhook-587ccfb98-854jx\" (UID: \"af9cfff9-3f45-422f-86ef-1b89fbaaafed\") " pod="cert-manager/cert-manager-webhook-587ccfb98-854jx"
May 11 20:55:48.057857 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:48.057823 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af9cfff9-3f45-422f-86ef-1b89fbaaafed-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-854jx\" (UID: \"af9cfff9-3f45-422f-86ef-1b89fbaaafed\") " pod="cert-manager/cert-manager-webhook-587ccfb98-854jx"
May 11 20:55:48.057995 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:48.057901 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2msk4\" (UniqueName: \"kubernetes.io/projected/af9cfff9-3f45-422f-86ef-1b89fbaaafed-kube-api-access-2msk4\") pod \"cert-manager-webhook-587ccfb98-854jx\" (UID: \"af9cfff9-3f45-422f-86ef-1b89fbaaafed\") " pod="cert-manager/cert-manager-webhook-587ccfb98-854jx"
May 11 20:55:48.241586 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:48.241556 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-854jx"
May 11 20:55:48.369585 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:48.369554 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-854jx"]
May 11 20:55:48.373149 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:55:48.373122 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf9cfff9_3f45_422f_86ef_1b89fbaaafed.slice/crio-08b36e3d827d45b3f945039aefc778473e0886e0f0a58deb377b369b419f1750 WatchSource:0}: Error finding container 08b36e3d827d45b3f945039aefc778473e0886e0f0a58deb377b369b419f1750: Status 404 returned error can't find the container with id 08b36e3d827d45b3f945039aefc778473e0886e0f0a58deb377b369b419f1750
May 11 20:55:48.455185 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:48.455149 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-854jx" event={"ID":"af9cfff9-3f45-422f-86ef-1b89fbaaafed","Type":"ContainerStarted","Data":"08b36e3d827d45b3f945039aefc778473e0886e0f0a58deb377b369b419f1750"}
May 11 20:55:50.479464 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:50.479428 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-4zrtb"]
May 11 20:55:50.486236 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:50.486190 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-4zrtb"
May 11 20:55:50.489687 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:50.488859 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-h6lhj\""
May 11 20:55:50.489687 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:50.489142 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-4zrtb"]
May 11 20:55:50.575685 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:50.575633 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4412abf-d6a7-4bde-bed9-2dd3dee601c4-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-4zrtb\" (UID: \"f4412abf-d6a7-4bde-bed9-2dd3dee601c4\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4zrtb"
May 11 20:55:50.575856 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:50.575727 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv76m\" (UniqueName: \"kubernetes.io/projected/f4412abf-d6a7-4bde-bed9-2dd3dee601c4-kube-api-access-gv76m\") pod \"cert-manager-cainjector-68b757865b-4zrtb\" (UID: \"f4412abf-d6a7-4bde-bed9-2dd3dee601c4\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4zrtb"
May 11 20:55:50.676995 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:50.676964 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gv76m\" (UniqueName: \"kubernetes.io/projected/f4412abf-d6a7-4bde-bed9-2dd3dee601c4-kube-api-access-gv76m\") pod \"cert-manager-cainjector-68b757865b-4zrtb\" (UID: \"f4412abf-d6a7-4bde-bed9-2dd3dee601c4\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4zrtb"
May 11 20:55:50.677168 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:50.677054 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4412abf-d6a7-4bde-bed9-2dd3dee601c4-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-4zrtb\" (UID: \"f4412abf-d6a7-4bde-bed9-2dd3dee601c4\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4zrtb"
May 11 20:55:50.685674 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:50.685650 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4412abf-d6a7-4bde-bed9-2dd3dee601c4-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-4zrtb\" (UID: \"f4412abf-d6a7-4bde-bed9-2dd3dee601c4\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4zrtb"
May 11 20:55:50.685811 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:50.685788 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv76m\" (UniqueName: \"kubernetes.io/projected/f4412abf-d6a7-4bde-bed9-2dd3dee601c4-kube-api-access-gv76m\") pod \"cert-manager-cainjector-68b757865b-4zrtb\" (UID: \"f4412abf-d6a7-4bde-bed9-2dd3dee601c4\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4zrtb"
May 11 20:55:50.801095 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:50.801018 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-4zrtb"
May 11 20:55:51.901917 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:51.901892 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-4zrtb"]
May 11 20:55:51.904393 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:55:51.904363 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4412abf_d6a7_4bde_bed9_2dd3dee601c4.slice/crio-a623f72676e13de34a9d9c6a9465609d9072510a95ea8a5f1a0cb046ca339d9e WatchSource:0}: Error finding container a623f72676e13de34a9d9c6a9465609d9072510a95ea8a5f1a0cb046ca339d9e: Status 404 returned error can't find the container with id a623f72676e13de34a9d9c6a9465609d9072510a95ea8a5f1a0cb046ca339d9e
May 11 20:55:52.470779 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:52.470747 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-854jx" event={"ID":"af9cfff9-3f45-422f-86ef-1b89fbaaafed","Type":"ContainerStarted","Data":"f5fff643732acabf511ad17126ec41d1876b1da96b005ac3adc328ae3e6f5985"}
May 11 20:55:52.471032 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:52.470815 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-854jx"
May 11 20:55:52.472149 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:52.472126 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-4zrtb" event={"ID":"f4412abf-d6a7-4bde-bed9-2dd3dee601c4","Type":"ContainerStarted","Data":"f07b81bb3ab1a2c114310b1d6909799c496932df0de71520e5a6f697150ec97b"}
May 11 20:55:52.472219 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:52.472152 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-4zrtb" event={"ID":"f4412abf-d6a7-4bde-bed9-2dd3dee601c4","Type":"ContainerStarted","Data":"a623f72676e13de34a9d9c6a9465609d9072510a95ea8a5f1a0cb046ca339d9e"}
May 11 20:55:52.487081 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:52.487039 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-854jx" podStartSLOduration=2.061260474 podStartE2EDuration="5.487025465s" podCreationTimestamp="2026-05-11 20:55:47 +0000 UTC" firstStartedPulling="2026-05-11 20:55:48.37528511 +0000 UTC m=+293.587228238" lastFinishedPulling="2026-05-11 20:55:51.801050097 +0000 UTC m=+297.012993229" observedRunningTime="2026-05-11 20:55:52.485541647 +0000 UTC m=+297.697484793" watchObservedRunningTime="2026-05-11 20:55:52.487025465 +0000 UTC m=+297.698968617"
May 11 20:55:52.501775 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:52.501737 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-4zrtb" podStartSLOduration=2.501725251 podStartE2EDuration="2.501725251s" podCreationTimestamp="2026-05-11 20:55:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:55:52.499824101 +0000 UTC m=+297.711767262" watchObservedRunningTime="2026-05-11 20:55:52.501725251 +0000 UTC m=+297.713668400"
May 11 20:55:55.258587 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:55.258553 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/0.log"
May 11 20:55:55.259017 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:55.258784 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/0.log"
May 11 20:55:55.261198 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:55.261179 2555 kubelet.go:1628] "Image garbage collection succeeded"
May 11 20:55:58.477671 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:55:58.477641 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-854jx"
May 11 20:56:18.438046 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.437998 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m"]
May 11 20:56:18.447478 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.447385 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m"
May 11 20:56:18.450698 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.450656 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
May 11 20:56:18.450872 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.450790 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-9zd6v\""
May 11 20:56:18.451073 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.451054 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
May 11 20:56:18.451368 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.451170 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
May 11 20:56:18.451689 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.451669 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
May 11 20:56:18.464997 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.464966 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m"]
May 11 20:56:18.622093 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.622043 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5066f764-081f-4e80-adf0-dcdd1bbd305e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-755c95f69f-9ww5m\" (UID: \"5066f764-081f-4e80-adf0-dcdd1bbd305e\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m"
May 11 20:56:18.622093 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.622096 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5066f764-081f-4e80-adf0-dcdd1bbd305e-webhook-cert\") pod \"opendatahub-operator-controller-manager-755c95f69f-9ww5m\" (UID: \"5066f764-081f-4e80-adf0-dcdd1bbd305e\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m"
May 11 20:56:18.622311 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.622133 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfh7p\" (UniqueName: \"kubernetes.io/projected/5066f764-081f-4e80-adf0-dcdd1bbd305e-kube-api-access-bfh7p\") pod \"opendatahub-operator-controller-manager-755c95f69f-9ww5m\" (UID: \"5066f764-081f-4e80-adf0-dcdd1bbd305e\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m"
May 11 20:56:18.723606 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.723515 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5066f764-081f-4e80-adf0-dcdd1bbd305e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-755c95f69f-9ww5m\" (UID: \"5066f764-081f-4e80-adf0-dcdd1bbd305e\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m"
May 11 20:56:18.723606 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.723564 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5066f764-081f-4e80-adf0-dcdd1bbd305e-webhook-cert\") pod \"opendatahub-operator-controller-manager-755c95f69f-9ww5m\" (UID: \"5066f764-081f-4e80-adf0-dcdd1bbd305e\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m"
May 11 20:56:18.723606 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.723601 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfh7p\" (UniqueName: \"kubernetes.io/projected/5066f764-081f-4e80-adf0-dcdd1bbd305e-kube-api-access-bfh7p\") pod \"opendatahub-operator-controller-manager-755c95f69f-9ww5m\" (UID: \"5066f764-081f-4e80-adf0-dcdd1bbd305e\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m"
May 11 20:56:18.726117 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.726080 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5066f764-081f-4e80-adf0-dcdd1bbd305e-webhook-cert\") pod \"opendatahub-operator-controller-manager-755c95f69f-9ww5m\" (UID: \"5066f764-081f-4e80-adf0-dcdd1bbd305e\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m"
May 11 20:56:18.726230 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.726146 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5066f764-081f-4e80-adf0-dcdd1bbd305e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-755c95f69f-9ww5m\" (UID: \"5066f764-081f-4e80-adf0-dcdd1bbd305e\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m"
May 11 20:56:18.731961 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.731930 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfh7p\" (UniqueName: \"kubernetes.io/projected/5066f764-081f-4e80-adf0-dcdd1bbd305e-kube-api-access-bfh7p\") pod \"opendatahub-operator-controller-manager-755c95f69f-9ww5m\" (UID: \"5066f764-081f-4e80-adf0-dcdd1bbd305e\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m"
May 11 20:56:18.766144 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.766106 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m"
May 11 20:56:18.901925 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.901898 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m"]
May 11 20:56:18.904923 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:56:18.904890 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5066f764_081f_4e80_adf0_dcdd1bbd305e.slice/crio-ee1f7d6b0ac631bcb0b14d8dbacbd1cb2067a73d1a45b531c0b03aa4813167a8 WatchSource:0}: Error finding container ee1f7d6b0ac631bcb0b14d8dbacbd1cb2067a73d1a45b531c0b03aa4813167a8: Status 404 returned error can't find the container with id ee1f7d6b0ac631bcb0b14d8dbacbd1cb2067a73d1a45b531c0b03aa4813167a8
May 11 20:56:18.906497 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:18.906482 2555 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
May 11 20:56:19.569428 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:19.569359 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m" event={"ID":"5066f764-081f-4e80-adf0-dcdd1bbd305e","Type":"ContainerStarted","Data":"ee1f7d6b0ac631bcb0b14d8dbacbd1cb2067a73d1a45b531c0b03aa4813167a8"}
May 11 20:56:21.580347 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:21.580315 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m" event={"ID":"5066f764-081f-4e80-adf0-dcdd1bbd305e","Type":"ContainerStarted","Data":"8212fe1f1023f54872f2c343c4b83858bc2d7e4cf07706e60bfe32aec95e752d"}
May 11 20:56:21.580735 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:21.580445 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m"
May 11 20:56:21.602548 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:21.602494 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m" podStartSLOduration=1.14625263 podStartE2EDuration="3.602476744s" podCreationTimestamp="2026-05-11 20:56:18 +0000 UTC" firstStartedPulling="2026-05-11 20:56:18.906608399 +0000 UTC m=+324.118551527" lastFinishedPulling="2026-05-11 20:56:21.362832513 +0000 UTC m=+326.574775641" observedRunningTime="2026-05-11 20:56:21.601099235 +0000 UTC m=+326.813042411" watchObservedRunningTime="2026-05-11 20:56:21.602476744 +0000 UTC m=+326.814419922"
May 11 20:56:23.746920 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.746879 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95"]
May 11 20:56:23.750671 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.750649 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95"
May 11 20:56:23.754527 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.754473 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
May 11 20:56:23.754527 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.754501 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
May 11 20:56:23.754527 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.754481 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
May 11 20:56:23.754751 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.754544 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-6k2xh\""
May 11 20:56:23.754751 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.754543 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
May 11 20:56:23.754751 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.754572 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
May 11 20:56:23.759914 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.759889 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95"]
May 11 20:56:23.767984 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.767953 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bec15022-9690-465c-b505-e8b5172de2ed-cert\") pod \"lws-controller-manager-68d9b68cf6-46w95\" (UID: \"bec15022-9690-465c-b505-e8b5172de2ed\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95"
May 11 20:56:23.768065 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.767997 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/bec15022-9690-465c-b505-e8b5172de2ed-metrics-cert\") pod \"lws-controller-manager-68d9b68cf6-46w95\" (UID: \"bec15022-9690-465c-b505-e8b5172de2ed\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95"
May 11 20:56:23.768065 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.768017 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zf7s\" (UniqueName: \"kubernetes.io/projected/bec15022-9690-465c-b505-e8b5172de2ed-kube-api-access-9zf7s\") pod \"lws-controller-manager-68d9b68cf6-46w95\" (UID: \"bec15022-9690-465c-b505-e8b5172de2ed\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95"
May 11 20:56:23.768134 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.768119 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bec15022-9690-465c-b505-e8b5172de2ed-manager-config\") pod \"lws-controller-manager-68d9b68cf6-46w95\" (UID: \"bec15022-9690-465c-b505-e8b5172de2ed\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95"
May 11 20:56:23.868745 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.868693 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bec15022-9690-465c-b505-e8b5172de2ed-cert\") pod \"lws-controller-manager-68d9b68cf6-46w95\" (UID: \"bec15022-9690-465c-b505-e8b5172de2ed\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95"
May 11 20:56:23.868932 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.868767 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/bec15022-9690-465c-b505-e8b5172de2ed-metrics-cert\") pod \"lws-controller-manager-68d9b68cf6-46w95\" (UID: \"bec15022-9690-465c-b505-e8b5172de2ed\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95"
May 11 20:56:23.868932 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.868789 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zf7s\" (UniqueName: \"kubernetes.io/projected/bec15022-9690-465c-b505-e8b5172de2ed-kube-api-access-9zf7s\") pod \"lws-controller-manager-68d9b68cf6-46w95\" (UID: \"bec15022-9690-465c-b505-e8b5172de2ed\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95"
May 11 20:56:23.868932 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.868880 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bec15022-9690-465c-b505-e8b5172de2ed-manager-config\") pod \"lws-controller-manager-68d9b68cf6-46w95\" (UID: \"bec15022-9690-465c-b505-e8b5172de2ed\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95"
May 11 20:56:23.869566 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.869524 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bec15022-9690-465c-b505-e8b5172de2ed-manager-config\") pod \"lws-controller-manager-68d9b68cf6-46w95\" (UID: \"bec15022-9690-465c-b505-e8b5172de2ed\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95"
May 11 20:56:23.871348 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.871325 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bec15022-9690-465c-b505-e8b5172de2ed-cert\") pod \"lws-controller-manager-68d9b68cf6-46w95\" (UID: \"bec15022-9690-465c-b505-e8b5172de2ed\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95"
May 11 20:56:23.871444 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.871359 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/bec15022-9690-465c-b505-e8b5172de2ed-metrics-cert\") pod \"lws-controller-manager-68d9b68cf6-46w95\" (UID: \"bec15022-9690-465c-b505-e8b5172de2ed\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95"
May 11 20:56:23.879919 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:23.879897 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zf7s\" (UniqueName: \"kubernetes.io/projected/bec15022-9690-465c-b505-e8b5172de2ed-kube-api-access-9zf7s\") pod \"lws-controller-manager-68d9b68cf6-46w95\" (UID: \"bec15022-9690-465c-b505-e8b5172de2ed\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95"
May 11 20:56:24.061067 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:24.060977 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95"
May 11 20:56:24.208123 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:24.208085 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95"]
May 11 20:56:24.217916 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:56:24.217884 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbec15022_9690_465c_b505_e8b5172de2ed.slice/crio-1419140a58da3bbfb14f1b94382e69175ecd0888d01b3b0e5f0e50fcb43834a4 WatchSource:0}: Error finding container 1419140a58da3bbfb14f1b94382e69175ecd0888d01b3b0e5f0e50fcb43834a4: Status 404 returned error can't find the container with id 1419140a58da3bbfb14f1b94382e69175ecd0888d01b3b0e5f0e50fcb43834a4
May 11 20:56:24.592175 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:24.592144 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95" event={"ID":"bec15022-9690-465c-b505-e8b5172de2ed","Type":"ContainerStarted","Data":"1419140a58da3bbfb14f1b94382e69175ecd0888d01b3b0e5f0e50fcb43834a4"}
May 11 20:56:27.609880 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:27.609842 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95" event={"ID":"bec15022-9690-465c-b505-e8b5172de2ed","Type":"ContainerStarted","Data":"d3a31661c19d1bf24ca02277fd145acda78c15c80642bf3b395d6293bc8f84d7"}
May 11 20:56:27.610275 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:27.609954 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95"
May 11 20:56:27.626592 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:27.626529 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95" podStartSLOduration=1.785483527 podStartE2EDuration="4.626515093s" podCreationTimestamp="2026-05-11 20:56:23 +0000 UTC" firstStartedPulling="2026-05-11 20:56:24.219767775 +0000 UTC m=+329.431710906" lastFinishedPulling="2026-05-11 20:56:27.060799333 +0000 UTC m=+332.272742472" observedRunningTime="2026-05-11 20:56:27.625617452 +0000 UTC m=+332.837560638" watchObservedRunningTime="2026-05-11 20:56:27.626515093 +0000 UTC m=+332.838458243" May 11 20:56:32.586028 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:32.586000 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-9ww5m" May 11 20:56:35.473387 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.473356 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-c8c9857f9-dq6t2"] May 11 20:56:35.476158 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.476142 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-c8c9857f9-dq6t2" May 11 20:56:35.478739 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.478716 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" May 11 20:56:35.478872 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.478721 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" May 11 20:56:35.479908 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.479887 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-mwf9t\"" May 11 20:56:35.480016 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.479893 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" May 11 20:56:35.480016 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.479934 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" May 11 20:56:35.486193 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.486172 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-c8c9857f9-dq6t2"] May 11 20:56:35.576121 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.576086 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85qsl\" (UniqueName: \"kubernetes.io/projected/e6a246bf-3141-4770-86e3-df2186820341-kube-api-access-85qsl\") pod \"kube-auth-proxy-c8c9857f9-dq6t2\" (UID: \"e6a246bf-3141-4770-86e3-df2186820341\") " pod="openshift-ingress/kube-auth-proxy-c8c9857f9-dq6t2" May 11 20:56:35.576282 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.576139 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/e6a246bf-3141-4770-86e3-df2186820341-tmp\") pod \"kube-auth-proxy-c8c9857f9-dq6t2\" (UID: \"e6a246bf-3141-4770-86e3-df2186820341\") " pod="openshift-ingress/kube-auth-proxy-c8c9857f9-dq6t2" May 11 20:56:35.576282 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.576163 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e6a246bf-3141-4770-86e3-df2186820341-tls-certs\") pod \"kube-auth-proxy-c8c9857f9-dq6t2\" (UID: \"e6a246bf-3141-4770-86e3-df2186820341\") " pod="openshift-ingress/kube-auth-proxy-c8c9857f9-dq6t2" May 11 20:56:35.677158 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.677124 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85qsl\" (UniqueName: \"kubernetes.io/projected/e6a246bf-3141-4770-86e3-df2186820341-kube-api-access-85qsl\") pod \"kube-auth-proxy-c8c9857f9-dq6t2\" (UID: \"e6a246bf-3141-4770-86e3-df2186820341\") " pod="openshift-ingress/kube-auth-proxy-c8c9857f9-dq6t2" May 11 20:56:35.677337 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.677176 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e6a246bf-3141-4770-86e3-df2186820341-tmp\") pod \"kube-auth-proxy-c8c9857f9-dq6t2\" (UID: \"e6a246bf-3141-4770-86e3-df2186820341\") " pod="openshift-ingress/kube-auth-proxy-c8c9857f9-dq6t2" May 11 20:56:35.677337 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.677201 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e6a246bf-3141-4770-86e3-df2186820341-tls-certs\") pod \"kube-auth-proxy-c8c9857f9-dq6t2\" (UID: \"e6a246bf-3141-4770-86e3-df2186820341\") " pod="openshift-ingress/kube-auth-proxy-c8c9857f9-dq6t2" May 11 20:56:35.679608 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.679583 2555 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e6a246bf-3141-4770-86e3-df2186820341-tmp\") pod \"kube-auth-proxy-c8c9857f9-dq6t2\" (UID: \"e6a246bf-3141-4770-86e3-df2186820341\") " pod="openshift-ingress/kube-auth-proxy-c8c9857f9-dq6t2" May 11 20:56:35.679890 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.679871 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e6a246bf-3141-4770-86e3-df2186820341-tls-certs\") pod \"kube-auth-proxy-c8c9857f9-dq6t2\" (UID: \"e6a246bf-3141-4770-86e3-df2186820341\") " pod="openshift-ingress/kube-auth-proxy-c8c9857f9-dq6t2" May 11 20:56:35.685920 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.685896 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85qsl\" (UniqueName: \"kubernetes.io/projected/e6a246bf-3141-4770-86e3-df2186820341-kube-api-access-85qsl\") pod \"kube-auth-proxy-c8c9857f9-dq6t2\" (UID: \"e6a246bf-3141-4770-86e3-df2186820341\") " pod="openshift-ingress/kube-auth-proxy-c8c9857f9-dq6t2" May 11 20:56:35.786085 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.786048 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-c8c9857f9-dq6t2" May 11 20:56:35.915661 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:35.915633 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-c8c9857f9-dq6t2"] May 11 20:56:35.918154 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:56:35.918126 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6a246bf_3141_4770_86e3_df2186820341.slice/crio-c35c126d1a6557fc56e4e1d1a79d96997e0ab6a53035e4c241fb847b576df53a WatchSource:0}: Error finding container c35c126d1a6557fc56e4e1d1a79d96997e0ab6a53035e4c241fb847b576df53a: Status 404 returned error can't find the container with id c35c126d1a6557fc56e4e1d1a79d96997e0ab6a53035e4c241fb847b576df53a May 11 20:56:36.647035 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:36.646997 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-c8c9857f9-dq6t2" event={"ID":"e6a246bf-3141-4770-86e3-df2186820341","Type":"ContainerStarted","Data":"c35c126d1a6557fc56e4e1d1a79d96997e0ab6a53035e4c241fb847b576df53a"} May 11 20:56:38.615557 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:38.615519 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-46w95" May 11 20:56:39.662139 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:39.662099 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-c8c9857f9-dq6t2" event={"ID":"e6a246bf-3141-4770-86e3-df2186820341","Type":"ContainerStarted","Data":"9ec3ea480fabe105038f9745d391c1c0dfb7f94c0241d72c2026a566890b378c"} May 11 20:56:39.678875 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:56:39.678828 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-c8c9857f9-dq6t2" podStartSLOduration=1.656609233 
podStartE2EDuration="4.678811003s" podCreationTimestamp="2026-05-11 20:56:35 +0000 UTC" firstStartedPulling="2026-05-11 20:56:35.919896548 +0000 UTC m=+341.131839678" lastFinishedPulling="2026-05-11 20:56:38.942098309 +0000 UTC m=+344.154041448" observedRunningTime="2026-05-11 20:56:39.677015818 +0000 UTC m=+344.888958969" watchObservedRunningTime="2026-05-11 20:56:39.678811003 +0000 UTC m=+344.890754160" May 11 20:58:29.847218 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:29.847188 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc"] May 11 20:58:29.850700 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:29.850678 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc" May 11 20:58:29.854435 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:29.853908 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-mb652\"" May 11 20:58:29.854435 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:29.854338 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" May 11 20:58:29.854632 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:29.854501 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" May 11 20:58:29.864081 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:29.864056 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc"] May 11 20:58:29.975304 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:29.975261 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7c7v\" (UniqueName: 
\"kubernetes.io/projected/9e356d54-4e4b-4339-aa8a-fb1879d4ffd6-kube-api-access-p7c7v\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-z7hgc\" (UID: \"9e356d54-4e4b-4339-aa8a-fb1879d4ffd6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc" May 11 20:58:29.975517 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:29.975328 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e356d54-4e4b-4339-aa8a-fb1879d4ffd6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-z7hgc\" (UID: \"9e356d54-4e4b-4339-aa8a-fb1879d4ffd6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc" May 11 20:58:30.076426 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:30.076371 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7c7v\" (UniqueName: \"kubernetes.io/projected/9e356d54-4e4b-4339-aa8a-fb1879d4ffd6-kube-api-access-p7c7v\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-z7hgc\" (UID: \"9e356d54-4e4b-4339-aa8a-fb1879d4ffd6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc" May 11 20:58:30.076603 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:30.076442 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e356d54-4e4b-4339-aa8a-fb1879d4ffd6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-z7hgc\" (UID: \"9e356d54-4e4b-4339-aa8a-fb1879d4ffd6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc" May 11 20:58:30.076811 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:30.076790 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/9e356d54-4e4b-4339-aa8a-fb1879d4ffd6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-z7hgc\" (UID: \"9e356d54-4e4b-4339-aa8a-fb1879d4ffd6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc" May 11 20:58:30.087733 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:30.087698 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7c7v\" (UniqueName: \"kubernetes.io/projected/9e356d54-4e4b-4339-aa8a-fb1879d4ffd6-kube-api-access-p7c7v\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-z7hgc\" (UID: \"9e356d54-4e4b-4339-aa8a-fb1879d4ffd6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc" May 11 20:58:30.170046 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:30.169969 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc" May 11 20:58:30.302530 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:30.302496 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc"] May 11 20:58:30.307287 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:58:30.307247 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e356d54_4e4b_4339_aa8a_fb1879d4ffd6.slice/crio-ee75902a54372096570e1078d7198ae8546461485d0207ed56333452d468e016 WatchSource:0}: Error finding container ee75902a54372096570e1078d7198ae8546461485d0207ed56333452d468e016: Status 404 returned error can't find the container with id ee75902a54372096570e1078d7198ae8546461485d0207ed56333452d468e016 May 11 20:58:31.038912 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:31.038868 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc" 
event={"ID":"9e356d54-4e4b-4339-aa8a-fb1879d4ffd6","Type":"ContainerStarted","Data":"ee75902a54372096570e1078d7198ae8546461485d0207ed56333452d468e016"} May 11 20:58:35.055490 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:35.055446 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc" event={"ID":"9e356d54-4e4b-4339-aa8a-fb1879d4ffd6","Type":"ContainerStarted","Data":"7b7bdf9e60dd78e8c3a64fe9988ad672aab008142ebf3d64dfcde1689d989d2e"} May 11 20:58:35.055901 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:35.055680 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc" May 11 20:58:35.074812 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:35.074757 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc" podStartSLOduration=2.082576845 podStartE2EDuration="6.074741656s" podCreationTimestamp="2026-05-11 20:58:29 +0000 UTC" firstStartedPulling="2026-05-11 20:58:30.309337454 +0000 UTC m=+455.521280598" lastFinishedPulling="2026-05-11 20:58:34.301502278 +0000 UTC m=+459.513445409" observedRunningTime="2026-05-11 20:58:35.074546314 +0000 UTC m=+460.286489465" watchObservedRunningTime="2026-05-11 20:58:35.074741656 +0000 UTC m=+460.286684806" May 11 20:58:42.239121 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:42.239084 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc"] May 11 20:58:42.239566 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:42.239308 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc" podUID="9e356d54-4e4b-4339-aa8a-fb1879d4ffd6" containerName="manager" 
containerID="cri-o://7b7bdf9e60dd78e8c3a64fe9988ad672aab008142ebf3d64dfcde1689d989d2e" gracePeriod=10 May 11 20:58:42.241088 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:42.241064 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc" May 11 20:58:42.475889 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:42.475865 2555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc" May 11 20:58:42.592169 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:42.592083 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e356d54-4e4b-4339-aa8a-fb1879d4ffd6-extensions-socket-volume\") pod \"9e356d54-4e4b-4339-aa8a-fb1879d4ffd6\" (UID: \"9e356d54-4e4b-4339-aa8a-fb1879d4ffd6\") " May 11 20:58:42.592336 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:42.592181 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7c7v\" (UniqueName: \"kubernetes.io/projected/9e356d54-4e4b-4339-aa8a-fb1879d4ffd6-kube-api-access-p7c7v\") pod \"9e356d54-4e4b-4339-aa8a-fb1879d4ffd6\" (UID: \"9e356d54-4e4b-4339-aa8a-fb1879d4ffd6\") " May 11 20:58:42.592609 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:42.592585 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e356d54-4e4b-4339-aa8a-fb1879d4ffd6-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "9e356d54-4e4b-4339-aa8a-fb1879d4ffd6" (UID: "9e356d54-4e4b-4339-aa8a-fb1879d4ffd6"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:58:42.594447 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:42.594427 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e356d54-4e4b-4339-aa8a-fb1879d4ffd6-kube-api-access-p7c7v" (OuterVolumeSpecName: "kube-api-access-p7c7v") pod "9e356d54-4e4b-4339-aa8a-fb1879d4ffd6" (UID: "9e356d54-4e4b-4339-aa8a-fb1879d4ffd6"). InnerVolumeSpecName "kube-api-access-p7c7v". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:58:42.693364 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:42.693323 2555 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e356d54-4e4b-4339-aa8a-fb1879d4ffd6-extensions-socket-volume\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:58:42.693364 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:42.693358 2555 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p7c7v\" (UniqueName: \"kubernetes.io/projected/9e356d54-4e4b-4339-aa8a-fb1879d4ffd6-kube-api-access-p7c7v\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:58:43.082673 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:43.082632 2555 generic.go:358] "Generic (PLEG): container finished" podID="9e356d54-4e4b-4339-aa8a-fb1879d4ffd6" containerID="7b7bdf9e60dd78e8c3a64fe9988ad672aab008142ebf3d64dfcde1689d989d2e" exitCode=0 May 11 20:58:43.082885 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:43.082704 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc" event={"ID":"9e356d54-4e4b-4339-aa8a-fb1879d4ffd6","Type":"ContainerDied","Data":"7b7bdf9e60dd78e8c3a64fe9988ad672aab008142ebf3d64dfcde1689d989d2e"} May 11 20:58:43.082885 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:43.082728 2555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc" May 11 20:58:43.082885 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:43.082752 2555 scope.go:117] "RemoveContainer" containerID="7b7bdf9e60dd78e8c3a64fe9988ad672aab008142ebf3d64dfcde1689d989d2e" May 11 20:58:43.082885 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:43.082740 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc" event={"ID":"9e356d54-4e4b-4339-aa8a-fb1879d4ffd6","Type":"ContainerDied","Data":"ee75902a54372096570e1078d7198ae8546461485d0207ed56333452d468e016"} May 11 20:58:43.091390 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:43.091368 2555 scope.go:117] "RemoveContainer" containerID="7b7bdf9e60dd78e8c3a64fe9988ad672aab008142ebf3d64dfcde1689d989d2e" May 11 20:58:43.091702 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:58:43.091683 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b7bdf9e60dd78e8c3a64fe9988ad672aab008142ebf3d64dfcde1689d989d2e\": container with ID starting with 7b7bdf9e60dd78e8c3a64fe9988ad672aab008142ebf3d64dfcde1689d989d2e not found: ID does not exist" containerID="7b7bdf9e60dd78e8c3a64fe9988ad672aab008142ebf3d64dfcde1689d989d2e" May 11 20:58:43.091752 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:43.091712 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b7bdf9e60dd78e8c3a64fe9988ad672aab008142ebf3d64dfcde1689d989d2e"} err="failed to get container status \"7b7bdf9e60dd78e8c3a64fe9988ad672aab008142ebf3d64dfcde1689d989d2e\": rpc error: code = NotFound desc = could not find container \"7b7bdf9e60dd78e8c3a64fe9988ad672aab008142ebf3d64dfcde1689d989d2e\": container with ID starting with 7b7bdf9e60dd78e8c3a64fe9988ad672aab008142ebf3d64dfcde1689d989d2e not found: ID does not exist" May 11 20:58:43.105506 ip-10-0-133-205 
kubenswrapper[2555]: I0511 20:58:43.105474 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc"] May 11 20:58:43.121666 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:43.121630 2555 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z7hgc"] May 11 20:58:43.360316 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:58:43.360223 2555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e356d54-4e4b-4339-aa8a-fb1879d4ffd6" path="/var/lib/kubelet/pods/9e356d54-4e4b-4339-aa8a-fb1879d4ffd6/volumes" May 11 20:59:03.325878 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:03.325835 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6b9mm"] May 11 20:59:03.326483 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:03.326464 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e356d54-4e4b-4339-aa8a-fb1879d4ffd6" containerName="manager" May 11 20:59:03.326582 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:03.326486 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e356d54-4e4b-4339-aa8a-fb1879d4ffd6" containerName="manager" May 11 20:59:03.326635 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:03.326596 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e356d54-4e4b-4339-aa8a-fb1879d4ffd6" containerName="manager" May 11 20:59:03.331370 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:03.331343 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-6b9mm" May 11 20:59:03.334073 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:03.334027 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" May 11 20:59:03.334073 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:03.334031 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" May 11 20:59:03.335103 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:03.335053 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" May 11 20:59:03.335240 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:03.335058 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-8ksxs\"" May 11 20:59:03.340639 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:03.340615 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6b9mm"] May 11 20:59:03.363691 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:03.363662 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2a09633b-58e4-4cbc-8849-3b250044889a-config-file\") pod \"limitador-limitador-7d549b5b-6b9mm\" (UID: \"2a09633b-58e4-4cbc-8849-3b250044889a\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6b9mm" May 11 20:59:03.363860 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:03.363697 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92rzs\" (UniqueName: \"kubernetes.io/projected/2a09633b-58e4-4cbc-8849-3b250044889a-kube-api-access-92rzs\") pod \"limitador-limitador-7d549b5b-6b9mm\" (UID: \"2a09633b-58e4-4cbc-8849-3b250044889a\") " 
pod="kuadrant-system/limitador-limitador-7d549b5b-6b9mm" May 11 20:59:03.434559 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:03.434523 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6b9mm"] May 11 20:59:03.464159 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:03.464134 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2a09633b-58e4-4cbc-8849-3b250044889a-config-file\") pod \"limitador-limitador-7d549b5b-6b9mm\" (UID: \"2a09633b-58e4-4cbc-8849-3b250044889a\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6b9mm" May 11 20:59:03.464300 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:03.464166 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92rzs\" (UniqueName: \"kubernetes.io/projected/2a09633b-58e4-4cbc-8849-3b250044889a-kube-api-access-92rzs\") pod \"limitador-limitador-7d549b5b-6b9mm\" (UID: \"2a09633b-58e4-4cbc-8849-3b250044889a\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6b9mm" May 11 20:59:03.464778 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:03.464758 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2a09633b-58e4-4cbc-8849-3b250044889a-config-file\") pod \"limitador-limitador-7d549b5b-6b9mm\" (UID: \"2a09633b-58e4-4cbc-8849-3b250044889a\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6b9mm" May 11 20:59:03.472551 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:03.472527 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92rzs\" (UniqueName: \"kubernetes.io/projected/2a09633b-58e4-4cbc-8849-3b250044889a-kube-api-access-92rzs\") pod \"limitador-limitador-7d549b5b-6b9mm\" (UID: \"2a09633b-58e4-4cbc-8849-3b250044889a\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6b9mm" May 11 20:59:03.643881 ip-10-0-133-205 
kubenswrapper[2555]: I0511 20:59:03.643789 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-6b9mm"
May 11 20:59:03.777820 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:03.777796 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6b9mm"]
May 11 20:59:03.780005 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:59:03.779981 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a09633b_58e4_4cbc_8849_3b250044889a.slice/crio-b9bec8724582cedc238bb49aac3e6eaa03985f492d650843833e9f018671728b WatchSource:0}: Error finding container b9bec8724582cedc238bb49aac3e6eaa03985f492d650843833e9f018671728b: Status 404 returned error can't find the container with id b9bec8724582cedc238bb49aac3e6eaa03985f492d650843833e9f018671728b
May 11 20:59:04.075389 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:04.075352 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-5nfb6"]
May 11 20:59:04.079962 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:04.079940 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-5nfb6"
May 11 20:59:04.082884 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:04.082860 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-k2glf\""
May 11 20:59:04.083202 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:04.083182 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-5nfb6"]
May 11 20:59:04.156135 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:04.156100 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-6b9mm" event={"ID":"2a09633b-58e4-4cbc-8849-3b250044889a","Type":"ContainerStarted","Data":"b9bec8724582cedc238bb49aac3e6eaa03985f492d650843833e9f018671728b"}
May 11 20:59:04.172696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:04.172663 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxtfk\" (UniqueName: \"kubernetes.io/projected/f8a8acc2-b247-404c-a03a-4dc0af8f2912-kube-api-access-wxtfk\") pod \"authorino-7498df8756-5nfb6\" (UID: \"f8a8acc2-b247-404c-a03a-4dc0af8f2912\") " pod="kuadrant-system/authorino-7498df8756-5nfb6"
May 11 20:59:04.274093 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:04.274063 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxtfk\" (UniqueName: \"kubernetes.io/projected/f8a8acc2-b247-404c-a03a-4dc0af8f2912-kube-api-access-wxtfk\") pod \"authorino-7498df8756-5nfb6\" (UID: \"f8a8acc2-b247-404c-a03a-4dc0af8f2912\") " pod="kuadrant-system/authorino-7498df8756-5nfb6"
May 11 20:59:04.284029 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:04.284000 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxtfk\" (UniqueName: \"kubernetes.io/projected/f8a8acc2-b247-404c-a03a-4dc0af8f2912-kube-api-access-wxtfk\") pod \"authorino-7498df8756-5nfb6\" (UID: \"f8a8acc2-b247-404c-a03a-4dc0af8f2912\") " pod="kuadrant-system/authorino-7498df8756-5nfb6"
May 11 20:59:04.391849 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:04.391759 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-5nfb6"
May 11 20:59:04.561837 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:04.561794 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-5nfb6"]
May 11 20:59:05.162733 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:05.162659 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-5nfb6" event={"ID":"f8a8acc2-b247-404c-a03a-4dc0af8f2912","Type":"ContainerStarted","Data":"2434646d9309046361b7cb64a25f60ca01a32a7bdef643a12bb82159157ddc7d"}
May 11 20:59:09.185545 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:09.185503 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-6b9mm" event={"ID":"2a09633b-58e4-4cbc-8849-3b250044889a","Type":"ContainerStarted","Data":"0497466cb1a29138eeeeadcb2ca31d474eb127cc699511c91abc6e62f73049de"}
May 11 20:59:09.185971 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:09.185625 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-6b9mm"
May 11 20:59:09.186894 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:09.186869 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-5nfb6" event={"ID":"f8a8acc2-b247-404c-a03a-4dc0af8f2912","Type":"ContainerStarted","Data":"f3f12a36fa73cac6685016f3521906d27616b8d733512c4d3ae96cc6fc837a09"}
May 11 20:59:09.203642 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:09.203594 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-6b9mm" podStartSLOduration=1.289599229 podStartE2EDuration="6.203581049s" podCreationTimestamp="2026-05-11 20:59:03 +0000 UTC" firstStartedPulling="2026-05-11 20:59:03.781770055 +0000 UTC m=+488.993713183" lastFinishedPulling="2026-05-11 20:59:08.695751872 +0000 UTC m=+493.907695003" observedRunningTime="2026-05-11 20:59:09.202581895 +0000 UTC m=+494.414525046" watchObservedRunningTime="2026-05-11 20:59:09.203581049 +0000 UTC m=+494.415524198"
May 11 20:59:09.218131 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:09.218083 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-5nfb6" podStartSLOduration=1.090527413 podStartE2EDuration="5.218070139s" podCreationTimestamp="2026-05-11 20:59:04 +0000 UTC" firstStartedPulling="2026-05-11 20:59:04.569282222 +0000 UTC m=+489.781225357" lastFinishedPulling="2026-05-11 20:59:08.696824956 +0000 UTC m=+493.908768083" observedRunningTime="2026-05-11 20:59:09.217552353 +0000 UTC m=+494.429495514" watchObservedRunningTime="2026-05-11 20:59:09.218070139 +0000 UTC m=+494.430013288"
May 11 20:59:18.382241 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:18.382190 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6b9mm"]
May 11 20:59:18.382727 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:18.382492 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-6b9mm" podUID="2a09633b-58e4-4cbc-8849-3b250044889a" containerName="limitador" containerID="cri-o://0497466cb1a29138eeeeadcb2ca31d474eb127cc699511c91abc6e62f73049de" gracePeriod=30
May 11 20:59:18.383166 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:18.383136 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-6b9mm"
May 11 20:59:19.221611 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:19.221573 2555 generic.go:358] "Generic (PLEG): container finished" podID="2a09633b-58e4-4cbc-8849-3b250044889a" containerID="0497466cb1a29138eeeeadcb2ca31d474eb127cc699511c91abc6e62f73049de" exitCode=0
May 11 20:59:19.221769 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:19.221647 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-6b9mm" event={"ID":"2a09633b-58e4-4cbc-8849-3b250044889a","Type":"ContainerDied","Data":"0497466cb1a29138eeeeadcb2ca31d474eb127cc699511c91abc6e62f73049de"}
May 11 20:59:19.321674 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:19.321652 2555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-6b9mm"
May 11 20:59:19.410687 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:19.410660 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92rzs\" (UniqueName: \"kubernetes.io/projected/2a09633b-58e4-4cbc-8849-3b250044889a-kube-api-access-92rzs\") pod \"2a09633b-58e4-4cbc-8849-3b250044889a\" (UID: \"2a09633b-58e4-4cbc-8849-3b250044889a\") "
May 11 20:59:19.411114 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:19.410693 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2a09633b-58e4-4cbc-8849-3b250044889a-config-file\") pod \"2a09633b-58e4-4cbc-8849-3b250044889a\" (UID: \"2a09633b-58e4-4cbc-8849-3b250044889a\") "
May 11 20:59:19.411181 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:19.411125 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a09633b-58e4-4cbc-8849-3b250044889a-config-file" (OuterVolumeSpecName: "config-file") pod "2a09633b-58e4-4cbc-8849-3b250044889a" (UID: "2a09633b-58e4-4cbc-8849-3b250044889a"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
May 11 20:59:19.412839 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:19.412807 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a09633b-58e4-4cbc-8849-3b250044889a-kube-api-access-92rzs" (OuterVolumeSpecName: "kube-api-access-92rzs") pod "2a09633b-58e4-4cbc-8849-3b250044889a" (UID: "2a09633b-58e4-4cbc-8849-3b250044889a"). InnerVolumeSpecName "kube-api-access-92rzs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
May 11 20:59:19.511521 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:19.511482 2555 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-92rzs\" (UniqueName: \"kubernetes.io/projected/2a09633b-58e4-4cbc-8849-3b250044889a-kube-api-access-92rzs\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\""
May 11 20:59:19.511521 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:19.511519 2555 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2a09633b-58e4-4cbc-8849-3b250044889a-config-file\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\""
May 11 20:59:20.225719 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:20.225687 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-6b9mm" event={"ID":"2a09633b-58e4-4cbc-8849-3b250044889a","Type":"ContainerDied","Data":"b9bec8724582cedc238bb49aac3e6eaa03985f492d650843833e9f018671728b"}
May 11 20:59:20.225719 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:20.225705 2555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-6b9mm"
May 11 20:59:20.225980 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:20.225731 2555 scope.go:117] "RemoveContainer" containerID="0497466cb1a29138eeeeadcb2ca31d474eb127cc699511c91abc6e62f73049de"
May 11 20:59:20.247148 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:20.247111 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6b9mm"]
May 11 20:59:20.252312 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:20.252289 2555 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6b9mm"]
May 11 20:59:21.359783 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:21.359746 2555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a09633b-58e4-4cbc-8849-3b250044889a" path="/var/lib/kubelet/pods/2a09633b-58e4-4cbc-8849-3b250044889a/volumes"
May 11 20:59:24.073670 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:24.073637 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-68c4fbbd6f-cgqs2"]
May 11 20:59:24.074196 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:24.074177 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a09633b-58e4-4cbc-8849-3b250044889a" containerName="limitador"
May 11 20:59:24.074233 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:24.074202 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a09633b-58e4-4cbc-8849-3b250044889a" containerName="limitador"
May 11 20:59:24.074342 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:24.074330 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a09633b-58e4-4cbc-8849-3b250044889a" containerName="limitador"
May 11 20:59:24.078989 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:24.078967 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-68c4fbbd6f-cgqs2"
May 11 20:59:24.082011 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:24.081991 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
May 11 20:59:24.082716 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:24.082695 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-p57s9\""
May 11 20:59:24.084447 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:24.084392 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-68c4fbbd6f-cgqs2"]
May 11 20:59:24.157891 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:24.157860 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwh7s\" (UniqueName: \"kubernetes.io/projected/d1afc4aa-b707-4933-87fe-0f4f14784bc4-kube-api-access-mwh7s\") pod \"postgres-68c4fbbd6f-cgqs2\" (UID: \"d1afc4aa-b707-4933-87fe-0f4f14784bc4\") " pod="opendatahub/postgres-68c4fbbd6f-cgqs2"
May 11 20:59:24.158043 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:24.157911 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d1afc4aa-b707-4933-87fe-0f4f14784bc4-data\") pod \"postgres-68c4fbbd6f-cgqs2\" (UID: \"d1afc4aa-b707-4933-87fe-0f4f14784bc4\") " pod="opendatahub/postgres-68c4fbbd6f-cgqs2"
May 11 20:59:24.258731 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:24.258698 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwh7s\" (UniqueName: \"kubernetes.io/projected/d1afc4aa-b707-4933-87fe-0f4f14784bc4-kube-api-access-mwh7s\") pod \"postgres-68c4fbbd6f-cgqs2\" (UID: \"d1afc4aa-b707-4933-87fe-0f4f14784bc4\") " pod="opendatahub/postgres-68c4fbbd6f-cgqs2"
May 11 20:59:24.258907 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:24.258746 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d1afc4aa-b707-4933-87fe-0f4f14784bc4-data\") pod \"postgres-68c4fbbd6f-cgqs2\" (UID: \"d1afc4aa-b707-4933-87fe-0f4f14784bc4\") " pod="opendatahub/postgres-68c4fbbd6f-cgqs2"
May 11 20:59:24.259057 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:24.259041 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d1afc4aa-b707-4933-87fe-0f4f14784bc4-data\") pod \"postgres-68c4fbbd6f-cgqs2\" (UID: \"d1afc4aa-b707-4933-87fe-0f4f14784bc4\") " pod="opendatahub/postgres-68c4fbbd6f-cgqs2"
May 11 20:59:24.267871 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:24.267849 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwh7s\" (UniqueName: \"kubernetes.io/projected/d1afc4aa-b707-4933-87fe-0f4f14784bc4-kube-api-access-mwh7s\") pod \"postgres-68c4fbbd6f-cgqs2\" (UID: \"d1afc4aa-b707-4933-87fe-0f4f14784bc4\") " pod="opendatahub/postgres-68c4fbbd6f-cgqs2"
May 11 20:59:24.390582 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:24.390491 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-68c4fbbd6f-cgqs2"
May 11 20:59:24.518957 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:24.518922 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-68c4fbbd6f-cgqs2"]
May 11 20:59:24.521748 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:59:24.521715 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1afc4aa_b707_4933_87fe_0f4f14784bc4.slice/crio-803ad2535171cf165876beba5156f45f22b0080ff5640be104656c3e7f31cb9c WatchSource:0}: Error finding container 803ad2535171cf165876beba5156f45f22b0080ff5640be104656c3e7f31cb9c: Status 404 returned error can't find the container with id 803ad2535171cf165876beba5156f45f22b0080ff5640be104656c3e7f31cb9c
May 11 20:59:25.246323 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:25.246160 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-68c4fbbd6f-cgqs2" event={"ID":"d1afc4aa-b707-4933-87fe-0f4f14784bc4","Type":"ContainerStarted","Data":"803ad2535171cf165876beba5156f45f22b0080ff5640be104656c3e7f31cb9c"}
May 11 20:59:30.264936 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:30.264890 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-68c4fbbd6f-cgqs2" event={"ID":"d1afc4aa-b707-4933-87fe-0f4f14784bc4","Type":"ContainerStarted","Data":"28bc5dded54e1a2c72d08b68170d8ba2b07b98edb2e9a7b04e26876bd1417240"}
May 11 20:59:30.265317 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:30.264973 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-68c4fbbd6f-cgqs2"
May 11 20:59:30.280155 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:30.280101 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-68c4fbbd6f-cgqs2" podStartSLOduration=1.583315367 podStartE2EDuration="6.280082843s" podCreationTimestamp="2026-05-11 20:59:24 +0000 UTC" firstStartedPulling="2026-05-11 20:59:24.523090891 +0000 UTC m=+509.735034019" lastFinishedPulling="2026-05-11 20:59:29.219858366 +0000 UTC m=+514.431801495" observedRunningTime="2026-05-11 20:59:30.279625683 +0000 UTC m=+515.491568834" watchObservedRunningTime="2026-05-11 20:59:30.280082843 +0000 UTC m=+515.492025992"
May 11 20:59:36.297425 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:36.297384 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-68c4fbbd6f-cgqs2"
May 11 20:59:38.223315 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:38.223280 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-5nfb6"]
May 11 20:59:38.223701 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:38.223535 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-5nfb6" podUID="f8a8acc2-b247-404c-a03a-4dc0af8f2912" containerName="authorino" containerID="cri-o://f3f12a36fa73cac6685016f3521906d27616b8d733512c4d3ae96cc6fc837a09" gracePeriod=30
May 11 20:59:38.468051 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:38.468022 2555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-5nfb6"
May 11 20:59:38.490832 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:38.490796 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxtfk\" (UniqueName: \"kubernetes.io/projected/f8a8acc2-b247-404c-a03a-4dc0af8f2912-kube-api-access-wxtfk\") pod \"f8a8acc2-b247-404c-a03a-4dc0af8f2912\" (UID: \"f8a8acc2-b247-404c-a03a-4dc0af8f2912\") "
May 11 20:59:38.493172 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:38.493146 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8a8acc2-b247-404c-a03a-4dc0af8f2912-kube-api-access-wxtfk" (OuterVolumeSpecName: "kube-api-access-wxtfk") pod "f8a8acc2-b247-404c-a03a-4dc0af8f2912" (UID: "f8a8acc2-b247-404c-a03a-4dc0af8f2912"). InnerVolumeSpecName "kube-api-access-wxtfk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
May 11 20:59:38.592098 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:38.592067 2555 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wxtfk\" (UniqueName: \"kubernetes.io/projected/f8a8acc2-b247-404c-a03a-4dc0af8f2912-kube-api-access-wxtfk\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\""
May 11 20:59:38.982080 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:38.982049 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5cbb6cc4d-sdsg8"]
May 11 20:59:38.982546 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:38.982523 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8a8acc2-b247-404c-a03a-4dc0af8f2912" containerName="authorino"
May 11 20:59:38.982546 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:38.982544 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a8acc2-b247-404c-a03a-4dc0af8f2912" containerName="authorino"
May 11 20:59:38.982699 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:38.982633 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8a8acc2-b247-404c-a03a-4dc0af8f2912" containerName="authorino"
May 11 20:59:39.028877 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.028838 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5cbb6cc4d-sdsg8"]
May 11 20:59:39.029027 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.028981 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5cbb6cc4d-sdsg8"
May 11 20:59:39.032093 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.032064 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
May 11 20:59:39.032214 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.032117 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-dqcql\""
May 11 20:59:39.096468 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.096396 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzwnn\" (UniqueName: \"kubernetes.io/projected/73890efd-5785-40c4-8990-87198e16c3fd-kube-api-access-vzwnn\") pod \"maas-controller-5cbb6cc4d-sdsg8\" (UID: \"73890efd-5785-40c4-8990-87198e16c3fd\") " pod="opendatahub/maas-controller-5cbb6cc4d-sdsg8"
May 11 20:59:39.128130 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.128099 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5f7f7d78f-22s9h"]
May 11 20:59:39.140475 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.140445 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5f7f7d78f-22s9h"]
May 11 20:59:39.140629 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.140552 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5f7f7d78f-22s9h"
May 11 20:59:39.196984 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.196951 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx6kf\" (UniqueName: \"kubernetes.io/projected/eb077f18-6425-47c0-abd5-c96d1a05d940-kube-api-access-bx6kf\") pod \"maas-controller-5f7f7d78f-22s9h\" (UID: \"eb077f18-6425-47c0-abd5-c96d1a05d940\") " pod="opendatahub/maas-controller-5f7f7d78f-22s9h"
May 11 20:59:39.197132 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.197017 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzwnn\" (UniqueName: \"kubernetes.io/projected/73890efd-5785-40c4-8990-87198e16c3fd-kube-api-access-vzwnn\") pod \"maas-controller-5cbb6cc4d-sdsg8\" (UID: \"73890efd-5785-40c4-8990-87198e16c3fd\") " pod="opendatahub/maas-controller-5cbb6cc4d-sdsg8"
May 11 20:59:39.205570 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.205539 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzwnn\" (UniqueName: \"kubernetes.io/projected/73890efd-5785-40c4-8990-87198e16c3fd-kube-api-access-vzwnn\") pod \"maas-controller-5cbb6cc4d-sdsg8\" (UID: \"73890efd-5785-40c4-8990-87198e16c3fd\") " pod="opendatahub/maas-controller-5cbb6cc4d-sdsg8"
May 11 20:59:39.294886 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.294808 2555 generic.go:358] "Generic (PLEG): container finished" podID="f8a8acc2-b247-404c-a03a-4dc0af8f2912" containerID="f3f12a36fa73cac6685016f3521906d27616b8d733512c4d3ae96cc6fc837a09" exitCode=0
May 11 20:59:39.294886 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.294860 2555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-5nfb6"
May 11 20:59:39.295335 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.294889 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-5nfb6" event={"ID":"f8a8acc2-b247-404c-a03a-4dc0af8f2912","Type":"ContainerDied","Data":"f3f12a36fa73cac6685016f3521906d27616b8d733512c4d3ae96cc6fc837a09"}
May 11 20:59:39.295335 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.294927 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-5nfb6" event={"ID":"f8a8acc2-b247-404c-a03a-4dc0af8f2912","Type":"ContainerDied","Data":"2434646d9309046361b7cb64a25f60ca01a32a7bdef643a12bb82159157ddc7d"}
May 11 20:59:39.295335 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.294945 2555 scope.go:117] "RemoveContainer" containerID="f3f12a36fa73cac6685016f3521906d27616b8d733512c4d3ae96cc6fc837a09"
May 11 20:59:39.297655 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.297623 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bx6kf\" (UniqueName: \"kubernetes.io/projected/eb077f18-6425-47c0-abd5-c96d1a05d940-kube-api-access-bx6kf\") pod \"maas-controller-5f7f7d78f-22s9h\" (UID: \"eb077f18-6425-47c0-abd5-c96d1a05d940\") " pod="opendatahub/maas-controller-5f7f7d78f-22s9h"
May 11 20:59:39.303603 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.303572 2555 scope.go:117] "RemoveContainer" containerID="f3f12a36fa73cac6685016f3521906d27616b8d733512c4d3ae96cc6fc837a09"
May 11 20:59:39.303868 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:59:39.303847 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3f12a36fa73cac6685016f3521906d27616b8d733512c4d3ae96cc6fc837a09\": container with ID starting with f3f12a36fa73cac6685016f3521906d27616b8d733512c4d3ae96cc6fc837a09 not found: ID does not exist" containerID="f3f12a36fa73cac6685016f3521906d27616b8d733512c4d3ae96cc6fc837a09"
May 11 20:59:39.303951 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.303875 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3f12a36fa73cac6685016f3521906d27616b8d733512c4d3ae96cc6fc837a09"} err="failed to get container status \"f3f12a36fa73cac6685016f3521906d27616b8d733512c4d3ae96cc6fc837a09\": rpc error: code = NotFound desc = could not find container \"f3f12a36fa73cac6685016f3521906d27616b8d733512c4d3ae96cc6fc837a09\": container with ID starting with f3f12a36fa73cac6685016f3521906d27616b8d733512c4d3ae96cc6fc837a09 not found: ID does not exist"
May 11 20:59:39.317462 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.317437 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx6kf\" (UniqueName: \"kubernetes.io/projected/eb077f18-6425-47c0-abd5-c96d1a05d940-kube-api-access-bx6kf\") pod \"maas-controller-5f7f7d78f-22s9h\" (UID: \"eb077f18-6425-47c0-abd5-c96d1a05d940\") " pod="opendatahub/maas-controller-5f7f7d78f-22s9h"
May 11 20:59:39.328866 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.328844 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-5nfb6"]
May 11 20:59:39.339937 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.339913 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5cbb6cc4d-sdsg8"]
May 11 20:59:39.340065 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.339976 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5cbb6cc4d-sdsg8"
May 11 20:59:39.342329 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.342312 2555 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-5nfb6"]
May 11 20:59:39.360898 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.360866 2555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8a8acc2-b247-404c-a03a-4dc0af8f2912" path="/var/lib/kubelet/pods/f8a8acc2-b247-404c-a03a-4dc0af8f2912/volumes"
May 11 20:59:39.394792 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.394759 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-65cb7fdc6c-2jd58"]
May 11 20:59:39.436458 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.436428 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-65cb7fdc6c-2jd58"]
May 11 20:59:39.436803 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.436651 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-65cb7fdc6c-2jd58"
May 11 20:59:39.454639 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.454614 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5f7f7d78f-22s9h"
May 11 20:59:39.506577 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.505943 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk48q\" (UniqueName: \"kubernetes.io/projected/5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1-kube-api-access-kk48q\") pod \"maas-controller-65cb7fdc6c-2jd58\" (UID: \"5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1\") " pod="opendatahub/maas-controller-65cb7fdc6c-2jd58"
May 11 20:59:39.509750 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.509721 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5cbb6cc4d-sdsg8"]
May 11 20:59:39.607482 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.607435 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kk48q\" (UniqueName: \"kubernetes.io/projected/5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1-kube-api-access-kk48q\") pod \"maas-controller-65cb7fdc6c-2jd58\" (UID: \"5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1\") " pod="opendatahub/maas-controller-65cb7fdc6c-2jd58"
May 11 20:59:39.617018 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.616988 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk48q\" (UniqueName: \"kubernetes.io/projected/5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1-kube-api-access-kk48q\") pod \"maas-controller-65cb7fdc6c-2jd58\" (UID: \"5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1\") " pod="opendatahub/maas-controller-65cb7fdc6c-2jd58"
May 11 20:59:39.624072 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.624048 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5f7f7d78f-22s9h"]
May 11 20:59:39.626175 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:59:39.626146 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb077f18_6425_47c0_abd5_c96d1a05d940.slice/crio-c7726a9dc70d12c84ca765d20ef03f69b4c815bfcb0d0fc642187b174cfa3dca WatchSource:0}: Error finding container c7726a9dc70d12c84ca765d20ef03f69b4c815bfcb0d0fc642187b174cfa3dca: Status 404 returned error can't find the container with id c7726a9dc70d12c84ca765d20ef03f69b4c815bfcb0d0fc642187b174cfa3dca
May 11 20:59:39.758885 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.758852 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-65cb7fdc6c-2jd58"
May 11 20:59:39.888951 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:39.888928 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-65cb7fdc6c-2jd58"]
May 11 20:59:39.891097 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:59:39.891062 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5981cb3a_1a83_4c16_8eb5_cfac19b8aeb1.slice/crio-67f17e6e02152378e8c4417cbc978edfd25b043370105efe403b1cfe641b93d6 WatchSource:0}: Error finding container 67f17e6e02152378e8c4417cbc978edfd25b043370105efe403b1cfe641b93d6: Status 404 returned error can't find the container with id 67f17e6e02152378e8c4417cbc978edfd25b043370105efe403b1cfe641b93d6
May 11 20:59:40.301672 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:40.301597 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-65cb7fdc6c-2jd58" event={"ID":"5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1","Type":"ContainerStarted","Data":"67f17e6e02152378e8c4417cbc978edfd25b043370105efe403b1cfe641b93d6"}
May 11 20:59:40.304296 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:40.304242 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5cbb6cc4d-sdsg8" event={"ID":"73890efd-5785-40c4-8990-87198e16c3fd","Type":"ContainerStarted","Data":"dc4299aa0cf5e89896c152693798b5b08f2b4aadbf18883a0adc6e390202e4fb"}
May 11 20:59:40.305578 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:40.305541 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5f7f7d78f-22s9h" event={"ID":"eb077f18-6425-47c0-abd5-c96d1a05d940","Type":"ContainerStarted","Data":"c7726a9dc70d12c84ca765d20ef03f69b4c815bfcb0d0fc642187b174cfa3dca"}
May 11 20:59:44.325209 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:44.325177 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-65cb7fdc6c-2jd58" event={"ID":"5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1","Type":"ContainerStarted","Data":"c8e921110fe608fbf820c49c74039ded4d3fe2c7c7a45616e2411647ab3e4376"}
May 11 20:59:44.325707 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:44.325301 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-65cb7fdc6c-2jd58"
May 11 20:59:44.326666 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:44.326641 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5cbb6cc4d-sdsg8" event={"ID":"73890efd-5785-40c4-8990-87198e16c3fd","Type":"ContainerStarted","Data":"d225ede5450545785806620abb5e57a70f3d871cb789df49f197be4083522761"}
May 11 20:59:44.326769 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:44.326687 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-5cbb6cc4d-sdsg8" podUID="73890efd-5785-40c4-8990-87198e16c3fd" containerName="manager" containerID="cri-o://d225ede5450545785806620abb5e57a70f3d871cb789df49f197be4083522761" gracePeriod=600
May 11 20:59:44.326769 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:44.326703 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-5cbb6cc4d-sdsg8"
May 11 20:59:44.327972 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:44.327946 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5f7f7d78f-22s9h" event={"ID":"eb077f18-6425-47c0-abd5-c96d1a05d940","Type":"ContainerStarted","Data":"4f1309c4eaa79e7f2d3906de95c283cfc51ab9a50bb5972edd37d3546d1f6a95"}
May 11 20:59:44.328072 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:44.328059 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-5f7f7d78f-22s9h"
May 11 20:59:44.342366 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:44.342318 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-65cb7fdc6c-2jd58" podStartSLOduration=1.605024923 podStartE2EDuration="5.342304799s" podCreationTimestamp="2026-05-11 20:59:39 +0000 UTC" firstStartedPulling="2026-05-11 20:59:39.892365239 +0000 UTC m=+525.104308367" lastFinishedPulling="2026-05-11 20:59:43.629645107 +0000 UTC m=+528.841588243" observedRunningTime="2026-05-11 20:59:44.341207216 +0000 UTC m=+529.553150366" watchObservedRunningTime="2026-05-11 20:59:44.342304799 +0000 UTC m=+529.554247939"
May 11 20:59:44.357730 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:44.357680 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-5cbb6cc4d-sdsg8" podStartSLOduration=2.577073147 podStartE2EDuration="6.357663671s" podCreationTimestamp="2026-05-11 20:59:38 +0000 UTC" firstStartedPulling="2026-05-11 20:59:39.507518355 +0000 UTC m=+524.719461487" lastFinishedPulling="2026-05-11 20:59:43.288108875 +0000 UTC m=+528.500052011" observedRunningTime="2026-05-11 20:59:44.355746049 +0000 UTC m=+529.567689229" watchObservedRunningTime="2026-05-11 20:59:44.357663671 +0000 UTC m=+529.569606821"
May 11 20:59:44.372427 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:44.372355 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-5f7f7d78f-22s9h" podStartSLOduration=1.378765328 podStartE2EDuration="5.372336122s" podCreationTimestamp="2026-05-11 20:59:39 +0000 UTC" firstStartedPulling="2026-05-11 20:59:39.627533899 +0000 UTC m=+524.839477028" lastFinishedPulling="2026-05-11 20:59:43.621104678 +0000 UTC m=+528.833047822" observedRunningTime="2026-05-11 20:59:44.370917422 +0000 UTC m=+529.582860572" watchObservedRunningTime="2026-05-11 20:59:44.372336122 +0000 UTC m=+529.584279271"
May 11 20:59:44.565583 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:44.565560 2555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5cbb6cc4d-sdsg8"
May 11 20:59:44.653755 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:44.653659 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzwnn\" (UniqueName: \"kubernetes.io/projected/73890efd-5785-40c4-8990-87198e16c3fd-kube-api-access-vzwnn\") pod \"73890efd-5785-40c4-8990-87198e16c3fd\" (UID: \"73890efd-5785-40c4-8990-87198e16c3fd\") "
May 11 20:59:44.656027 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:44.655996 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73890efd-5785-40c4-8990-87198e16c3fd-kube-api-access-vzwnn" (OuterVolumeSpecName: "kube-api-access-vzwnn") pod "73890efd-5785-40c4-8990-87198e16c3fd" (UID: "73890efd-5785-40c4-8990-87198e16c3fd"). InnerVolumeSpecName "kube-api-access-vzwnn".
PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:59:44.754280 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:44.754242 2555 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vzwnn\" (UniqueName: \"kubernetes.io/projected/73890efd-5785-40c4-8990-87198e16c3fd-kube-api-access-vzwnn\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:59:45.332365 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:45.332326 2555 generic.go:358] "Generic (PLEG): container finished" podID="73890efd-5785-40c4-8990-87198e16c3fd" containerID="d225ede5450545785806620abb5e57a70f3d871cb789df49f197be4083522761" exitCode=0 May 11 20:59:45.332867 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:45.332398 2555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5cbb6cc4d-sdsg8" May 11 20:59:45.332867 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:45.332422 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5cbb6cc4d-sdsg8" event={"ID":"73890efd-5785-40c4-8990-87198e16c3fd","Type":"ContainerDied","Data":"d225ede5450545785806620abb5e57a70f3d871cb789df49f197be4083522761"} May 11 20:59:45.332867 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:45.332478 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5cbb6cc4d-sdsg8" event={"ID":"73890efd-5785-40c4-8990-87198e16c3fd","Type":"ContainerDied","Data":"dc4299aa0cf5e89896c152693798b5b08f2b4aadbf18883a0adc6e390202e4fb"} May 11 20:59:45.332867 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:45.332505 2555 scope.go:117] "RemoveContainer" containerID="d225ede5450545785806620abb5e57a70f3d871cb789df49f197be4083522761" May 11 20:59:45.340720 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:45.340704 2555 scope.go:117] "RemoveContainer" containerID="d225ede5450545785806620abb5e57a70f3d871cb789df49f197be4083522761" May 11 20:59:45.340958 ip-10-0-133-205 kubenswrapper[2555]: 
E0511 20:59:45.340940 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d225ede5450545785806620abb5e57a70f3d871cb789df49f197be4083522761\": container with ID starting with d225ede5450545785806620abb5e57a70f3d871cb789df49f197be4083522761 not found: ID does not exist" containerID="d225ede5450545785806620abb5e57a70f3d871cb789df49f197be4083522761" May 11 20:59:45.341002 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:45.340966 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d225ede5450545785806620abb5e57a70f3d871cb789df49f197be4083522761"} err="failed to get container status \"d225ede5450545785806620abb5e57a70f3d871cb789df49f197be4083522761\": rpc error: code = NotFound desc = could not find container \"d225ede5450545785806620abb5e57a70f3d871cb789df49f197be4083522761\": container with ID starting with d225ede5450545785806620abb5e57a70f3d871cb789df49f197be4083522761 not found: ID does not exist" May 11 20:59:45.354241 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:45.354214 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5cbb6cc4d-sdsg8"] May 11 20:59:45.366069 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:45.366038 2555 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-5cbb6cc4d-sdsg8"] May 11 20:59:47.361203 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:47.361167 2555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73890efd-5785-40c4-8990-87198e16c3fd" path="/var/lib/kubelet/pods/73890efd-5785-40c4-8990-87198e16c3fd/volumes" May 11 20:59:55.337899 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:55.337862 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-65cb7fdc6c-2jd58" May 11 20:59:55.338508 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:55.338478 2555 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-5f7f7d78f-22s9h" May 11 20:59:55.386635 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:55.386601 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5f7f7d78f-22s9h"] May 11 20:59:55.386847 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:55.386825 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-5f7f7d78f-22s9h" podUID="eb077f18-6425-47c0-abd5-c96d1a05d940" containerName="manager" containerID="cri-o://4f1309c4eaa79e7f2d3906de95c283cfc51ab9a50bb5972edd37d3546d1f6a95" gracePeriod=600 May 11 20:59:55.565531 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:55.565498 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-99bcb79cc-2wrnm"] May 11 20:59:55.566142 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:55.566120 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73890efd-5785-40c4-8990-87198e16c3fd" containerName="manager" May 11 20:59:55.566142 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:55.566141 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="73890efd-5785-40c4-8990-87198e16c3fd" containerName="manager" May 11 20:59:55.566326 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:55.566252 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="73890efd-5785-40c4-8990-87198e16c3fd" containerName="manager" May 11 20:59:55.569157 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:55.569137 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-99bcb79cc-2wrnm" May 11 20:59:55.578991 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:55.578963 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-99bcb79cc-2wrnm"] May 11 20:59:55.633696 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:55.633676 2555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5f7f7d78f-22s9h" May 11 20:59:55.657868 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:55.657839 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpd6z\" (UniqueName: \"kubernetes.io/projected/93e0aecf-1f29-4e9e-b614-144118dc567c-kube-api-access-dpd6z\") pod \"maas-controller-99bcb79cc-2wrnm\" (UID: \"93e0aecf-1f29-4e9e-b614-144118dc567c\") " pod="opendatahub/maas-controller-99bcb79cc-2wrnm" May 11 20:59:55.759131 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:55.759092 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx6kf\" (UniqueName: \"kubernetes.io/projected/eb077f18-6425-47c0-abd5-c96d1a05d940-kube-api-access-bx6kf\") pod \"eb077f18-6425-47c0-abd5-c96d1a05d940\" (UID: \"eb077f18-6425-47c0-abd5-c96d1a05d940\") " May 11 20:59:55.759318 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:55.759304 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpd6z\" (UniqueName: \"kubernetes.io/projected/93e0aecf-1f29-4e9e-b614-144118dc567c-kube-api-access-dpd6z\") pod \"maas-controller-99bcb79cc-2wrnm\" (UID: \"93e0aecf-1f29-4e9e-b614-144118dc567c\") " pod="opendatahub/maas-controller-99bcb79cc-2wrnm" May 11 20:59:55.761359 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:55.761327 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb077f18-6425-47c0-abd5-c96d1a05d940-kube-api-access-bx6kf" 
(OuterVolumeSpecName: "kube-api-access-bx6kf") pod "eb077f18-6425-47c0-abd5-c96d1a05d940" (UID: "eb077f18-6425-47c0-abd5-c96d1a05d940"). InnerVolumeSpecName "kube-api-access-bx6kf". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:59:55.768816 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:55.768796 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpd6z\" (UniqueName: \"kubernetes.io/projected/93e0aecf-1f29-4e9e-b614-144118dc567c-kube-api-access-dpd6z\") pod \"maas-controller-99bcb79cc-2wrnm\" (UID: \"93e0aecf-1f29-4e9e-b614-144118dc567c\") " pod="opendatahub/maas-controller-99bcb79cc-2wrnm" May 11 20:59:55.860693 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:55.860606 2555 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bx6kf\" (UniqueName: \"kubernetes.io/projected/eb077f18-6425-47c0-abd5-c96d1a05d940-kube-api-access-bx6kf\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 20:59:55.881485 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:55.881458 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-99bcb79cc-2wrnm" May 11 20:59:56.013909 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:56.013879 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-99bcb79cc-2wrnm"] May 11 20:59:56.015929 ip-10-0-133-205 kubenswrapper[2555]: W0511 20:59:56.015890 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93e0aecf_1f29_4e9e_b614_144118dc567c.slice/crio-9ccaf44dc308f1331e8385417fa03d99c1fa74ea4e947454698c9e4554ff245b WatchSource:0}: Error finding container 9ccaf44dc308f1331e8385417fa03d99c1fa74ea4e947454698c9e4554ff245b: Status 404 returned error can't find the container with id 9ccaf44dc308f1331e8385417fa03d99c1fa74ea4e947454698c9e4554ff245b May 11 20:59:56.372586 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:56.372551 2555 generic.go:358] "Generic (PLEG): container finished" podID="eb077f18-6425-47c0-abd5-c96d1a05d940" containerID="4f1309c4eaa79e7f2d3906de95c283cfc51ab9a50bb5972edd37d3546d1f6a95" exitCode=0 May 11 20:59:56.372979 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:56.372618 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5f7f7d78f-22s9h" event={"ID":"eb077f18-6425-47c0-abd5-c96d1a05d940","Type":"ContainerDied","Data":"4f1309c4eaa79e7f2d3906de95c283cfc51ab9a50bb5972edd37d3546d1f6a95"} May 11 20:59:56.372979 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:56.372645 2555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5f7f7d78f-22s9h" May 11 20:59:56.372979 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:56.372659 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5f7f7d78f-22s9h" event={"ID":"eb077f18-6425-47c0-abd5-c96d1a05d940","Type":"ContainerDied","Data":"c7726a9dc70d12c84ca765d20ef03f69b4c815bfcb0d0fc642187b174cfa3dca"} May 11 20:59:56.372979 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:56.372676 2555 scope.go:117] "RemoveContainer" containerID="4f1309c4eaa79e7f2d3906de95c283cfc51ab9a50bb5972edd37d3546d1f6a95" May 11 20:59:56.373940 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:56.373900 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-99bcb79cc-2wrnm" event={"ID":"93e0aecf-1f29-4e9e-b614-144118dc567c","Type":"ContainerStarted","Data":"9ccaf44dc308f1331e8385417fa03d99c1fa74ea4e947454698c9e4554ff245b"} May 11 20:59:56.381392 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:56.381367 2555 scope.go:117] "RemoveContainer" containerID="4f1309c4eaa79e7f2d3906de95c283cfc51ab9a50bb5972edd37d3546d1f6a95" May 11 20:59:56.381760 ip-10-0-133-205 kubenswrapper[2555]: E0511 20:59:56.381735 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f1309c4eaa79e7f2d3906de95c283cfc51ab9a50bb5972edd37d3546d1f6a95\": container with ID starting with 4f1309c4eaa79e7f2d3906de95c283cfc51ab9a50bb5972edd37d3546d1f6a95 not found: ID does not exist" containerID="4f1309c4eaa79e7f2d3906de95c283cfc51ab9a50bb5972edd37d3546d1f6a95" May 11 20:59:56.381848 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:56.381767 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f1309c4eaa79e7f2d3906de95c283cfc51ab9a50bb5972edd37d3546d1f6a95"} err="failed to get container status \"4f1309c4eaa79e7f2d3906de95c283cfc51ab9a50bb5972edd37d3546d1f6a95\": rpc error: 
code = NotFound desc = could not find container \"4f1309c4eaa79e7f2d3906de95c283cfc51ab9a50bb5972edd37d3546d1f6a95\": container with ID starting with 4f1309c4eaa79e7f2d3906de95c283cfc51ab9a50bb5972edd37d3546d1f6a95 not found: ID does not exist" May 11 20:59:56.395306 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:56.395277 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5f7f7d78f-22s9h"] May 11 20:59:56.396874 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:56.396850 2555 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-5f7f7d78f-22s9h"] May 11 20:59:57.361103 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:57.361071 2555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb077f18-6425-47c0-abd5-c96d1a05d940" path="/var/lib/kubelet/pods/eb077f18-6425-47c0-abd5-c96d1a05d940/volumes" May 11 20:59:57.379474 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:57.379435 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-99bcb79cc-2wrnm" event={"ID":"93e0aecf-1f29-4e9e-b614-144118dc567c","Type":"ContainerStarted","Data":"a50666fd91877981b203fe19ff9167a91efcbc7c267fcaebc6009c67e5818a34"} May 11 20:59:57.379906 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:57.379577 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-99bcb79cc-2wrnm" May 11 20:59:57.396482 ip-10-0-133-205 kubenswrapper[2555]: I0511 20:59:57.396433 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-99bcb79cc-2wrnm" podStartSLOduration=1.979430147 podStartE2EDuration="2.396396554s" podCreationTimestamp="2026-05-11 20:59:55 +0000 UTC" firstStartedPulling="2026-05-11 20:59:56.017231078 +0000 UTC m=+541.229174210" lastFinishedPulling="2026-05-11 20:59:56.434197489 +0000 UTC m=+541.646140617" observedRunningTime="2026-05-11 20:59:57.39589524 +0000 UTC m=+542.607838405" 
watchObservedRunningTime="2026-05-11 20:59:57.396396554 +0000 UTC m=+542.608339704" May 11 21:00:04.013658 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.013619 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-65d9855595-vsf4j"] May 11 21:00:04.014048 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.014004 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb077f18-6425-47c0-abd5-c96d1a05d940" containerName="manager" May 11 21:00:04.014048 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.014014 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb077f18-6425-47c0-abd5-c96d1a05d940" containerName="manager" May 11 21:00:04.014125 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.014090 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="eb077f18-6425-47c0-abd5-c96d1a05d940" containerName="manager" May 11 21:00:04.017382 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.017362 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-65d9855595-vsf4j" May 11 21:00:04.020017 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.019997 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-s4nmf\"" May 11 21:00:04.020086 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.019997 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" May 11 21:00:04.029151 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.029130 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-65d9855595-vsf4j"] May 11 21:00:04.136289 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.136260 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f875d3e1-9420-4b54-9c40-8e2dd93ac6d4-maas-api-tls\") pod \"maas-api-65d9855595-vsf4j\" (UID: \"f875d3e1-9420-4b54-9c40-8e2dd93ac6d4\") " pod="opendatahub/maas-api-65d9855595-vsf4j" May 11 21:00:04.136484 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.136324 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl9dz\" (UniqueName: \"kubernetes.io/projected/f875d3e1-9420-4b54-9c40-8e2dd93ac6d4-kube-api-access-kl9dz\") pod \"maas-api-65d9855595-vsf4j\" (UID: \"f875d3e1-9420-4b54-9c40-8e2dd93ac6d4\") " pod="opendatahub/maas-api-65d9855595-vsf4j" May 11 21:00:04.237487 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.237449 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kl9dz\" (UniqueName: \"kubernetes.io/projected/f875d3e1-9420-4b54-9c40-8e2dd93ac6d4-kube-api-access-kl9dz\") pod \"maas-api-65d9855595-vsf4j\" (UID: \"f875d3e1-9420-4b54-9c40-8e2dd93ac6d4\") " pod="opendatahub/maas-api-65d9855595-vsf4j" May 11 21:00:04.237670 ip-10-0-133-205 kubenswrapper[2555]: I0511 
21:00:04.237537 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f875d3e1-9420-4b54-9c40-8e2dd93ac6d4-maas-api-tls\") pod \"maas-api-65d9855595-vsf4j\" (UID: \"f875d3e1-9420-4b54-9c40-8e2dd93ac6d4\") " pod="opendatahub/maas-api-65d9855595-vsf4j" May 11 21:00:04.240265 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.240240 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f875d3e1-9420-4b54-9c40-8e2dd93ac6d4-maas-api-tls\") pod \"maas-api-65d9855595-vsf4j\" (UID: \"f875d3e1-9420-4b54-9c40-8e2dd93ac6d4\") " pod="opendatahub/maas-api-65d9855595-vsf4j" May 11 21:00:04.254765 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.254736 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl9dz\" (UniqueName: \"kubernetes.io/projected/f875d3e1-9420-4b54-9c40-8e2dd93ac6d4-kube-api-access-kl9dz\") pod \"maas-api-65d9855595-vsf4j\" (UID: \"f875d3e1-9420-4b54-9c40-8e2dd93ac6d4\") " pod="opendatahub/maas-api-65d9855595-vsf4j" May 11 21:00:04.328165 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.328088 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-65d9855595-vsf4j" May 11 21:00:04.423239 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.423208 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/payload-processing-58b6c8fdc7-5jz94"] May 11 21:00:04.428510 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.428485 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/payload-processing-58b6c8fdc7-5jz94" May 11 21:00:04.431325 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.431298 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"payload-processing-plugins\"" May 11 21:00:04.434871 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.434841 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/payload-processing-58b6c8fdc7-5jz94"] May 11 21:00:04.473330 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.473305 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-65d9855595-vsf4j"] May 11 21:00:04.475375 ip-10-0-133-205 kubenswrapper[2555]: W0511 21:00:04.475349 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf875d3e1_9420_4b54_9c40_8e2dd93ac6d4.slice/crio-37b01a8f61205ea2e7d13519a1525e98e18d4117bdc7b5a4f2c9df20cdc67192 WatchSource:0}: Error finding container 37b01a8f61205ea2e7d13519a1525e98e18d4117bdc7b5a4f2c9df20cdc67192: Status 404 returned error can't find the container with id 37b01a8f61205ea2e7d13519a1525e98e18d4117bdc7b5a4f2c9df20cdc67192 May 11 21:00:04.540127 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.540085 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gwdt\" (UniqueName: \"kubernetes.io/projected/236ec381-63c0-4952-9f9a-c19f3959f837-kube-api-access-5gwdt\") pod \"payload-processing-58b6c8fdc7-5jz94\" (UID: \"236ec381-63c0-4952-9f9a-c19f3959f837\") " pod="openshift-ingress/payload-processing-58b6c8fdc7-5jz94" May 11 21:00:04.641567 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.641473 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gwdt\" (UniqueName: \"kubernetes.io/projected/236ec381-63c0-4952-9f9a-c19f3959f837-kube-api-access-5gwdt\") pod 
\"payload-processing-58b6c8fdc7-5jz94\" (UID: \"236ec381-63c0-4952-9f9a-c19f3959f837\") " pod="openshift-ingress/payload-processing-58b6c8fdc7-5jz94" May 11 21:00:04.655113 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.655076 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gwdt\" (UniqueName: \"kubernetes.io/projected/236ec381-63c0-4952-9f9a-c19f3959f837-kube-api-access-5gwdt\") pod \"payload-processing-58b6c8fdc7-5jz94\" (UID: \"236ec381-63c0-4952-9f9a-c19f3959f837\") " pod="openshift-ingress/payload-processing-58b6c8fdc7-5jz94" May 11 21:00:04.741006 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.740970 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/payload-processing-58b6c8fdc7-5jz94" May 11 21:00:04.874247 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:04.874222 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/payload-processing-58b6c8fdc7-5jz94"] May 11 21:00:04.876489 ip-10-0-133-205 kubenswrapper[2555]: W0511 21:00:04.876456 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod236ec381_63c0_4952_9f9a_c19f3959f837.slice/crio-476dae7b9c64ee56664a7836b25e31d4100584f3017beb5e31f5a0455cd3e56b WatchSource:0}: Error finding container 476dae7b9c64ee56664a7836b25e31d4100584f3017beb5e31f5a0455cd3e56b: Status 404 returned error can't find the container with id 476dae7b9c64ee56664a7836b25e31d4100584f3017beb5e31f5a0455cd3e56b May 11 21:00:05.414558 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:05.414519 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/payload-processing-58b6c8fdc7-5jz94" event={"ID":"236ec381-63c0-4952-9f9a-c19f3959f837","Type":"ContainerStarted","Data":"476dae7b9c64ee56664a7836b25e31d4100584f3017beb5e31f5a0455cd3e56b"} May 11 21:00:05.415765 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:05.415731 2555 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-65d9855595-vsf4j" event={"ID":"f875d3e1-9420-4b54-9c40-8e2dd93ac6d4","Type":"ContainerStarted","Data":"37b01a8f61205ea2e7d13519a1525e98e18d4117bdc7b5a4f2c9df20cdc67192"} May 11 21:00:06.421848 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:06.421798 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-65d9855595-vsf4j" event={"ID":"f875d3e1-9420-4b54-9c40-8e2dd93ac6d4","Type":"ContainerStarted","Data":"c7c3a477d3a7378c2180cd4def6f1a3912d838d330c476fd5a60cf536beded43"} May 11 21:00:06.422310 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:06.422061 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-65d9855595-vsf4j" May 11 21:00:06.439361 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:06.439297 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-65d9855595-vsf4j" podStartSLOduration=2.231490997 podStartE2EDuration="3.439278148s" podCreationTimestamp="2026-05-11 21:00:03 +0000 UTC" firstStartedPulling="2026-05-11 21:00:04.476736775 +0000 UTC m=+549.688679908" lastFinishedPulling="2026-05-11 21:00:05.684523929 +0000 UTC m=+550.896467059" observedRunningTime="2026-05-11 21:00:06.436804799 +0000 UTC m=+551.648747950" watchObservedRunningTime="2026-05-11 21:00:06.439278148 +0000 UTC m=+551.651221300" May 11 21:00:08.387920 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:08.387891 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-99bcb79cc-2wrnm" May 11 21:00:08.432021 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:08.431992 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-65cb7fdc6c-2jd58"] May 11 21:00:08.432278 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:08.432249 2555 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="opendatahub/maas-controller-65cb7fdc6c-2jd58" podUID="5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1" containerName="manager" containerID="cri-o://c8e921110fe608fbf820c49c74039ded4d3fe2c7c7a45616e2411647ab3e4376" gracePeriod=600 May 11 21:00:08.432460 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:08.432321 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/payload-processing-58b6c8fdc7-5jz94" event={"ID":"236ec381-63c0-4952-9f9a-c19f3959f837","Type":"ContainerStarted","Data":"e05287c580f3533a90adb3370408eff4351945ce377e070e98b60cfc44bab09b"} May 11 21:00:08.432460 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:08.432391 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/payload-processing-58b6c8fdc7-5jz94" May 11 21:00:08.459699 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:08.459653 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/payload-processing-58b6c8fdc7-5jz94" podStartSLOduration=1.84614532 podStartE2EDuration="4.459638772s" podCreationTimestamp="2026-05-11 21:00:04 +0000 UTC" firstStartedPulling="2026-05-11 21:00:04.87826299 +0000 UTC m=+550.090206117" lastFinishedPulling="2026-05-11 21:00:07.491756439 +0000 UTC m=+552.703699569" observedRunningTime="2026-05-11 21:00:08.458349428 +0000 UTC m=+553.670292575" watchObservedRunningTime="2026-05-11 21:00:08.459638772 +0000 UTC m=+553.671581922" May 11 21:00:08.679383 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:08.679354 2555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-65cb7fdc6c-2jd58" May 11 21:00:08.682598 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:08.682564 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk48q\" (UniqueName: \"kubernetes.io/projected/5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1-kube-api-access-kk48q\") pod \"5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1\" (UID: \"5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1\") " May 11 21:00:08.684741 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:08.684707 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1-kube-api-access-kk48q" (OuterVolumeSpecName: "kube-api-access-kk48q") pod "5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1" (UID: "5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1"). InnerVolumeSpecName "kube-api-access-kk48q". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 21:00:08.783192 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:08.783147 2555 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kk48q\" (UniqueName: \"kubernetes.io/projected/5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1-kube-api-access-kk48q\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 21:00:09.436390 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:09.436352 2555 generic.go:358] "Generic (PLEG): container finished" podID="5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1" containerID="c8e921110fe608fbf820c49c74039ded4d3fe2c7c7a45616e2411647ab3e4376" exitCode=0 May 11 21:00:09.436884 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:09.436434 2555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-65cb7fdc6c-2jd58" May 11 21:00:09.436884 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:09.436445 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-65cb7fdc6c-2jd58" event={"ID":"5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1","Type":"ContainerDied","Data":"c8e921110fe608fbf820c49c74039ded4d3fe2c7c7a45616e2411647ab3e4376"} May 11 21:00:09.436884 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:09.436491 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-65cb7fdc6c-2jd58" event={"ID":"5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1","Type":"ContainerDied","Data":"67f17e6e02152378e8c4417cbc978edfd25b043370105efe403b1cfe641b93d6"} May 11 21:00:09.436884 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:09.436515 2555 scope.go:117] "RemoveContainer" containerID="c8e921110fe608fbf820c49c74039ded4d3fe2c7c7a45616e2411647ab3e4376" May 11 21:00:09.445231 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:09.445213 2555 scope.go:117] "RemoveContainer" containerID="c8e921110fe608fbf820c49c74039ded4d3fe2c7c7a45616e2411647ab3e4376" May 11 21:00:09.445535 ip-10-0-133-205 kubenswrapper[2555]: E0511 21:00:09.445508 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8e921110fe608fbf820c49c74039ded4d3fe2c7c7a45616e2411647ab3e4376\": container with ID starting with c8e921110fe608fbf820c49c74039ded4d3fe2c7c7a45616e2411647ab3e4376 not found: ID does not exist" containerID="c8e921110fe608fbf820c49c74039ded4d3fe2c7c7a45616e2411647ab3e4376" May 11 21:00:09.445596 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:09.445544 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8e921110fe608fbf820c49c74039ded4d3fe2c7c7a45616e2411647ab3e4376"} err="failed to get container status \"c8e921110fe608fbf820c49c74039ded4d3fe2c7c7a45616e2411647ab3e4376\": rpc error: 
code = NotFound desc = could not find container \"c8e921110fe608fbf820c49c74039ded4d3fe2c7c7a45616e2411647ab3e4376\": container with ID starting with c8e921110fe608fbf820c49c74039ded4d3fe2c7c7a45616e2411647ab3e4376 not found: ID does not exist" May 11 21:00:09.454198 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:09.454171 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-65cb7fdc6c-2jd58"] May 11 21:00:09.462308 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:09.460124 2555 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-65cb7fdc6c-2jd58"] May 11 21:00:11.359861 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:11.359830 2555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1" path="/var/lib/kubelet/pods/5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1/volumes" May 11 21:00:12.431815 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:12.431757 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-65d9855595-vsf4j" May 11 21:00:19.439602 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:19.439569 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/payload-processing-58b6c8fdc7-5jz94" May 11 21:00:26.432456 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.432392 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7"] May 11 21:00:26.432956 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.432940 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1" containerName="manager" May 11 21:00:26.433012 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.432959 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1" containerName="manager" May 11 21:00:26.433052 ip-10-0-133-205 kubenswrapper[2555]: I0511 
21:00:26.433036 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="5981cb3a-1a83-4c16-8eb5-cfac19b8aeb1" containerName="manager" May 11 21:00:26.439875 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.439853 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.444670 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.444645 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" May 11 21:00:26.444670 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.444661 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" May 11 21:00:26.444858 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.444717 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-cbxdf\"" May 11 21:00:26.444858 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.444645 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" May 11 21:00:26.461396 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.461368 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7"] May 11 21:00:26.535465 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.535432 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-ptcc7\" (UID: \"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.535635 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.535486 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-ptcc7\" (UID: \"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.535635 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.535520 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-ptcc7\" (UID: \"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.535635 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.535547 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jchxl\" (UniqueName: \"kubernetes.io/projected/5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a-kube-api-access-jchxl\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-ptcc7\" (UID: \"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.535735 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.535672 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-ptcc7\" (UID: \"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.535735 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.535715 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-ptcc7\" 
(UID: \"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.636359 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.636310 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-ptcc7\" (UID: \"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.636359 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.636358 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-ptcc7\" (UID: \"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.636627 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.636386 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-ptcc7\" (UID: \"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.636627 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.636454 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-ptcc7\" (UID: \"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.636627 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.636476 2555 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-ptcc7\" (UID: \"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.636627 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.636506 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jchxl\" (UniqueName: \"kubernetes.io/projected/5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a-kube-api-access-jchxl\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-ptcc7\" (UID: \"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.636817 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.636789 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-ptcc7\" (UID: \"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.636875 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.636817 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-ptcc7\" (UID: \"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.636930 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.636910 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-ptcc7\" (UID: \"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a\") " 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.638873 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.638847 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-ptcc7\" (UID: \"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.639069 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.639052 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-ptcc7\" (UID: \"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.644867 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.644847 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jchxl\" (UniqueName: \"kubernetes.io/projected/5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a-kube-api-access-jchxl\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-ptcc7\" (UID: \"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.749585 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.749545 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:26.887328 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:26.887298 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7"] May 11 21:00:26.889661 ip-10-0-133-205 kubenswrapper[2555]: W0511 21:00:26.889634 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dfe4ea7_8d30_4f67_88c6_1f3eac534c8a.slice/crio-b873f8859047bbeee7fdba1b6f2f4f8a133c1abba2b0ba0fa2a21569b38d89e5 WatchSource:0}: Error finding container b873f8859047bbeee7fdba1b6f2f4f8a133c1abba2b0ba0fa2a21569b38d89e5: Status 404 returned error can't find the container with id b873f8859047bbeee7fdba1b6f2f4f8a133c1abba2b0ba0fa2a21569b38d89e5 May 11 21:00:27.505494 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:27.505453 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" event={"ID":"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a","Type":"ContainerStarted","Data":"b873f8859047bbeee7fdba1b6f2f4f8a133c1abba2b0ba0fa2a21569b38d89e5"} May 11 21:00:29.764696 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:29.764656 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-79c9b695dc-rrv9h"] May 11 21:00:29.781165 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:29.781136 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-79c9b695dc-rrv9h"] May 11 21:00:29.781337 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:29.781197 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-79c9b695dc-rrv9h" May 11 21:00:29.869327 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:29.869292 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a9527d82-772e-4c86-9a25-65807b9c013a-maas-api-tls\") pod \"maas-api-79c9b695dc-rrv9h\" (UID: \"a9527d82-772e-4c86-9a25-65807b9c013a\") " pod="opendatahub/maas-api-79c9b695dc-rrv9h" May 11 21:00:29.869550 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:29.869346 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f2wz\" (UniqueName: \"kubernetes.io/projected/a9527d82-772e-4c86-9a25-65807b9c013a-kube-api-access-6f2wz\") pod \"maas-api-79c9b695dc-rrv9h\" (UID: \"a9527d82-772e-4c86-9a25-65807b9c013a\") " pod="opendatahub/maas-api-79c9b695dc-rrv9h" May 11 21:00:29.971115 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:29.971073 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a9527d82-772e-4c86-9a25-65807b9c013a-maas-api-tls\") pod \"maas-api-79c9b695dc-rrv9h\" (UID: \"a9527d82-772e-4c86-9a25-65807b9c013a\") " pod="opendatahub/maas-api-79c9b695dc-rrv9h" May 11 21:00:29.971298 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:29.971124 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6f2wz\" (UniqueName: \"kubernetes.io/projected/a9527d82-772e-4c86-9a25-65807b9c013a-kube-api-access-6f2wz\") pod \"maas-api-79c9b695dc-rrv9h\" (UID: \"a9527d82-772e-4c86-9a25-65807b9c013a\") " pod="opendatahub/maas-api-79c9b695dc-rrv9h" May 11 21:00:29.974210 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:29.974183 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a9527d82-772e-4c86-9a25-65807b9c013a-maas-api-tls\") 
pod \"maas-api-79c9b695dc-rrv9h\" (UID: \"a9527d82-772e-4c86-9a25-65807b9c013a\") " pod="opendatahub/maas-api-79c9b695dc-rrv9h" May 11 21:00:29.980950 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:29.980917 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f2wz\" (UniqueName: \"kubernetes.io/projected/a9527d82-772e-4c86-9a25-65807b9c013a-kube-api-access-6f2wz\") pod \"maas-api-79c9b695dc-rrv9h\" (UID: \"a9527d82-772e-4c86-9a25-65807b9c013a\") " pod="opendatahub/maas-api-79c9b695dc-rrv9h" May 11 21:00:30.095628 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:30.095543 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-79c9b695dc-rrv9h" May 11 21:00:32.104627 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:32.104602 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-79c9b695dc-rrv9h"] May 11 21:00:32.107417 ip-10-0-133-205 kubenswrapper[2555]: W0511 21:00:32.107370 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9527d82_772e_4c86_9a25_65807b9c013a.slice/crio-5b3acaf7547f3b97b71a2bb24200651ea5f26e4400884e3116d0dfc40741cc40 WatchSource:0}: Error finding container 5b3acaf7547f3b97b71a2bb24200651ea5f26e4400884e3116d0dfc40741cc40: Status 404 returned error can't find the container with id 5b3acaf7547f3b97b71a2bb24200651ea5f26e4400884e3116d0dfc40741cc40 May 11 21:00:32.528157 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:32.528113 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" event={"ID":"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a","Type":"ContainerStarted","Data":"ccb919f7c71935dd627b9a03d6b2d3ec71c142694fafdf2ec7427d082bb2eebc"} May 11 21:00:32.529275 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:32.529250 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/maas-api-79c9b695dc-rrv9h" event={"ID":"a9527d82-772e-4c86-9a25-65807b9c013a","Type":"ContainerStarted","Data":"5b3acaf7547f3b97b71a2bb24200651ea5f26e4400884e3116d0dfc40741cc40"} May 11 21:00:34.543095 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:34.543057 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-79c9b695dc-rrv9h" event={"ID":"a9527d82-772e-4c86-9a25-65807b9c013a","Type":"ContainerStarted","Data":"b52f2c97d95606c402bb8b271e3f1f986ded16b485a6455d4fb2ade7473ab42a"} May 11 21:00:34.543491 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:34.543285 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-79c9b695dc-rrv9h" May 11 21:00:34.563506 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:34.563452 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-79c9b695dc-rrv9h" podStartSLOduration=3.936462761 podStartE2EDuration="5.56342857s" podCreationTimestamp="2026-05-11 21:00:29 +0000 UTC" firstStartedPulling="2026-05-11 21:00:32.108890152 +0000 UTC m=+577.320833280" lastFinishedPulling="2026-05-11 21:00:33.735855958 +0000 UTC m=+578.947799089" observedRunningTime="2026-05-11 21:00:34.562864372 +0000 UTC m=+579.774807522" watchObservedRunningTime="2026-05-11 21:00:34.56342857 +0000 UTC m=+579.775371721" May 11 21:00:34.772446 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:34.772384 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m"] May 11 21:00:34.776468 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:34.776441 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:34.779267 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:34.779245 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" May 11 21:00:34.788607 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:34.788581 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m"] May 11 21:00:34.925221 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:34.925127 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08c7721c-d61c-485c-a11d-f7dcd8467fa1-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m\" (UID: \"08c7721c-d61c-485c-a11d-f7dcd8467fa1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:34.925221 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:34.925174 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/08c7721c-d61c-485c-a11d-f7dcd8467fa1-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m\" (UID: \"08c7721c-d61c-485c-a11d-f7dcd8467fa1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:34.925502 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:34.925252 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/08c7721c-d61c-485c-a11d-f7dcd8467fa1-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m\" (UID: \"08c7721c-d61c-485c-a11d-f7dcd8467fa1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:34.925502 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:34.925311 2555 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/08c7721c-d61c-485c-a11d-f7dcd8467fa1-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m\" (UID: \"08c7721c-d61c-485c-a11d-f7dcd8467fa1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:34.925502 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:34.925386 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/08c7721c-d61c-485c-a11d-f7dcd8467fa1-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m\" (UID: \"08c7721c-d61c-485c-a11d-f7dcd8467fa1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:34.925502 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:34.925480 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc7tc\" (UniqueName: \"kubernetes.io/projected/08c7721c-d61c-485c-a11d-f7dcd8467fa1-kube-api-access-qc7tc\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m\" (UID: \"08c7721c-d61c-485c-a11d-f7dcd8467fa1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:35.026201 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:35.026166 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/08c7721c-d61c-485c-a11d-f7dcd8467fa1-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m\" (UID: \"08c7721c-d61c-485c-a11d-f7dcd8467fa1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:35.026372 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:35.026222 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/08c7721c-d61c-485c-a11d-f7dcd8467fa1-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m\" (UID: 
\"08c7721c-d61c-485c-a11d-f7dcd8467fa1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:35.026372 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:35.026276 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/08c7721c-d61c-485c-a11d-f7dcd8467fa1-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m\" (UID: \"08c7721c-d61c-485c-a11d-f7dcd8467fa1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:35.026372 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:35.026332 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qc7tc\" (UniqueName: \"kubernetes.io/projected/08c7721c-d61c-485c-a11d-f7dcd8467fa1-kube-api-access-qc7tc\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m\" (UID: \"08c7721c-d61c-485c-a11d-f7dcd8467fa1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:35.026560 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:35.026371 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08c7721c-d61c-485c-a11d-f7dcd8467fa1-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m\" (UID: \"08c7721c-d61c-485c-a11d-f7dcd8467fa1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:35.026560 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:35.026434 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/08c7721c-d61c-485c-a11d-f7dcd8467fa1-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m\" (UID: \"08c7721c-d61c-485c-a11d-f7dcd8467fa1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:35.026857 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:35.026831 2555 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/08c7721c-d61c-485c-a11d-f7dcd8467fa1-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m\" (UID: \"08c7721c-d61c-485c-a11d-f7dcd8467fa1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:35.026960 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:35.026860 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08c7721c-d61c-485c-a11d-f7dcd8467fa1-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m\" (UID: \"08c7721c-d61c-485c-a11d-f7dcd8467fa1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:35.026960 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:35.026936 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/08c7721c-d61c-485c-a11d-f7dcd8467fa1-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m\" (UID: \"08c7721c-d61c-485c-a11d-f7dcd8467fa1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:35.028929 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:35.028910 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/08c7721c-d61c-485c-a11d-f7dcd8467fa1-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m\" (UID: \"08c7721c-d61c-485c-a11d-f7dcd8467fa1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:35.029077 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:35.029060 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/08c7721c-d61c-485c-a11d-f7dcd8467fa1-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m\" (UID: \"08c7721c-d61c-485c-a11d-f7dcd8467fa1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 
21:00:35.039433 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:35.035892 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc7tc\" (UniqueName: \"kubernetes.io/projected/08c7721c-d61c-485c-a11d-f7dcd8467fa1-kube-api-access-qc7tc\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m\" (UID: \"08c7721c-d61c-485c-a11d-f7dcd8467fa1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:35.089526 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:35.089488 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:00:35.244636 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:35.244608 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m"] May 11 21:00:35.246692 ip-10-0-133-205 kubenswrapper[2555]: W0511 21:00:35.246658 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08c7721c_d61c_485c_a11d_f7dcd8467fa1.slice/crio-ed42e63ec99f990d07253f4ea283428ba71b9c5ec35231e8b3bedef4e931a7b2 WatchSource:0}: Error finding container ed42e63ec99f990d07253f4ea283428ba71b9c5ec35231e8b3bedef4e931a7b2: Status 404 returned error can't find the container with id ed42e63ec99f990d07253f4ea283428ba71b9c5ec35231e8b3bedef4e931a7b2 May 11 21:00:35.548936 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:35.548894 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" event={"ID":"08c7721c-d61c-485c-a11d-f7dcd8467fa1","Type":"ContainerStarted","Data":"4948cdf0936a4aaa7b63a04bc6305030c9bc2558a68bcecbb897d67c1a659e2d"} May 11 21:00:35.549346 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:35.548947 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" 
event={"ID":"08c7721c-d61c-485c-a11d-f7dcd8467fa1","Type":"ContainerStarted","Data":"ed42e63ec99f990d07253f4ea283428ba71b9c5ec35231e8b3bedef4e931a7b2"} May 11 21:00:38.561298 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:38.561261 2555 generic.go:358] "Generic (PLEG): container finished" podID="5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a" containerID="ccb919f7c71935dd627b9a03d6b2d3ec71c142694fafdf2ec7427d082bb2eebc" exitCode=0 May 11 21:00:38.561701 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:38.561341 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" event={"ID":"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a","Type":"ContainerDied","Data":"ccb919f7c71935dd627b9a03d6b2d3ec71c142694fafdf2ec7427d082bb2eebc"} May 11 21:00:40.554692 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:40.554664 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-79c9b695dc-rrv9h" May 11 21:00:40.598121 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:40.598083 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-65d9855595-vsf4j"] May 11 21:00:40.598399 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:40.598374 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-65d9855595-vsf4j" podUID="f875d3e1-9420-4b54-9c40-8e2dd93ac6d4" containerName="maas-api" containerID="cri-o://c7c3a477d3a7378c2180cd4def6f1a3912d838d330c476fd5a60cf536beded43" gracePeriod=30 May 11 21:00:40.855819 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:40.855794 2555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-65d9855595-vsf4j" May 11 21:00:40.983305 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:40.983271 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f875d3e1-9420-4b54-9c40-8e2dd93ac6d4-maas-api-tls\") pod \"f875d3e1-9420-4b54-9c40-8e2dd93ac6d4\" (UID: \"f875d3e1-9420-4b54-9c40-8e2dd93ac6d4\") " May 11 21:00:40.983489 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:40.983388 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl9dz\" (UniqueName: \"kubernetes.io/projected/f875d3e1-9420-4b54-9c40-8e2dd93ac6d4-kube-api-access-kl9dz\") pod \"f875d3e1-9420-4b54-9c40-8e2dd93ac6d4\" (UID: \"f875d3e1-9420-4b54-9c40-8e2dd93ac6d4\") " May 11 21:00:40.985465 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:40.985430 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f875d3e1-9420-4b54-9c40-8e2dd93ac6d4-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "f875d3e1-9420-4b54-9c40-8e2dd93ac6d4" (UID: "f875d3e1-9420-4b54-9c40-8e2dd93ac6d4"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 21:00:40.985569 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:40.985528 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f875d3e1-9420-4b54-9c40-8e2dd93ac6d4-kube-api-access-kl9dz" (OuterVolumeSpecName: "kube-api-access-kl9dz") pod "f875d3e1-9420-4b54-9c40-8e2dd93ac6d4" (UID: "f875d3e1-9420-4b54-9c40-8e2dd93ac6d4"). InnerVolumeSpecName "kube-api-access-kl9dz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 21:00:41.084298 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:41.084241 2555 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kl9dz\" (UniqueName: \"kubernetes.io/projected/f875d3e1-9420-4b54-9c40-8e2dd93ac6d4-kube-api-access-kl9dz\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 21:00:41.084298 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:41.084288 2555 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f875d3e1-9420-4b54-9c40-8e2dd93ac6d4-maas-api-tls\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 21:00:41.579300 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:41.579264 2555 generic.go:358] "Generic (PLEG): container finished" podID="08c7721c-d61c-485c-a11d-f7dcd8467fa1" containerID="4948cdf0936a4aaa7b63a04bc6305030c9bc2558a68bcecbb897d67c1a659e2d" exitCode=0 May 11 21:00:41.579749 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:41.579339 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" event={"ID":"08c7721c-d61c-485c-a11d-f7dcd8467fa1","Type":"ContainerDied","Data":"4948cdf0936a4aaa7b63a04bc6305030c9bc2558a68bcecbb897d67c1a659e2d"} May 11 21:00:41.580603 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:41.580576 2555 generic.go:358] "Generic (PLEG): container finished" podID="f875d3e1-9420-4b54-9c40-8e2dd93ac6d4" containerID="c7c3a477d3a7378c2180cd4def6f1a3912d838d330c476fd5a60cf536beded43" exitCode=0 May 11 21:00:41.580762 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:41.580614 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-65d9855595-vsf4j" event={"ID":"f875d3e1-9420-4b54-9c40-8e2dd93ac6d4","Type":"ContainerDied","Data":"c7c3a477d3a7378c2180cd4def6f1a3912d838d330c476fd5a60cf536beded43"} May 11 21:00:41.580762 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:41.580639 2555 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-65d9855595-vsf4j" event={"ID":"f875d3e1-9420-4b54-9c40-8e2dd93ac6d4","Type":"ContainerDied","Data":"37b01a8f61205ea2e7d13519a1525e98e18d4117bdc7b5a4f2c9df20cdc67192"} May 11 21:00:41.580762 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:41.580658 2555 scope.go:117] "RemoveContainer" containerID="c7c3a477d3a7378c2180cd4def6f1a3912d838d330c476fd5a60cf536beded43" May 11 21:00:41.580762 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:41.580662 2555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-65d9855595-vsf4j" May 11 21:00:41.589012 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:41.588998 2555 scope.go:117] "RemoveContainer" containerID="c7c3a477d3a7378c2180cd4def6f1a3912d838d330c476fd5a60cf536beded43" May 11 21:00:41.589259 ip-10-0-133-205 kubenswrapper[2555]: E0511 21:00:41.589240 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7c3a477d3a7378c2180cd4def6f1a3912d838d330c476fd5a60cf536beded43\": container with ID starting with c7c3a477d3a7378c2180cd4def6f1a3912d838d330c476fd5a60cf536beded43 not found: ID does not exist" containerID="c7c3a477d3a7378c2180cd4def6f1a3912d838d330c476fd5a60cf536beded43" May 11 21:00:41.589329 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:41.589271 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c3a477d3a7378c2180cd4def6f1a3912d838d330c476fd5a60cf536beded43"} err="failed to get container status \"c7c3a477d3a7378c2180cd4def6f1a3912d838d330c476fd5a60cf536beded43\": rpc error: code = NotFound desc = could not find container \"c7c3a477d3a7378c2180cd4def6f1a3912d838d330c476fd5a60cf536beded43\": container with ID starting with c7c3a477d3a7378c2180cd4def6f1a3912d838d330c476fd5a60cf536beded43 not found: ID does not exist" May 11 21:00:41.611928 ip-10-0-133-205 
kubenswrapper[2555]: I0511 21:00:41.611899 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-65d9855595-vsf4j"] May 11 21:00:41.615230 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:41.615197 2555 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-65d9855595-vsf4j"] May 11 21:00:43.360447 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:43.360390 2555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f875d3e1-9420-4b54-9c40-8e2dd93ac6d4" path="/var/lib/kubelet/pods/f875d3e1-9420-4b54-9c40-8e2dd93ac6d4/volumes" May 11 21:00:47.671277 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.671239 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc"] May 11 21:00:47.671718 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.671675 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f875d3e1-9420-4b54-9c40-8e2dd93ac6d4" containerName="maas-api" May 11 21:00:47.671718 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.671691 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="f875d3e1-9420-4b54-9c40-8e2dd93ac6d4" containerName="maas-api" May 11 21:00:47.671838 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.671798 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="f875d3e1-9420-4b54-9c40-8e2dd93ac6d4" containerName="maas-api" May 11 21:00:47.678022 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.678004 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.680978 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.680955 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" May 11 21:00:47.688925 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.688895 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc"] May 11 21:00:47.746029 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.745986 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fc0b7649-4863-4c52-ab61-eb1b49e051a0-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc\" (UID: \"fc0b7649-4863-4c52-ab61-eb1b49e051a0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.746213 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.746106 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fc0b7649-4863-4c52-ab61-eb1b49e051a0-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc\" (UID: \"fc0b7649-4863-4c52-ab61-eb1b49e051a0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.746213 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.746147 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fc0b7649-4863-4c52-ab61-eb1b49e051a0-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc\" (UID: \"fc0b7649-4863-4c52-ab61-eb1b49e051a0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.746213 
ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.746167 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx2wd\" (UniqueName: \"kubernetes.io/projected/fc0b7649-4863-4c52-ab61-eb1b49e051a0-kube-api-access-sx2wd\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc\" (UID: \"fc0b7649-4863-4c52-ab61-eb1b49e051a0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.746213 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.746199 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fc0b7649-4863-4c52-ab61-eb1b49e051a0-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc\" (UID: \"fc0b7649-4863-4c52-ab61-eb1b49e051a0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.746444 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.746314 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc0b7649-4863-4c52-ab61-eb1b49e051a0-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc\" (UID: \"fc0b7649-4863-4c52-ab61-eb1b49e051a0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.846931 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.846884 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fc0b7649-4863-4c52-ab61-eb1b49e051a0-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc\" (UID: \"fc0b7649-4863-4c52-ab61-eb1b49e051a0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.847108 ip-10-0-133-205 kubenswrapper[2555]: I0511 
21:00:47.846944 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sx2wd\" (UniqueName: \"kubernetes.io/projected/fc0b7649-4863-4c52-ab61-eb1b49e051a0-kube-api-access-sx2wd\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc\" (UID: \"fc0b7649-4863-4c52-ab61-eb1b49e051a0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.847108 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.846984 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fc0b7649-4863-4c52-ab61-eb1b49e051a0-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc\" (UID: \"fc0b7649-4863-4c52-ab61-eb1b49e051a0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.847234 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.847130 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc0b7649-4863-4c52-ab61-eb1b49e051a0-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc\" (UID: \"fc0b7649-4863-4c52-ab61-eb1b49e051a0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.847234 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.847222 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fc0b7649-4863-4c52-ab61-eb1b49e051a0-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc\" (UID: \"fc0b7649-4863-4c52-ab61-eb1b49e051a0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.847366 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.847342 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fc0b7649-4863-4c52-ab61-eb1b49e051a0-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc\" (UID: \"fc0b7649-4863-4c52-ab61-eb1b49e051a0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.847459 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.847387 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fc0b7649-4863-4c52-ab61-eb1b49e051a0-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc\" (UID: \"fc0b7649-4863-4c52-ab61-eb1b49e051a0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.847603 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.847578 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc0b7649-4863-4c52-ab61-eb1b49e051a0-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc\" (UID: \"fc0b7649-4863-4c52-ab61-eb1b49e051a0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.847737 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.847710 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fc0b7649-4863-4c52-ab61-eb1b49e051a0-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc\" (UID: \"fc0b7649-4863-4c52-ab61-eb1b49e051a0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.849649 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.849620 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fc0b7649-4863-4c52-ab61-eb1b49e051a0-dshm\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc\" (UID: \"fc0b7649-4863-4c52-ab61-eb1b49e051a0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.849768 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.849746 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fc0b7649-4863-4c52-ab61-eb1b49e051a0-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc\" (UID: \"fc0b7649-4863-4c52-ab61-eb1b49e051a0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.857713 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.857676 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx2wd\" (UniqueName: \"kubernetes.io/projected/fc0b7649-4863-4c52-ab61-eb1b49e051a0-kube-api-access-sx2wd\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc\" (UID: \"fc0b7649-4863-4c52-ab61-eb1b49e051a0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:47.991531 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:47.991497 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:00:48.147253 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:48.147220 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc"] May 11 21:00:48.289557 ip-10-0-133-205 kubenswrapper[2555]: W0511 21:00:48.289481 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc0b7649_4863_4c52_ab61_eb1b49e051a0.slice/crio-f983bcff38292cfed4070fafa9f4a2c271e8359a2dbaef914e9eb8e534a3b630 WatchSource:0}: Error finding container f983bcff38292cfed4070fafa9f4a2c271e8359a2dbaef914e9eb8e534a3b630: Status 404 returned error can't find the container with id f983bcff38292cfed4070fafa9f4a2c271e8359a2dbaef914e9eb8e534a3b630 May 11 21:00:48.607819 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:48.607728 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" event={"ID":"fc0b7649-4863-4c52-ab61-eb1b49e051a0","Type":"ContainerStarted","Data":"8622c6ec92747810ff788f9c5c34e84fdb9a5fe2e0b1ac66f8698ed52606b62a"} May 11 21:00:48.607819 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:48.607778 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" event={"ID":"fc0b7649-4863-4c52-ab61-eb1b49e051a0","Type":"ContainerStarted","Data":"f983bcff38292cfed4070fafa9f4a2c271e8359a2dbaef914e9eb8e534a3b630"} May 11 21:00:51.622976 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:51.622940 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" event={"ID":"5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a","Type":"ContainerStarted","Data":"bd1d3cc3b3d507f56575e08c18a1a879c269f862dd60a596b3f1ce60a4e3723b"} May 11 21:00:51.623434 ip-10-0-133-205 
kubenswrapper[2555]: I0511 21:00:51.623174 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:00:51.643988 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:51.643929 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" podStartSLOduration=1.6457710479999998 podStartE2EDuration="25.643912249s" podCreationTimestamp="2026-05-11 21:00:26 +0000 UTC" firstStartedPulling="2026-05-11 21:00:26.891541554 +0000 UTC m=+572.103484695" lastFinishedPulling="2026-05-11 21:00:50.889682754 +0000 UTC m=+596.101625896" observedRunningTime="2026-05-11 21:00:51.642703261 +0000 UTC m=+596.854646413" watchObservedRunningTime="2026-05-11 21:00:51.643912249 +0000 UTC m=+596.855855399" May 11 21:00:52.628659 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:52.628619 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" event={"ID":"08c7721c-d61c-485c-a11d-f7dcd8467fa1","Type":"ContainerStarted","Data":"ddf6348925b54e684562915804e48fd8f8f48bd0bc35b09e4d6e14ffe29bbf3a"} May 11 21:00:52.715105 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:52.715053 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" podStartSLOduration=8.287996653 podStartE2EDuration="18.715038886s" podCreationTimestamp="2026-05-11 21:00:34 +0000 UTC" firstStartedPulling="2026-05-11 21:00:41.580085283 +0000 UTC m=+586.792028411" lastFinishedPulling="2026-05-11 21:00:52.007127512 +0000 UTC m=+597.219070644" observedRunningTime="2026-05-11 21:00:52.71165955 +0000 UTC m=+597.923602699" watchObservedRunningTime="2026-05-11 21:00:52.715038886 +0000 UTC m=+597.926982036" May 11 21:00:54.637457 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:54.637423 2555 generic.go:358] "Generic (PLEG): container finished" 
podID="fc0b7649-4863-4c52-ab61-eb1b49e051a0" containerID="8622c6ec92747810ff788f9c5c34e84fdb9a5fe2e0b1ac66f8698ed52606b62a" exitCode=0 May 11 21:00:54.637885 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:54.637473 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" event={"ID":"fc0b7649-4863-4c52-ab61-eb1b49e051a0","Type":"ContainerDied","Data":"8622c6ec92747810ff788f9c5c34e84fdb9a5fe2e0b1ac66f8698ed52606b62a"} May 11 21:00:55.287223 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:55.287191 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/0.log" May 11 21:00:55.287764 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:00:55.287744 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/0.log" May 11 21:01:01.666386 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:01:01.666346 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" event={"ID":"fc0b7649-4863-4c52-ab61-eb1b49e051a0","Type":"ContainerStarted","Data":"2cfe008f6bf6b49cf236e578d11989f832366efffe94b41a72c67b1ab31cf447"} May 11 21:01:01.666837 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:01:01.666587 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:01:01.687708 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:01:01.687634 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" podStartSLOduration=7.87016428 podStartE2EDuration="14.687621036s" podCreationTimestamp="2026-05-11 21:00:47 +0000 UTC" firstStartedPulling="2026-05-11 21:00:54.638084212 +0000 
UTC m=+599.850027340" lastFinishedPulling="2026-05-11 21:01:01.455540964 +0000 UTC m=+606.667484096" observedRunningTime="2026-05-11 21:01:01.685662415 +0000 UTC m=+606.897605590" watchObservedRunningTime="2026-05-11 21:01:01.687621036 +0000 UTC m=+606.899564186" May 11 21:01:02.629828 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:01:02.629789 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:01:02.642315 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:01:02.642286 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-ptcc7" May 11 21:01:02.643184 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:01:02.643166 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m" May 11 21:01:12.684782 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:01:12.684697 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc" May 11 21:02:43.499645 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:43.499561 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-99bcb79cc-2wrnm"] May 11 21:02:43.500207 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:43.499810 2555 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-99bcb79cc-2wrnm" podUID="93e0aecf-1f29-4e9e-b614-144118dc567c" containerName="manager" containerID="cri-o://a50666fd91877981b203fe19ff9167a91efcbc7c267fcaebc6009c67e5818a34" gracePeriod=600 May 11 21:02:43.749449 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:43.749423 2555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-99bcb79cc-2wrnm" May 11 21:02:43.813279 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:43.813184 2555 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpd6z\" (UniqueName: \"kubernetes.io/projected/93e0aecf-1f29-4e9e-b614-144118dc567c-kube-api-access-dpd6z\") pod \"93e0aecf-1f29-4e9e-b614-144118dc567c\" (UID: \"93e0aecf-1f29-4e9e-b614-144118dc567c\") " May 11 21:02:43.815418 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:43.815376 2555 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93e0aecf-1f29-4e9e-b614-144118dc567c-kube-api-access-dpd6z" (OuterVolumeSpecName: "kube-api-access-dpd6z") pod "93e0aecf-1f29-4e9e-b614-144118dc567c" (UID: "93e0aecf-1f29-4e9e-b614-144118dc567c"). InnerVolumeSpecName "kube-api-access-dpd6z". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 21:02:43.914862 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:43.914816 2555 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dpd6z\" (UniqueName: \"kubernetes.io/projected/93e0aecf-1f29-4e9e-b614-144118dc567c-kube-api-access-dpd6z\") on node \"ip-10-0-133-205.ec2.internal\" DevicePath \"\"" May 11 21:02:44.040737 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:44.040704 2555 generic.go:358] "Generic (PLEG): container finished" podID="93e0aecf-1f29-4e9e-b614-144118dc567c" containerID="a50666fd91877981b203fe19ff9167a91efcbc7c267fcaebc6009c67e5818a34" exitCode=0 May 11 21:02:44.040913 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:44.040790 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-99bcb79cc-2wrnm" event={"ID":"93e0aecf-1f29-4e9e-b614-144118dc567c","Type":"ContainerDied","Data":"a50666fd91877981b203fe19ff9167a91efcbc7c267fcaebc6009c67e5818a34"} May 11 21:02:44.040913 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:44.040836 2555 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="opendatahub/maas-controller-99bcb79cc-2wrnm" event={"ID":"93e0aecf-1f29-4e9e-b614-144118dc567c","Type":"ContainerDied","Data":"9ccaf44dc308f1331e8385417fa03d99c1fa74ea4e947454698c9e4554ff245b"} May 11 21:02:44.040913 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:44.040853 2555 scope.go:117] "RemoveContainer" containerID="a50666fd91877981b203fe19ff9167a91efcbc7c267fcaebc6009c67e5818a34" May 11 21:02:44.040913 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:44.040805 2555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-99bcb79cc-2wrnm" May 11 21:02:44.049672 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:44.049654 2555 scope.go:117] "RemoveContainer" containerID="a50666fd91877981b203fe19ff9167a91efcbc7c267fcaebc6009c67e5818a34" May 11 21:02:44.049946 ip-10-0-133-205 kubenswrapper[2555]: E0511 21:02:44.049926 2555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a50666fd91877981b203fe19ff9167a91efcbc7c267fcaebc6009c67e5818a34\": container with ID starting with a50666fd91877981b203fe19ff9167a91efcbc7c267fcaebc6009c67e5818a34 not found: ID does not exist" containerID="a50666fd91877981b203fe19ff9167a91efcbc7c267fcaebc6009c67e5818a34" May 11 21:02:44.050007 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:44.049961 2555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a50666fd91877981b203fe19ff9167a91efcbc7c267fcaebc6009c67e5818a34"} err="failed to get container status \"a50666fd91877981b203fe19ff9167a91efcbc7c267fcaebc6009c67e5818a34\": rpc error: code = NotFound desc = could not find container \"a50666fd91877981b203fe19ff9167a91efcbc7c267fcaebc6009c67e5818a34\": container with ID starting with a50666fd91877981b203fe19ff9167a91efcbc7c267fcaebc6009c67e5818a34 not found: ID does not exist" May 11 21:02:44.065176 ip-10-0-133-205 kubenswrapper[2555]: I0511 
21:02:44.065088 2555 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-99bcb79cc-2wrnm"] May 11 21:02:44.066924 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:44.066900 2555 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-99bcb79cc-2wrnm"] May 11 21:02:44.776232 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:44.776192 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-99bcb79cc-f9mlf"] May 11 21:02:44.776691 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:44.776674 2555 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93e0aecf-1f29-4e9e-b614-144118dc567c" containerName="manager" May 11 21:02:44.776691 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:44.776692 2555 state_mem.go:107] "Deleted CPUSet assignment" podUID="93e0aecf-1f29-4e9e-b614-144118dc567c" containerName="manager" May 11 21:02:44.776787 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:44.776770 2555 memory_manager.go:356] "RemoveStaleState removing state" podUID="93e0aecf-1f29-4e9e-b614-144118dc567c" containerName="manager" May 11 21:02:44.779450 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:44.779429 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-99bcb79cc-f9mlf" May 11 21:02:44.782102 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:44.782082 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-dqcql\"" May 11 21:02:44.788349 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:44.788323 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-99bcb79cc-f9mlf"] May 11 21:02:44.924975 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:44.924925 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9xzk\" (UniqueName: \"kubernetes.io/projected/b44c9019-ef37-4d6f-969c-f4653ba640a1-kube-api-access-n9xzk\") pod \"maas-controller-99bcb79cc-f9mlf\" (UID: \"b44c9019-ef37-4d6f-969c-f4653ba640a1\") " pod="opendatahub/maas-controller-99bcb79cc-f9mlf" May 11 21:02:45.026307 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:45.026216 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9xzk\" (UniqueName: \"kubernetes.io/projected/b44c9019-ef37-4d6f-969c-f4653ba640a1-kube-api-access-n9xzk\") pod \"maas-controller-99bcb79cc-f9mlf\" (UID: \"b44c9019-ef37-4d6f-969c-f4653ba640a1\") " pod="opendatahub/maas-controller-99bcb79cc-f9mlf" May 11 21:02:45.036024 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:45.035986 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9xzk\" (UniqueName: \"kubernetes.io/projected/b44c9019-ef37-4d6f-969c-f4653ba640a1-kube-api-access-n9xzk\") pod \"maas-controller-99bcb79cc-f9mlf\" (UID: \"b44c9019-ef37-4d6f-969c-f4653ba640a1\") " pod="opendatahub/maas-controller-99bcb79cc-f9mlf" May 11 21:02:45.091109 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:45.091068 2555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-99bcb79cc-f9mlf" May 11 21:02:45.224970 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:45.224943 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-99bcb79cc-f9mlf"] May 11 21:02:45.227565 ip-10-0-133-205 kubenswrapper[2555]: W0511 21:02:45.227538 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb44c9019_ef37_4d6f_969c_f4653ba640a1.slice/crio-622aa3ecc1dbde77e0a2cf6d724d9654219261c73694140378ff2489540d1b18 WatchSource:0}: Error finding container 622aa3ecc1dbde77e0a2cf6d724d9654219261c73694140378ff2489540d1b18: Status 404 returned error can't find the container with id 622aa3ecc1dbde77e0a2cf6d724d9654219261c73694140378ff2489540d1b18 May 11 21:02:45.229131 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:45.229113 2555 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider May 11 21:02:45.360858 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:45.360777 2555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93e0aecf-1f29-4e9e-b614-144118dc567c" path="/var/lib/kubelet/pods/93e0aecf-1f29-4e9e-b614-144118dc567c/volumes" May 11 21:02:46.050884 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:46.050844 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-99bcb79cc-f9mlf" event={"ID":"b44c9019-ef37-4d6f-969c-f4653ba640a1","Type":"ContainerStarted","Data":"c77b8eb3d90041b72ac5a5b3245e405eb5057d92bdf3fb976bea3c60a35a91e7"} May 11 21:02:46.050884 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:46.050885 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-99bcb79cc-f9mlf" event={"ID":"b44c9019-ef37-4d6f-969c-f4653ba640a1","Type":"ContainerStarted","Data":"622aa3ecc1dbde77e0a2cf6d724d9654219261c73694140378ff2489540d1b18"} May 11 21:02:46.051338 ip-10-0-133-205 
kubenswrapper[2555]: I0511 21:02:46.050929 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-99bcb79cc-f9mlf" May 11 21:02:46.069071 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:46.069007 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-99bcb79cc-f9mlf" podStartSLOduration=1.621349365 podStartE2EDuration="2.0689887s" podCreationTimestamp="2026-05-11 21:02:44 +0000 UTC" firstStartedPulling="2026-05-11 21:02:45.229246287 +0000 UTC m=+710.441189414" lastFinishedPulling="2026-05-11 21:02:45.676885622 +0000 UTC m=+710.888828749" observedRunningTime="2026-05-11 21:02:46.066924785 +0000 UTC m=+711.278867933" watchObservedRunningTime="2026-05-11 21:02:46.0689887 +0000 UTC m=+711.280932067" May 11 21:02:57.064934 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:02:57.064900 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-99bcb79cc-f9mlf" May 11 21:05:55.320869 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:05:55.320780 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/0.log" May 11 21:05:55.323188 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:05:55.323168 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/0.log" May 11 21:10:55.348341 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:10:55.348314 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/0.log" May 11 21:10:55.353083 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:10:55.353063 2555 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/0.log" May 11 21:15:55.376937 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:15:55.376905 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/0.log" May 11 21:15:55.382232 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:15:55.382212 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/0.log" May 11 21:20:55.404354 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:20:55.404322 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/0.log" May 11 21:20:55.410823 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:20:55.410804 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/0.log" May 11 21:23:43.330348 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:23:43.330278 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-79c9b695dc-rrv9h_a9527d82-772e-4c86-9a25-65807b9c013a/maas-api/0.log" May 11 21:23:43.539963 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:23:43.539919 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-99bcb79cc-f9mlf_b44c9019-ef37-4d6f-969c-f4653ba640a1/manager/0.log" May 11 21:23:43.870583 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:23:43.870549 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-755c95f69f-9ww5m_5066f764-081f-4e80-adf0-dcdd1bbd305e/manager/0.log" May 11 21:23:44.089825 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:23:44.089790 2555 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_postgres-68c4fbbd6f-cgqs2_d1afc4aa-b707-4933-87fe-0f4f14784bc4/postgres/0.log" May 11 21:23:46.742092 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:23:46.742062 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-c8c9857f9-dq6t2_e6a246bf-3141-4770-86e3-df2186820341/kube-auth-proxy/0.log" May 11 21:23:46.958012 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:23:46.957950 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_payload-processing-58b6c8fdc7-5jz94_236ec381-63c0-4952-9f9a-c19f3959f837/payload-processing/0.log" May 11 21:23:47.513747 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:23:47.513717 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m_08c7721c-d61c-485c-a11d-f7dcd8467fa1/storage-initializer/0.log" May 11 21:23:47.521253 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:23:47.521216 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-cgp5m_08c7721c-d61c-485c-a11d-f7dcd8467fa1/main/0.log" May 11 21:23:47.627179 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:23:47.627152 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-ptcc7_5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a/storage-initializer/0.log" May 11 21:23:47.633438 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:23:47.633395 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-ptcc7_5dfe4ea7-8d30-4f67-88c6-1f3eac534c8a/main/0.log" May 11 21:23:47.739577 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:23:47.739547 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc_fc0b7649-4863-4c52-ab61-eb1b49e051a0/storage-initializer/0.log" May 11 21:23:47.746640 ip-10-0-133-205 kubenswrapper[2555]: I0511 
21:23:47.746614 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9jknc_fc0b7649-4863-4c52-ab61-eb1b49e051a0/main/0.log" May 11 21:23:59.996555 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:23:59.996518 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-kdvpw_d88f86a6-6037-4ce7-badc-84d3ac37a7ce/global-pull-secret-syncer/0.log" May 11 21:24:00.082881 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:00.082825 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-5szqz_7d27adc4-933c-4d24-bd8a-bc40d4f26e8c/konnectivity-agent/0.log" May 11 21:24:00.180235 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:00.180180 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-205.ec2.internal_09c2e9e5d3d1eb649c292303ae36692a/haproxy/0.log" May 11 21:24:06.361898 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:06.361833 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_caa21e61-935d-4cc5-890f-e3ef50fadd2a/alertmanager/0.log" May 11 21:24:06.390554 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:06.390530 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_caa21e61-935d-4cc5-890f-e3ef50fadd2a/config-reloader/0.log" May 11 21:24:06.414691 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:06.414667 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_caa21e61-935d-4cc5-890f-e3ef50fadd2a/kube-rbac-proxy-web/0.log" May 11 21:24:06.435964 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:06.435943 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_caa21e61-935d-4cc5-890f-e3ef50fadd2a/kube-rbac-proxy/0.log" May 11 21:24:06.461418 ip-10-0-133-205 
kubenswrapper[2555]: I0511 21:24:06.461386 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_caa21e61-935d-4cc5-890f-e3ef50fadd2a/kube-rbac-proxy-metric/0.log" May 11 21:24:06.488039 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:06.488013 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_caa21e61-935d-4cc5-890f-e3ef50fadd2a/prom-label-proxy/0.log" May 11 21:24:06.511668 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:06.511642 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_caa21e61-935d-4cc5-890f-e3ef50fadd2a/init-config-reloader/0.log" May 11 21:24:06.591858 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:06.591831 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7764dcf94f-tp4l8_4926c338-371e-490e-b158-6a4f75127e87/kube-state-metrics/0.log" May 11 21:24:06.620349 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:06.620261 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7764dcf94f-tp4l8_4926c338-371e-490e-b158-6a4f75127e87/kube-rbac-proxy-main/0.log" May 11 21:24:06.655999 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:06.655978 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7764dcf94f-tp4l8_4926c338-371e-490e-b158-6a4f75127e87/kube-rbac-proxy-self/0.log" May 11 21:24:06.688826 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:06.688804 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-56cfb48457-79rdn_e674bf6f-247c-4cf4-9b96-017b2d3afcd1/metrics-server/0.log" May 11 21:24:06.720874 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:06.720854 2555 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_monitoring-plugin-655d88fc6c-6ncns_e2575669-fb18-4ca8-88ff-2be6e01c5fa9/monitoring-plugin/0.log" May 11 21:24:06.933458 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:06.933379 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rk44g_85d9b11d-f50f-47a9-9ac6-8338abbe2824/node-exporter/0.log" May 11 21:24:06.957640 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:06.957618 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rk44g_85d9b11d-f50f-47a9-9ac6-8338abbe2824/kube-rbac-proxy/0.log" May 11 21:24:06.982516 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:06.982489 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rk44g_85d9b11d-f50f-47a9-9ac6-8338abbe2824/init-textfile/0.log" May 11 21:24:07.012381 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.012322 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5cc99f7c99-w5lm6_c7a50aad-97d1-425c-818e-7e0b02d396b7/kube-rbac-proxy-main/0.log" May 11 21:24:07.034555 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.034529 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5cc99f7c99-w5lm6_c7a50aad-97d1-425c-818e-7e0b02d396b7/kube-rbac-proxy-self/0.log" May 11 21:24:07.055791 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.055747 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5cc99f7c99-w5lm6_c7a50aad-97d1-425c-818e-7e0b02d396b7/openshift-state-metrics/0.log" May 11 21:24:07.089826 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.089803 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9dd9a319-76ec-4631-b200-211f0174ad87/prometheus/0.log" May 11 21:24:07.120221 ip-10-0-133-205 kubenswrapper[2555]: 
I0511 21:24:07.120201 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9dd9a319-76ec-4631-b200-211f0174ad87/config-reloader/0.log" May 11 21:24:07.141719 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.141705 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9dd9a319-76ec-4631-b200-211f0174ad87/thanos-sidecar/0.log" May 11 21:24:07.163279 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.163259 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9dd9a319-76ec-4631-b200-211f0174ad87/kube-rbac-proxy-web/0.log" May 11 21:24:07.183437 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.183386 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9dd9a319-76ec-4631-b200-211f0174ad87/kube-rbac-proxy/0.log" May 11 21:24:07.203056 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.202998 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9dd9a319-76ec-4631-b200-211f0174ad87/kube-rbac-proxy-thanos/0.log" May 11 21:24:07.221881 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.221864 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9dd9a319-76ec-4631-b200-211f0174ad87/init-config-reloader/0.log" May 11 21:24:07.248292 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.248269 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-94789f4d5-885tt_b1663b6e-7a8e-43d2-92c1-04f006ccbe74/prometheus-operator/0.log" May 11 21:24:07.264610 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.264592 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-94789f4d5-885tt_b1663b6e-7a8e-43d2-92c1-04f006ccbe74/kube-rbac-proxy/0.log" May 11 21:24:07.298489 ip-10-0-133-205 
kubenswrapper[2555]: I0511 21:24:07.298462 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-64b84d7657-s5hhh_5c38a6fa-759b-4f50-a5d6-74ebbfbd6439/prometheus-operator-admission-webhook/0.log" May 11 21:24:07.324567 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.324546 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7b57cbb8cd-gjvmq_23302639-969b-4a5a-bd5b-614af2afac30/telemeter-client/0.log" May 11 21:24:07.343082 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.343059 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7b57cbb8cd-gjvmq_23302639-969b-4a5a-bd5b-614af2afac30/reload/0.log" May 11 21:24:07.362146 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.362126 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7b57cbb8cd-gjvmq_23302639-969b-4a5a-bd5b-614af2afac30/kube-rbac-proxy/0.log" May 11 21:24:07.387990 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.387970 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79c5d8458-z59p7_d3c9d508-0fdf-437d-b335-2fa5981af881/thanos-query/0.log" May 11 21:24:07.407337 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.407319 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79c5d8458-z59p7_d3c9d508-0fdf-437d-b335-2fa5981af881/kube-rbac-proxy-web/0.log" May 11 21:24:07.427543 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.427524 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79c5d8458-z59p7_d3c9d508-0fdf-437d-b335-2fa5981af881/kube-rbac-proxy/0.log" May 11 21:24:07.447108 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.447090 2555 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-79c5d8458-z59p7_d3c9d508-0fdf-437d-b335-2fa5981af881/prom-label-proxy/0.log" May 11 21:24:07.470194 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.470145 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79c5d8458-z59p7_d3c9d508-0fdf-437d-b335-2fa5981af881/kube-rbac-proxy-rules/0.log" May 11 21:24:07.493785 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:07.493767 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79c5d8458-z59p7_d3c9d508-0fdf-437d-b335-2fa5981af881/kube-rbac-proxy-metrics/0.log" May 11 21:24:08.759973 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.759937 2555 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj"] May 11 21:24:08.764042 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.764019 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 11 21:24:08.766600 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.766574 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qcz2k\"/\"openshift-service-ca.crt\"" May 11 21:24:08.767635 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.767620 2555 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qcz2k\"/\"kube-root-ca.crt\"" May 11 21:24:08.767711 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.767629 2555 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qcz2k\"/\"default-dockercfg-4h4cq\"" May 11 21:24:08.773952 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.773929 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj"] May 11 21:24:08.887923 ip-10-0-133-205 
kubenswrapper[2555]: I0511 21:24:08.887891 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt9qs\" (UniqueName: \"kubernetes.io/projected/d3d0ac85-3627-4b0d-b772-32bef76f4464-kube-api-access-vt9qs\") pod \"perf-node-gather-daemonset-jgrcj\" (UID: \"d3d0ac85-3627-4b0d-b772-32bef76f4464\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 11 21:24:08.888096 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.887929 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3d0ac85-3627-4b0d-b772-32bef76f4464-sys\") pod \"perf-node-gather-daemonset-jgrcj\" (UID: \"d3d0ac85-3627-4b0d-b772-32bef76f4464\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 11 21:24:08.888096 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.887960 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d3d0ac85-3627-4b0d-b772-32bef76f4464-podres\") pod \"perf-node-gather-daemonset-jgrcj\" (UID: \"d3d0ac85-3627-4b0d-b772-32bef76f4464\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 11 21:24:08.888096 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.887990 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3d0ac85-3627-4b0d-b772-32bef76f4464-lib-modules\") pod \"perf-node-gather-daemonset-jgrcj\" (UID: \"d3d0ac85-3627-4b0d-b772-32bef76f4464\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 11 21:24:08.888096 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.888023 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/d3d0ac85-3627-4b0d-b772-32bef76f4464-proc\") pod \"perf-node-gather-daemonset-jgrcj\" (UID: \"d3d0ac85-3627-4b0d-b772-32bef76f4464\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 11 21:24:08.989034 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.988998 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt9qs\" (UniqueName: \"kubernetes.io/projected/d3d0ac85-3627-4b0d-b772-32bef76f4464-kube-api-access-vt9qs\") pod \"perf-node-gather-daemonset-jgrcj\" (UID: \"d3d0ac85-3627-4b0d-b772-32bef76f4464\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 11 21:24:08.989034 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.989038 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3d0ac85-3627-4b0d-b772-32bef76f4464-sys\") pod \"perf-node-gather-daemonset-jgrcj\" (UID: \"d3d0ac85-3627-4b0d-b772-32bef76f4464\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 11 21:24:08.989264 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.989068 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d3d0ac85-3627-4b0d-b772-32bef76f4464-podres\") pod \"perf-node-gather-daemonset-jgrcj\" (UID: \"d3d0ac85-3627-4b0d-b772-32bef76f4464\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 11 21:24:08.989264 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.989098 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3d0ac85-3627-4b0d-b772-32bef76f4464-lib-modules\") pod \"perf-node-gather-daemonset-jgrcj\" (UID: \"d3d0ac85-3627-4b0d-b772-32bef76f4464\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 11 21:24:08.989264 ip-10-0-133-205 kubenswrapper[2555]: 
I0511 21:24:08.989130 2555 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d3d0ac85-3627-4b0d-b772-32bef76f4464-proc\") pod \"perf-node-gather-daemonset-jgrcj\" (UID: \"d3d0ac85-3627-4b0d-b772-32bef76f4464\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 11 21:24:08.989264 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.989135 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3d0ac85-3627-4b0d-b772-32bef76f4464-sys\") pod \"perf-node-gather-daemonset-jgrcj\" (UID: \"d3d0ac85-3627-4b0d-b772-32bef76f4464\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 11 21:24:08.989264 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.989191 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d3d0ac85-3627-4b0d-b772-32bef76f4464-proc\") pod \"perf-node-gather-daemonset-jgrcj\" (UID: \"d3d0ac85-3627-4b0d-b772-32bef76f4464\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 11 21:24:08.989478 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.989260 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d3d0ac85-3627-4b0d-b772-32bef76f4464-podres\") pod \"perf-node-gather-daemonset-jgrcj\" (UID: \"d3d0ac85-3627-4b0d-b772-32bef76f4464\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 11 21:24:08.989478 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.989279 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3d0ac85-3627-4b0d-b772-32bef76f4464-lib-modules\") pod \"perf-node-gather-daemonset-jgrcj\" (UID: \"d3d0ac85-3627-4b0d-b772-32bef76f4464\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 
11 21:24:08.998569 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:08.998546 2555 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt9qs\" (UniqueName: \"kubernetes.io/projected/d3d0ac85-3627-4b0d-b772-32bef76f4464-kube-api-access-vt9qs\") pod \"perf-node-gather-daemonset-jgrcj\" (UID: \"d3d0ac85-3627-4b0d-b772-32bef76f4464\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 11 21:24:09.075515 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:09.075429 2555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 11 21:24:09.206185 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:09.206160 2555 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj"] May 11 21:24:09.209468 ip-10-0-133-205 kubenswrapper[2555]: W0511 21:24:09.208546 2555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd3d0ac85_3627_4b0d_b772_32bef76f4464.slice/crio-72efb324140871f562e4977e2982b84be66fba1e72b5cd39e2874767385ea01a WatchSource:0}: Error finding container 72efb324140871f562e4977e2982b84be66fba1e72b5cd39e2874767385ea01a: Status 404 returned error can't find the container with id 72efb324140871f562e4977e2982b84be66fba1e72b5cd39e2874767385ea01a May 11 21:24:09.213094 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:09.213075 2555 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider May 11 21:24:09.692692 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:09.692654 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" event={"ID":"d3d0ac85-3627-4b0d-b772-32bef76f4464","Type":"ContainerStarted","Data":"1f13f22d15494a15a70eb21b4ebf236ec7ef94c8c26a263a990c6bb6ed90456b"} May 11 21:24:09.692692 ip-10-0-133-205 kubenswrapper[2555]: I0511 
21:24:09.692694 2555 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" event={"ID":"d3d0ac85-3627-4b0d-b772-32bef76f4464","Type":"ContainerStarted","Data":"72efb324140871f562e4977e2982b84be66fba1e72b5cd39e2874767385ea01a"} May 11 21:24:09.692902 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:09.692747 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 11 21:24:09.709690 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:09.709648 2555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" podStartSLOduration=1.70963442 podStartE2EDuration="1.70963442s" podCreationTimestamp="2026-05-11 21:24:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 21:24:09.707979308 +0000 UTC m=+1994.919922469" watchObservedRunningTime="2026-05-11 21:24:09.70963442 +0000 UTC m=+1994.921577604" May 11 21:24:11.001932 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:11.001900 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-f79n6_e9eef6b6-ed64-4eb2-aba7-77a3f6213f02/dns/0.log" May 11 21:24:11.024281 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:11.024254 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-f79n6_e9eef6b6-ed64-4eb2-aba7-77a3f6213f02/kube-rbac-proxy/0.log" May 11 21:24:11.089589 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:11.089560 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sgkpq_38fd286b-fbc6-4171-8c30-db06a4c25fe9/dns-node-resolver/0.log" May 11 21:24:11.636976 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:11.636951 2555 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-qvlmw_bfdfde03-4872-4d6b-a541-c904d768028c/node-ca/0.log" May 11 21:24:12.636030 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:12.636001 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-c8c9857f9-dq6t2_e6a246bf-3141-4770-86e3-df2186820341/kube-auth-proxy/0.log" May 11 21:24:12.740891 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:12.740860 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_payload-processing-58b6c8fdc7-5jz94_236ec381-63c0-4952-9f9a-c19f3959f837/payload-processing/0.log" May 11 21:24:13.258910 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:13.258883 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-7lkng_903aee12-cd22-4c02-ba34-3a042f47b6b5/serve-healthcheck-canary/0.log" May 11 21:24:13.864659 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:13.864631 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xkrjf_a4119420-0b77-4749-8a7f-0a8814c65f64/kube-rbac-proxy/0.log" May 11 21:24:13.882789 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:13.882769 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xkrjf_a4119420-0b77-4749-8a7f-0a8814c65f64/exporter/0.log" May 11 21:24:13.905809 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:13.905784 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xkrjf_a4119420-0b77-4749-8a7f-0a8814c65f64/extractor/0.log" May 11 21:24:15.706480 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:15.706442 2555 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-jgrcj" May 11 21:24:15.829498 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:15.829470 2555 log.go:25] "Finished parsing log 
file" path="/var/log/pods/opendatahub_maas-api-79c9b695dc-rrv9h_a9527d82-772e-4c86-9a25-65807b9c013a/maas-api/0.log" May 11 21:24:15.910095 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:15.910066 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-99bcb79cc-f9mlf_b44c9019-ef37-4d6f-969c-f4653ba640a1/manager/0.log" May 11 21:24:15.998323 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:15.998293 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-755c95f69f-9ww5m_5066f764-081f-4e80-adf0-dcdd1bbd305e/manager/0.log" May 11 21:24:16.074064 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:16.074037 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-68c4fbbd6f-cgqs2_d1afc4aa-b707-4933-87fe-0f4f14784bc4/postgres/0.log" May 11 21:24:17.172022 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:17.171992 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-68d9b68cf6-46w95_bec15022-9690-465c-b505-e8b5172de2ed/manager/0.log" May 11 21:24:22.741977 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:22.741940 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4hgt9_93766b7c-6f9e-4bb2-a35e-9104fc3059f6/kube-multus/0.log" May 11 21:24:23.104755 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:23.104679 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mkbtt_fa6aa941-a7a4-40a4-82c1-046fa5c671d1/kube-multus-additional-cni-plugins/0.log" May 11 21:24:23.127766 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:23.127742 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mkbtt_fa6aa941-a7a4-40a4-82c1-046fa5c671d1/egress-router-binary-copy/0.log" May 11 21:24:23.150328 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:23.150305 2555 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mkbtt_fa6aa941-a7a4-40a4-82c1-046fa5c671d1/cni-plugins/0.log" May 11 21:24:23.171113 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:23.171092 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mkbtt_fa6aa941-a7a4-40a4-82c1-046fa5c671d1/bond-cni-plugin/0.log" May 11 21:24:23.193827 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:23.193804 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mkbtt_fa6aa941-a7a4-40a4-82c1-046fa5c671d1/routeoverride-cni/0.log" May 11 21:24:23.213845 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:23.213817 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mkbtt_fa6aa941-a7a4-40a4-82c1-046fa5c671d1/whereabouts-cni-bincopy/0.log" May 11 21:24:23.233271 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:23.233246 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mkbtt_fa6aa941-a7a4-40a4-82c1-046fa5c671d1/whereabouts-cni/0.log" May 11 21:24:23.326353 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:23.326325 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fq6hx_692ffb95-b8bb-4e21-9e37-a9bad55c11be/network-metrics-daemon/0.log" May 11 21:24:23.345729 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:23.345705 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fq6hx_692ffb95-b8bb-4e21-9e37-a9bad55c11be/kube-rbac-proxy/0.log" May 11 21:24:24.173687 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:24.173662 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-controller/0.log" May 11 21:24:24.190165 
ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:24.190140 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/0.log" May 11 21:24:24.199522 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:24.199502 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovn-acl-logging/1.log" May 11 21:24:24.215210 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:24.215186 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/kube-rbac-proxy-node/0.log" May 11 21:24:24.236177 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:24.236146 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/kube-rbac-proxy-ovn-metrics/0.log" May 11 21:24:24.256733 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:24.256710 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/northd/0.log" May 11 21:24:24.277471 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:24.277446 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/nbdb/0.log" May 11 21:24:24.305080 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:24.305058 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/sbdb/0.log" May 11 21:24:24.412048 ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:24.412015 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gkzk7_5a8d1588-0bb4-436d-88d7-4920b143287d/ovnkube-controller/0.log" May 11 21:24:26.063319 
ip-10-0-133-205 kubenswrapper[2555]: I0511 21:24:26.063292 2555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-6z6rl_ff9e0b72-a1d2-4476-a4d4-db6a3425a266/network-check-target-container/0.log"