Apr 20 22:24:30.259012 ip-10-0-132-177 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 22:24:30.734032 ip-10-0-132-177 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 22:24:30.734032 ip-10-0-132-177 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 22:24:30.734032 ip-10-0-132-177 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 22:24:30.734032 ip-10-0-132-177 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 22:24:30.734032 ip-10-0-132-177 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 22:24:30.734849 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.734751 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 22:24:30.740240 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740223 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 22:24:30.740283 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740243 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 22:24:30.740283 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740247 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 22:24:30.740283 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740251 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 22:24:30.740283 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740254 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 22:24:30.740283 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740257 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 22:24:30.740283 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740260 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 22:24:30.740283 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740263 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 22:24:30.740283 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740266 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 22:24:30.740283 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740270 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 22:24:30.740283 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740272 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 22:24:30.740283 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740275 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 22:24:30.740283 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740278 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 22:24:30.740283 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740280 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 22:24:30.740283 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740283 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 22:24:30.740283 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740287 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 22:24:30.740283 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740290 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740293 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740295 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740298 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740301 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740304 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740306 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740309 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740311 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740314 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740316 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740319 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740321 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740324 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740326 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740329 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740331 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740334 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740336 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740339 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 22:24:30.740660 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740342 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740344 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740347 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740349 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740352 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740355 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740357 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740359 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740362 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740364 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740366 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740369 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740371 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740374 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740377 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740381 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740383 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740385 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740388 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740390 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 22:24:30.741179 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740393 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740395 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740401 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740404 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740407 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740410 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740413 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740416 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740418 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740421 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740424 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740427 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740432 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740434 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740437 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740439 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740442 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740444 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740447 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 22:24:30.741689 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740449 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740452 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740454 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740457 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740459 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740462 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740464 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740468 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740471 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740474 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740476 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740893 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740898 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740901 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740904 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740907 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740910 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740912 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740916 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740918 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 22:24:30.742141 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740921 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740923 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740926 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740935 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740937 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740940 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740943 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740945 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740948 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740951 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740953 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740956 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740958 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740961 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740964 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740967 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740969 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740972 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740974 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740978 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 22:24:30.742690 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740981 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740983 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740986 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740988 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740990 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740993 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740995 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.740998 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741002 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741006 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741008 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741011 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741013 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741016 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741018 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741021 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741028 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741031 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741034 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 22:24:30.743180 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741036 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741038 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741041 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741044 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741047 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741049 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741051 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741054 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741057 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741059 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741061 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741064 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741067 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741070 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741073 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741075 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741078 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741080 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741083 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 22:24:30.743693 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741085 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741088 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741090 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741092 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741095 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741097 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741099 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741102 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741104 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741107 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741109 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741116 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741119 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741122 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741124 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741126 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741129 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741132 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.741135 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742707 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 22:24:30.744153 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742719 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742725 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742730 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742734 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742738 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742743 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742747 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742751 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742754 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742757 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742761 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742764 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742767 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742770 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742773 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742776 2575 flags.go:64] FLAG: --cloud-config=""
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742779 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742782 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742787 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742790 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742793 2575 flags.go:64] FLAG: --config-dir=""
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742796 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742800 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742804 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 22:24:30.744640 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742808 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742811 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742815 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742818 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742821 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742824 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742828 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742830 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742835 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742838 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742840 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742843 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742847 2575 flags.go:64] FLAG: --enable-server="true"
Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742851 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr
20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742855 2575 flags.go:64] FLAG: --event-burst="100" Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742859 2575 flags.go:64] FLAG: --event-qps="50" Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742862 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742865 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742868 2575 flags.go:64] FLAG: --eviction-hard="" Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742872 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742875 2575 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742878 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742881 2575 flags.go:64] FLAG: --eviction-soft="" Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742884 2575 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742887 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 22:24:30.745239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742890 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742892 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742895 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742898 
2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742901 2575 flags.go:64] FLAG: --feature-gates="" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742905 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742908 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742912 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742916 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742919 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742922 2575 flags.go:64] FLAG: --help="false" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742925 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-132-177.ec2.internal" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742929 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742932 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742935 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742939 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742942 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 22:24:30.745843 ip-10-0-132-177 
kubenswrapper[2575]: I0420 22:24:30.742945 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742948 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742950 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742953 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742956 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742959 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742962 2575 flags.go:64] FLAG: --kube-reserved="" Apr 20 22:24:30.745843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742965 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742968 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742971 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742973 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742976 2575 flags.go:64] FLAG: --lock-file="" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742979 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742982 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742985 2575 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742990 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742993 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742996 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.742998 2575 flags.go:64] FLAG: --logging-format="text" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743002 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743005 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743008 2575 flags.go:64] FLAG: --manifest-url="" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743011 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743015 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743019 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743023 2575 flags.go:64] FLAG: --max-pods="110" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743026 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743029 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743032 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 22:24:30.746423 ip-10-0-132-177 
kubenswrapper[2575]: I0420 22:24:30.743035 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743038 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743041 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 22:24:30.746423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743043 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743051 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743054 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743057 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743060 2575 flags.go:64] FLAG: --pod-cidr="" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743063 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743069 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743072 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743075 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743078 2575 flags.go:64] FLAG: --port="10250" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743081 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 
22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743084 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e26e6cc7f4cb13f0" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743087 2575 flags.go:64] FLAG: --qos-reserved="" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743090 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743093 2575 flags.go:64] FLAG: --register-node="true" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743096 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743099 2575 flags.go:64] FLAG: --register-with-taints="" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743102 2575 flags.go:64] FLAG: --registry-burst="10" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743105 2575 flags.go:64] FLAG: --registry-qps="5" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743108 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743111 2575 flags.go:64] FLAG: --reserved-memory="" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743115 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743118 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743122 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743125 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 22:24:30.747080 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743129 2575 flags.go:64] FLAG: --runonce="false" Apr 20 
22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743132 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743135 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743138 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743141 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743144 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743147 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743150 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743153 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743156 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743159 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743161 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743164 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743167 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743170 2575 flags.go:64] FLAG: --system-cgroups="" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: 
I0420 22:24:30.743173 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743179 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743182 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743185 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743190 2575 flags.go:64] FLAG: --tls-min-version="" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743193 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743196 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743199 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743201 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743204 2575 flags.go:64] FLAG: --v="2" Apr 20 22:24:30.747701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743209 2575 flags.go:64] FLAG: --version="false" Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743213 2575 flags.go:64] FLAG: --vmodule="" Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743222 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.743226 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743337 2575 feature_gate.go:328] unrecognized feature gate: 
MachineAPIMigration Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743341 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743345 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743349 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743353 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743355 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743358 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743361 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743363 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743366 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743368 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743371 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743373 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743376 2575 
feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743379 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743381 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743384 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 22:24:30.748293 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743386 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743389 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743391 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743393 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743396 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743398 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743401 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743403 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743405 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 
22:24:30.743408 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743411 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743413 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743415 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743419 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743422 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743425 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743427 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743430 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743432 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743435 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 22:24:30.748823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743438 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743442 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 22:24:30.749326 
ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743445 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743448 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743450 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743452 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743455 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743457 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743460 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743462 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743464 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743467 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743469 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743472 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743475 2575 feature_gate.go:328] unrecognized feature gate: 
ExternalSnapshotMetadata Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743477 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743479 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743482 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743484 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743486 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 22:24:30.749326 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743489 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 22:24:30.750036 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743491 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 22:24:30.750036 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743494 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 22:24:30.750036 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743496 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 22:24:30.750036 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743498 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 22:24:30.750036 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743502 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 22:24:30.750036 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743504 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 22:24:30.750036 ip-10-0-132-177 
kubenswrapper[2575]: W0420 22:24:30.743507 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 22:24:30.750036 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743510 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 22:24:30.750036 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743512 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 22:24:30.750036 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743514 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 22:24:30.750036 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743517 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 22:24:30.750036 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743519 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 22:24:30.750036 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743524 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 22:24:30.750036 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743527 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 22:24:30.750036 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743529 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 22:24:30.750036 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743532 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 22:24:30.750036 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743534 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 22:24:30.750036 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743537 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 22:24:30.750036 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743540 2575 
feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 22:24:30.750941 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743544 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 22:24:30.750941 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743547 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 22:24:30.750941 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743549 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 22:24:30.750941 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743552 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 22:24:30.750941 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743555 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 22:24:30.750941 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743557 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 22:24:30.750941 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743560 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 22:24:30.750941 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743562 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 22:24:30.750941 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.743564 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 22:24:30.750941 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.744248 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 22:24:30.751586 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.751563 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 22:24:30.751657 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.751587 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 22:24:30.751738 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751667 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 22:24:30.751738 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751693 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 22:24:30.751738 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751697 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 22:24:30.751738 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751702 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 22:24:30.751738 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751707 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 22:24:30.751738 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751712 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 22:24:30.751738 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751719 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 22:24:30.751738 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751726 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 22:24:30.751738 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751730 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 22:24:30.751738 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751734 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 22:24:30.751738 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751739 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 22:24:30.751738 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751743 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751747 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751753 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751757 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751762 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751766 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751769 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751774 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751778 2575 feature_gate.go:328] unrecognized feature gate: 
ExternalSnapshotMetadata Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751783 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751787 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751792 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751796 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751800 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751803 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751807 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751812 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751816 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751820 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751824 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 22:24:30.752267 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751829 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 22:24:30.753058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751833 2575 feature_gate.go:328] 
unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 22:24:30.753058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751837 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 22:24:30.753058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751841 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 22:24:30.753058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751845 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 22:24:30.753058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751850 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 22:24:30.753058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751854 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 22:24:30.753058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751858 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 22:24:30.753058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751862 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 22:24:30.753058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751866 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 22:24:30.753058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751870 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 22:24:30.753058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751875 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 22:24:30.753058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751879 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 22:24:30.753058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751883 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 22:24:30.753058 
ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751888 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 22:24:30.753058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751894 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 22:24:30.753058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751901 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 22:24:30.753058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751907 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 22:24:30.753058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751912 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 22:24:30.753058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751917 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751921 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751926 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751931 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751936 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751941 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751945 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 
22:24:30.751949 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751954 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751959 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751963 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751967 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751972 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751976 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751981 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751985 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751989 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751993 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.751997 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752001 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 
22:24:30.753532 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752006 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 22:24:30.754131 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752010 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 22:24:30.754131 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752014 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 22:24:30.754131 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752018 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 22:24:30.754131 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752022 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 22:24:30.754131 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752027 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 22:24:30.754131 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752031 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 22:24:30.754131 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752037 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 22:24:30.754131 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752042 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 22:24:30.754131 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752047 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 22:24:30.754131 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752051 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 22:24:30.754131 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752056 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 22:24:30.754131 ip-10-0-132-177 
kubenswrapper[2575]: W0420 22:24:30.752060 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 22:24:30.754131 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752064 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 22:24:30.754131 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752068 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 22:24:30.754131 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752073 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 22:24:30.754734 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.752082 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 22:24:30.754734 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752298 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 22:24:30.754734 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752308 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 22:24:30.754734 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752313 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 22:24:30.754734 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752317 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 22:24:30.754734 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752322 2575 feature_gate.go:328] unrecognized feature 
gate: IrreconcilableMachineConfig Apr 20 22:24:30.754734 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752326 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 22:24:30.754734 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752330 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 22:24:30.754734 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752334 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 22:24:30.754734 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752338 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 22:24:30.754734 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752343 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 22:24:30.754734 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752347 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 22:24:30.754734 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752351 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 22:24:30.754734 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752355 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 22:24:30.754734 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752360 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 22:24:30.754734 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752364 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752368 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752373 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752377 2575 
feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752381 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752387 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752393 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752398 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752403 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752408 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752413 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752427 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752432 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752437 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752441 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752445 2575 
feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752449 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752453 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752457 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 22:24:30.755423 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752461 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752465 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752469 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752473 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752478 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752482 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752485 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752490 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752494 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 22:24:30.756095 
ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752498 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752502 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752506 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752510 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752514 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752518 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752522 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752526 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752530 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752535 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752538 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 22:24:30.756095 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752542 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752547 2575 feature_gate.go:328] 
unrecognized feature gate: NetworkLiveMigration Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752551 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752555 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752559 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752573 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752577 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752581 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752585 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752589 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752592 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752597 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752601 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752605 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 
22:24:30.752609 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752613 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752617 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752621 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752625 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752629 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 22:24:30.756737 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752633 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 22:24:30.757212 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752637 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 22:24:30.757212 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752641 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 22:24:30.757212 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752645 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 22:24:30.757212 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752649 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 22:24:30.757212 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752654 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 22:24:30.757212 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752657 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 22:24:30.757212 
ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752663 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 22:24:30.757212 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752688 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 22:24:30.757212 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752692 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 22:24:30.757212 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752696 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 22:24:30.757212 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752700 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 22:24:30.757212 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:30.752704 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 22:24:30.757212 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.752713 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 22:24:30.757212 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.753587 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 22:24:30.757212 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.756404 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 22:24:30.758648 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.758633 2575 
server.go:1019] "Starting client certificate rotation" Apr 20 22:24:30.758757 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.758739 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 22:24:30.758800 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.758781 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 22:24:30.787739 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.787710 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 22:24:30.793382 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.793354 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 22:24:30.808864 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.808840 2575 log.go:25] "Validated CRI v1 runtime API" Apr 20 22:24:30.815533 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.815511 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 22:24:30.815533 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.815533 2575 log.go:25] "Validated CRI v1 image API" Apr 20 22:24:30.818145 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.818127 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 22:24:30.823201 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.823177 2575 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b07d13e6-a7b8-4bde-be1e-084a433f949d:/dev/nvme0n1p4 fb83492f-4a6d-4d76-bbc8-f53476876cd2:/dev/nvme0n1p3] Apr 20 22:24:30.823260 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.823202 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 
minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 22:24:30.829490 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.829380 2575 manager.go:217] Machine: {Timestamp:2026-04-20 22:24:30.827522351 +0000 UTC m=+0.441678901 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3135202 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29d9a89be2317e50ae910773b0e1c6 SystemUUID:ec29d9a8-9be2-317e-50ae-910773b0e1c6 BootID:bd33f343-332d-4dd5-90cb-9146120feefb Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:71:47:64:c4:e3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:71:47:64:c4:e3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:62:91:80:35:f9:68 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 
HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 22:24:30.829490 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.829487 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 20 22:24:30.829628 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.829614 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 20 22:24:30.830723 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.830698 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 20 22:24:30.830885 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.830727 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-177.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 22:24:30.830932 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.830896 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 22:24:30.830932 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.830904 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 22:24:30.830932 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.830917 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 22:24:30.832560 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.832549 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 22:24:30.834007 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.833997 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 20 22:24:30.834123 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.834114 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 22:24:30.836822 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.836811 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 20 22:24:30.836865 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.836833 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 22:24:30.836865 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.836846 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 22:24:30.836865 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.836856 2575 kubelet.go:397] "Adding apiserver pod source" Apr 20 22:24:30.836865 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.836864 2575 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 20 22:24:30.838030 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.838018 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 22:24:30.838069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.838037 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 22:24:30.839392 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.839373 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-k7rgz" Apr 20 22:24:30.842394 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.842377 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 22:24:30.844226 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.844211 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 22:24:30.845760 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.845743 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 22:24:30.845805 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.845773 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 22:24:30.845805 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.845782 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 22:24:30.845805 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.845788 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 22:24:30.845805 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.845794 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 22:24:30.845805 ip-10-0-132-177 kubenswrapper[2575]: I0420 
22:24:30.845800 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 22:24:30.845805 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.845806 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 22:24:30.845964 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.845811 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 22:24:30.845964 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.845819 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 22:24:30.845964 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.845825 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 22:24:30.845964 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.845833 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 22:24:30.845964 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.845843 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 22:24:30.846591 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.846577 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-k7rgz" Apr 20 22:24:30.847778 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.847767 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 22:24:30.847813 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.847781 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 22:24:30.848161 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:30.848133 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 22:24:30.848243 
ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:30.848177 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-177.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 22:24:30.851569 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.851554 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 22:24:30.851628 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.851594 2575 server.go:1295] "Started kubelet" Apr 20 22:24:30.852375 ip-10-0-132-177 systemd[1]: Started Kubernetes Kubelet. Apr 20 22:24:30.852636 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.852591 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 22:24:30.852740 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.852663 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 22:24:30.852740 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.852614 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 22:24:30.853774 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.853756 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 22:24:30.854597 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.854582 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 20 22:24:30.859410 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.859383 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 22:24:30.859509 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.859423 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 22:24:30.860187 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.859991 2575 volume_manager.go:295] 
"The desired_state_of_world populator starts" Apr 20 22:24:30.860273 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.860191 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 22:24:30.860273 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.860257 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 22:24:30.860378 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.860323 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 20 22:24:30.860378 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.860330 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 20 22:24:30.860495 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.860406 2575 factory.go:153] Registering CRI-O factory Apr 20 22:24:30.860495 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.860422 2575 factory.go:223] Registration of the crio container factory successfully Apr 20 22:24:30.860586 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.860536 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 22:24:30.860586 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.860546 2575 factory.go:55] Registering systemd factory Apr 20 22:24:30.860586 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.860555 2575 factory.go:223] Registration of the systemd container factory successfully Apr 20 22:24:30.860586 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.860577 2575 factory.go:103] Registering Raw factory Apr 20 22:24:30.860766 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.860590 2575 manager.go:1196] Started watching for new ooms in manager Apr 20 22:24:30.860766 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:30.860702 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-132-177.ec2.internal\" not found" Apr 20 22:24:30.861081 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.861064 2575 manager.go:319] Starting recovery of all containers Apr 20 22:24:30.865081 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.865058 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 22:24:30.866789 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:30.866766 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 22:24:30.867119 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.866956 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-177.ec2.internal" not found Apr 20 22:24:30.869643 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:30.869617 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-132-177.ec2.internal\" not found" node="ip-10-0-132-177.ec2.internal" Apr 20 22:24:30.873517 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.873502 2575 manager.go:324] Recovery completed Apr 20 22:24:30.877722 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.877707 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 22:24:30.880156 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.880137 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-177.ec2.internal" event="NodeHasSufficientMemory" Apr 20 22:24:30.880228 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.880170 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-177.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 22:24:30.880228 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.880183 2575 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-132-177.ec2.internal" event="NodeHasSufficientPID" Apr 20 22:24:30.880727 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.880714 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 22:24:30.880727 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.880726 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 22:24:30.880832 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.880742 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 20 22:24:30.882038 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.882025 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-177.ec2.internal" not found Apr 20 22:24:30.883283 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.883271 2575 policy_none.go:49] "None policy: Start" Apr 20 22:24:30.883324 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.883298 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 22:24:30.883324 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.883308 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 20 22:24:30.926415 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.926396 2575 manager.go:341] "Starting Device Plugin manager" Apr 20 22:24:30.944340 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:30.926529 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 22:24:30.944340 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.926545 2575 server.go:85] "Starting device plugin registration server" Apr 20 22:24:30.944340 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.926894 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 22:24:30.944340 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.926909 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 22:24:30.944340 ip-10-0-132-177 
kubenswrapper[2575]: I0420 22:24:30.927066 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 22:24:30.944340 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.927152 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 22:24:30.944340 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.927161 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 22:24:30.944340 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:30.927898 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 22:24:30.944340 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:30.927940 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-177.ec2.internal\" not found" Apr 20 22:24:30.944340 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:30.940404 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-177.ec2.internal" not found Apr 20 22:24:31.006950 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.006842 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 22:24:31.008263 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.008245 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 22:24:31.008366 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.008278 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 22:24:31.008366 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.008311 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 22:24:31.008366 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.008320 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 22:24:31.008366 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:31.008362 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 22:24:31.010707 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.010687 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 22:24:31.027601 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.027569 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 22:24:31.029380 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.029361 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-177.ec2.internal" event="NodeHasSufficientMemory" Apr 20 22:24:31.029476 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.029394 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-177.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 22:24:31.029476 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.029405 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-177.ec2.internal" event="NodeHasSufficientPID" Apr 20 22:24:31.029476 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.029433 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-177.ec2.internal" Apr 20 22:24:31.035713 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.035698 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-177.ec2.internal" Apr 20 22:24:31.035784 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:31.035721 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-177.ec2.internal\": node \"ip-10-0-132-177.ec2.internal\" not found" Apr 20 
22:24:31.057350 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:31.057321 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-177.ec2.internal\" not found" Apr 20 22:24:31.109304 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.109253 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-177.ec2.internal"] Apr 20 22:24:31.109470 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.109370 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 22:24:31.110960 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.110941 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-177.ec2.internal" event="NodeHasSufficientMemory" Apr 20 22:24:31.111085 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.110973 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-177.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 22:24:31.111085 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.110987 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-177.ec2.internal" event="NodeHasSufficientPID" Apr 20 22:24:31.112251 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.112234 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 22:24:31.112403 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.112388 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal" Apr 20 22:24:31.112451 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.112419 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 22:24:31.113351 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.113322 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-177.ec2.internal" event="NodeHasSufficientMemory" Apr 20 22:24:31.113467 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.113356 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-177.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 22:24:31.113467 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.113370 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-177.ec2.internal" event="NodeHasSufficientPID" Apr 20 22:24:31.113467 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.113372 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-177.ec2.internal" event="NodeHasSufficientMemory" Apr 20 22:24:31.113467 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.113390 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-177.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 22:24:31.113467 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.113399 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-177.ec2.internal" event="NodeHasSufficientPID" Apr 20 22:24:31.114662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.114645 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-177.ec2.internal" Apr 20 22:24:31.114780 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.114667 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 22:24:31.115803 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.115787 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-177.ec2.internal" event="NodeHasSufficientMemory" Apr 20 22:24:31.115888 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.115816 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-177.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 22:24:31.115888 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.115825 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-177.ec2.internal" event="NodeHasSufficientPID" Apr 20 22:24:31.136309 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:31.136282 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-177.ec2.internal\" not found" node="ip-10-0-132-177.ec2.internal" Apr 20 22:24:31.140488 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:31.140472 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-177.ec2.internal\" not found" node="ip-10-0-132-177.ec2.internal" Apr 20 22:24:31.158046 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:31.158017 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-177.ec2.internal\" not found" Apr 20 22:24:31.162396 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.162377 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/27bb6de254cf19a2989bb62d9580d525-config\") pod 
\"kube-apiserver-proxy-ip-10-0-132-177.ec2.internal\" (UID: \"27bb6de254cf19a2989bb62d9580d525\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-177.ec2.internal" Apr 20 22:24:31.162454 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.162410 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0bdb7009a5728b8ce9a7775b0d8bb75e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal\" (UID: \"0bdb7009a5728b8ce9a7775b0d8bb75e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal" Apr 20 22:24:31.162454 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.162429 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0bdb7009a5728b8ce9a7775b0d8bb75e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal\" (UID: \"0bdb7009a5728b8ce9a7775b0d8bb75e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal" Apr 20 22:24:31.258633 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:31.258542 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-177.ec2.internal\" not found" Apr 20 22:24:31.262976 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.262955 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0bdb7009a5728b8ce9a7775b0d8bb75e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal\" (UID: \"0bdb7009a5728b8ce9a7775b0d8bb75e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal" Apr 20 22:24:31.263060 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.262985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/0bdb7009a5728b8ce9a7775b0d8bb75e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal\" (UID: \"0bdb7009a5728b8ce9a7775b0d8bb75e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal" Apr 20 22:24:31.263060 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.263002 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/27bb6de254cf19a2989bb62d9580d525-config\") pod \"kube-apiserver-proxy-ip-10-0-132-177.ec2.internal\" (UID: \"27bb6de254cf19a2989bb62d9580d525\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-177.ec2.internal" Apr 20 22:24:31.263060 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.263028 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/27bb6de254cf19a2989bb62d9580d525-config\") pod \"kube-apiserver-proxy-ip-10-0-132-177.ec2.internal\" (UID: \"27bb6de254cf19a2989bb62d9580d525\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-177.ec2.internal" Apr 20 22:24:31.263060 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.263046 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0bdb7009a5728b8ce9a7775b0d8bb75e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal\" (UID: \"0bdb7009a5728b8ce9a7775b0d8bb75e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal" Apr 20 22:24:31.263183 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.263041 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0bdb7009a5728b8ce9a7775b0d8bb75e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal\" (UID: \"0bdb7009a5728b8ce9a7775b0d8bb75e\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal" Apr 20 22:24:31.359420 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:31.359378 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-177.ec2.internal\" not found" Apr 20 22:24:31.438844 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.438817 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal" Apr 20 22:24:31.442507 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.442485 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-177.ec2.internal" Apr 20 22:24:31.460248 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:31.460224 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-177.ec2.internal\" not found" Apr 20 22:24:31.560802 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:31.560702 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-177.ec2.internal\" not found" Apr 20 22:24:31.661197 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:31.661161 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-177.ec2.internal\" not found" Apr 20 22:24:31.757540 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.757512 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 22:24:31.758246 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.757697 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 
22:24:31.758246 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.757740 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 22:24:31.761777 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:31.761760 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-177.ec2.internal\" not found" Apr 20 22:24:31.777937 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.777914 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 22:24:31.849384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.849353 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 22:19:30 +0000 UTC" deadline="2027-10-10 11:32:58.757501426 +0000 UTC" Apr 20 22:24:31.849384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.849382 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12901h8m26.908121673s" Apr 20 22:24:31.860550 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.860527 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 22:24:31.862680 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:31.862654 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-177.ec2.internal\" not found" Apr 20 22:24:31.877768 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.877750 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 22:24:31.891385 
ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.891363 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-kmfw5" Apr 20 22:24:31.899326 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.899303 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-kmfw5" Apr 20 22:24:31.963198 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:31.963173 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-177.ec2.internal\" not found" Apr 20 22:24:31.999908 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:31.999885 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 22:24:32.060783 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.060743 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal" Apr 20 22:24:32.074835 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.074814 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 22:24:32.075794 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.075782 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-177.ec2.internal" Apr 20 22:24:32.086766 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.086735 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 22:24:32.123452 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:32.123425 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27bb6de254cf19a2989bb62d9580d525.slice/crio-8c2119d8ee99e336b64fb996469bba16c3b4658ee4b161f732396ffe2a787a71 WatchSource:0}: Error finding container 8c2119d8ee99e336b64fb996469bba16c3b4658ee4b161f732396ffe2a787a71: Status 404 returned error can't find the container with id 8c2119d8ee99e336b64fb996469bba16c3b4658ee4b161f732396ffe2a787a71 Apr 20 22:24:32.123833 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:32.123820 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bdb7009a5728b8ce9a7775b0d8bb75e.slice/crio-bda9fbbae0368c5473d08c34cad6bec0cc702b13e3158344954f16940c3b6c50 WatchSource:0}: Error finding container bda9fbbae0368c5473d08c34cad6bec0cc702b13e3158344954f16940c3b6c50: Status 404 returned error can't find the container with id bda9fbbae0368c5473d08c34cad6bec0cc702b13e3158344954f16940c3b6c50 Apr 20 22:24:32.128142 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.128127 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 22:24:32.837300 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.837266 2575 apiserver.go:52] "Watching apiserver" Apr 20 22:24:32.845567 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.845534 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 22:24:32.845970 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.845932 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/kube-apiserver-proxy-ip-10-0-132-177.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq","openshift-cluster-node-tuning-operator/tuned-s8vzk","openshift-image-registry/node-ca-zf7wp","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal","openshift-multus/multus-5m9gf","openshift-network-diagnostics/network-check-target-vjqq9","openshift-ovn-kubernetes/ovnkube-node-rp7bw","kube-system/konnectivity-agent-wglz9","openshift-dns/node-resolver-sjwlz","openshift-multus/multus-additional-cni-plugins-92xbc","openshift-multus/network-metrics-daemon-qg2mj","openshift-network-operator/iptables-alerter-r8nbs"] Apr 20 22:24:32.849534 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.849511 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9" Apr 20 22:24:32.849632 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:32.849594 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vjqq9" podUID="e9c331e6-87b9-45b5-9c22-016575eec846" Apr 20 22:24:32.849937 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.849913 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 22:24:32.851862 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.851269 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq" Apr 20 22:24:32.852584 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.852565 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zf7wp" Apr 20 22:24:32.854125 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.854102 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5m9gf" Apr 20 22:24:32.854858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.854836 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 22:24:32.855094 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.855071 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 22:24:32.855094 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.855093 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-hzkrk\"" Apr 20 22:24:32.855257 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.855231 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 22:24:32.855307 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.855294 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 22:24:32.856635 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.856291 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s8vzk" Apr 20 22:24:32.856635 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.856502 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 22:24:32.856880 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.856865 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 22:24:32.857087 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.857069 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 22:24:32.857184 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.857088 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-n79gs\"" Apr 20 22:24:32.857313 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.857291 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 22:24:32.857603 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.857582 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 22:24:32.858062 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.858041 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-n5pr5\"" Apr 20 22:24:32.858155 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.858118 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-wglz9" Apr 20 22:24:32.858293 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.858275 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 22:24:32.858551 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.858520 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 22:24:32.858874 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.858858 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 22:24:32.859033 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.859018 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-nnvlk\"" Apr 20 22:24:32.860143 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.860122 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.861269 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.861249 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 22:24:32.861613 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.861597 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-5j57c\"" Apr 20 22:24:32.861994 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.861976 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 22:24:32.863282 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.863265 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 22:24:32.863952 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.863928 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sjwlz" Apr 20 22:24:32.864778 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.864104 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-92xbc" Apr 20 22:24:32.864778 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.864487 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 22:24:32.864778 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.864522 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 22:24:32.864778 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.864606 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 22:24:32.864778 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.864648 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 22:24:32.864778 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.864779 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 22:24:32.865082 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.864941 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pf45g\"" Apr 20 22:24:32.865689 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.865658 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-r8nbs" Apr 20 22:24:32.865878 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.865863 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj" Apr 20 22:24:32.865977 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:32.865959 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qg2mj" podUID="5add223c-497e-4cc3-863e-339b6f999506" Apr 20 22:24:32.868242 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.866783 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 22:24:32.868242 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.867749 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 22:24:32.868242 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.867892 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 22:24:32.868542 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.868526 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 22:24:32.868861 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.868847 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 22:24:32.869211 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.869197 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-7bhr2\"" Apr 20 22:24:32.869346 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.869321 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-qdkbs\"" Apr 20 22:24:32.869427 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.869206 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 22:24:32.869818 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.869800 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qg7l5\"" Apr 20 22:24:32.871015 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.870777 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 22:24:32.871015 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.870786 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-447m5\" (UniqueName: \"kubernetes.io/projected/456ba91d-0822-42ce-a041-f73b13a803c5-kube-api-access-447m5\") pod \"node-ca-zf7wp\" (UID: \"456ba91d-0822-42ce-a041-f73b13a803c5\") " pod="openshift-image-registry/node-ca-zf7wp" Apr 20 22:24:32.871015 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.870846 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-log-socket\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.871015 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.870876 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-run-ovn-kubernetes\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.871015 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.870910 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c237e12-2748-4be2-8f88-258e6064ea33-ovnkube-config\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.871015 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.870941 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-system-cni-dir\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf" Apr 20 22:24:32.871015 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.870990 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c9389f21-c437-4990-a923-b0ff03e3ba21-hosts-file\") pod \"node-resolver-sjwlz\" (UID: \"c9389f21-c437-4990-a923-b0ff03e3ba21\") " pod="openshift-dns/node-resolver-sjwlz" Apr 20 22:24:32.871334 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871107 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-multus-socket-dir-parent\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf" Apr 20 22:24:32.871334 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871146 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/f0c17cb1-e694-4fe6-8bfb-113e266578ab-multus-daemon-config\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf" Apr 20 22:24:32.871334 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871179 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-kubernetes\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk" Apr 20 22:24:32.871334 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871257 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-sysctl-d\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk" Apr 20 22:24:32.871334 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871290 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c237e12-2748-4be2-8f88-258e6064ea33-env-overrides\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.871334 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871324 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e440b6a-d5a8-43fe-af3d-a999f8dce281-system-cni-dir\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc" Apr 20 22:24:32.871593 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871356 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-sys\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk" Apr 20 22:24:32.871593 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871402 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-run-systemd\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.871593 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871487 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-host-var-lib-cni-bin\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf" Apr 20 22:24:32.871593 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871546 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-sysconfig\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk" Apr 20 22:24:32.871804 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871616 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-host\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk" Apr 20 22:24:32.871804 
ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871700 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e440b6a-d5a8-43fe-af3d-a999f8dce281-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc" Apr 20 22:24:32.871804 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871733 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-os-release\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf" Apr 20 22:24:32.871804 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-host-var-lib-cni-multus\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf" Apr 20 22:24:32.871978 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871807 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-host-var-lib-kubelet\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf" Apr 20 22:24:32.871978 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871832 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ba8a927d-42db-4d3f-b6d1-938655219360-agent-certs\") pod \"konnectivity-agent-wglz9\" (UID: 
\"ba8a927d-42db-4d3f-b6d1-938655219360\") " pod="kube-system/konnectivity-agent-wglz9" Apr 20 22:24:32.871978 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871862 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-sys-fs\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq" Apr 20 22:24:32.871978 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871881 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/456ba91d-0822-42ce-a041-f73b13a803c5-host\") pod \"node-ca-zf7wp\" (UID: \"456ba91d-0822-42ce-a041-f73b13a803c5\") " pod="openshift-image-registry/node-ca-zf7wp" Apr 20 22:24:32.871978 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-var-lib-openvswitch\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.872167 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.871974 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-etc-openvswitch\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.872167 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872066 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-tuned\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk" Apr 20 22:24:32.872167 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872112 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-kubelet\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.872287 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872159 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-run-ovn\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.872287 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872203 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.872287 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872233 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e440b6a-d5a8-43fe-af3d-a999f8dce281-os-release\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc" Apr 20 22:24:32.872400 ip-10-0-132-177 
kubenswrapper[2575]: I0420 22:24:32.872284 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ba8a927d-42db-4d3f-b6d1-938655219360-konnectivity-ca\") pod \"konnectivity-agent-wglz9\" (UID: \"ba8a927d-42db-4d3f-b6d1-938655219360\") " pod="kube-system/konnectivity-agent-wglz9"
Apr 20 22:24:32.872400 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.872400 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872357 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfn65\" (UniqueName: \"kubernetes.io/projected/3e440b6a-d5a8-43fe-af3d-a999f8dce281-kube-api-access-wfn65\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.872400 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872383 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-host-run-multus-certs\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.872525 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872436 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-etc-kubernetes\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.872525 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-socket-dir\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.872594 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872522 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-registration-dir\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.872594 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872565 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-run-netns\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.872713 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872640 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2f074946-73f8-4c67-9fb8-95e03ae600e5-tmp\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.872713 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872693 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-node-log\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.872790 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872770 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e440b6a-d5a8-43fe-af3d-a999f8dce281-cnibin\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.872837 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872810 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb44g\" (UniqueName: \"kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g\") pod \"network-check-target-vjqq9\" (UID: \"e9c331e6-87b9-45b5-9c22-016575eec846\") " pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:24:32.872888 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872843 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-cnibin\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.872888 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872877 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-hostroot\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.872973 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-cni-bin\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.872973 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872944 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-cni-netd\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.873062 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.872972 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lszxs\" (UniqueName: \"kubernetes.io/projected/2c237e12-2748-4be2-8f88-258e6064ea33-kube-api-access-lszxs\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.873062 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873003 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3e440b6a-d5a8-43fe-af3d-a999f8dce281-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.873207 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873082 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k4vh\" (UniqueName: \"kubernetes.io/projected/f0c17cb1-e694-4fe6-8bfb-113e266578ab-kube-api-access-7k4vh\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.873207 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-etc-selinux\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.873207 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873146 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-slash\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.873207 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873176 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-device-dir\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.873207 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873196 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-lib-modules\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.873444 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873248 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-systemd-units\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.873444 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873283 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c9389f21-c437-4990-a923-b0ff03e3ba21-tmp-dir\") pod \"node-resolver-sjwlz\" (UID: \"c9389f21-c437-4990-a923-b0ff03e3ba21\") " pod="openshift-dns/node-resolver-sjwlz"
Apr 20 22:24:32.873444 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873315 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-host-run-netns\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.873444 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873358 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-modprobe-d\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.873444 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873398 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-var-lib-kubelet\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.873444 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873433 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/456ba91d-0822-42ce-a041-f73b13a803c5-serviceca\") pod \"node-ca-zf7wp\" (UID: \"456ba91d-0822-42ce-a041-f73b13a803c5\") " pod="openshift-image-registry/node-ca-zf7wp"
Apr 20 22:24:32.873752 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873460 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c237e12-2748-4be2-8f88-258e6064ea33-ovn-node-metrics-cert\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.873752 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873501 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c237e12-2748-4be2-8f88-258e6064ea33-ovnkube-script-lib\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.873752 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873564 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj7sb\" (UniqueName: \"kubernetes.io/projected/c9389f21-c437-4990-a923-b0ff03e3ba21-kube-api-access-nj7sb\") pod \"node-resolver-sjwlz\" (UID: \"c9389f21-c437-4990-a923-b0ff03e3ba21\") " pod="openshift-dns/node-resolver-sjwlz"
Apr 20 22:24:32.873752 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873590 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f0c17cb1-e694-4fe6-8bfb-113e266578ab-cni-binary-copy\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.873752 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873619 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-multus-conf-dir\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.873752 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873637 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-systemd\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.873752 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873693 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-run\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.874061 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873763 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-run-openvswitch\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.874061 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873806 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e440b6a-d5a8-43fe-af3d-a999f8dce281-cni-binary-copy\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.874061 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873846 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e440b6a-d5a8-43fe-af3d-a999f8dce281-tuning-conf-dir\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.874061 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873896 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-multus-cni-dir\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.874061 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873932 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-host-run-k8s-cni-cncf-io\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.874061 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.873989 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l29r6\" (UniqueName: \"kubernetes.io/projected/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-kube-api-access-l29r6\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.874061 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.874054 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-sysctl-conf\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.874376 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.874095 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9269q\" (UniqueName: \"kubernetes.io/projected/2f074946-73f8-4c67-9fb8-95e03ae600e5-kube-api-access-9269q\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.901402 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.901330 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 22:19:31 +0000 UTC" deadline="2027-12-22 18:27:57.980610309 +0000 UTC"
Apr 20 22:24:32.901402 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.901362 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14660h3m25.079251979s"
Apr 20 22:24:32.961250 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.961221 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 22:24:32.975302 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975259 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.975302 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975309 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e440b6a-d5a8-43fe-af3d-a999f8dce281-os-release\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.975528 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ba8a927d-42db-4d3f-b6d1-938655219360-konnectivity-ca\") pod \"konnectivity-agent-wglz9\" (UID: \"ba8a927d-42db-4d3f-b6d1-938655219360\") " pod="kube-system/konnectivity-agent-wglz9"
Apr 20 22:24:32.975528 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975357 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs\") pod \"network-metrics-daemon-qg2mj\" (UID: \"5add223c-497e-4cc3-863e-339b6f999506\") " pod="openshift-multus/network-metrics-daemon-qg2mj"
Apr 20 22:24:32.975528 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975420 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9xmf\" (UniqueName: \"kubernetes.io/projected/b6bf5cde-6108-4a7b-953c-d93acd974a19-kube-api-access-f9xmf\") pod \"iptables-alerter-r8nbs\" (UID: \"b6bf5cde-6108-4a7b-953c-d93acd974a19\") " pod="openshift-network-operator/iptables-alerter-r8nbs"
Apr 20 22:24:32.975528 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975456 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e440b6a-d5a8-43fe-af3d-a999f8dce281-os-release\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.975528 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.975785 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975551 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfn65\" (UniqueName: \"kubernetes.io/projected/3e440b6a-d5a8-43fe-af3d-a999f8dce281-kube-api-access-wfn65\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.975785 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975567 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.975785 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975581 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-host-run-multus-certs\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.975785 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975632 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-etc-kubernetes\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.975785 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975667 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbcqj\" (UniqueName: \"kubernetes.io/projected/5add223c-497e-4cc3-863e-339b6f999506-kube-api-access-lbcqj\") pod \"network-metrics-daemon-qg2mj\" (UID: \"5add223c-497e-4cc3-863e-339b6f999506\") " pod="openshift-multus/network-metrics-daemon-qg2mj"
Apr 20 22:24:32.975785 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975718 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-socket-dir\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.975785 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-registration-dir\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.975785 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975784 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-run-netns\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.976205 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975810 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2f074946-73f8-4c67-9fb8-95e03ae600e5-tmp\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.976205 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975833 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-node-log\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.976205 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975858 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e440b6a-d5a8-43fe-af3d-a999f8dce281-cnibin\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.976205 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975883 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb44g\" (UniqueName: \"kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g\") pod \"network-check-target-vjqq9\" (UID: \"e9c331e6-87b9-45b5-9c22-016575eec846\") " pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:24:32.976205 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975901 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ba8a927d-42db-4d3f-b6d1-938655219360-konnectivity-ca\") pod \"konnectivity-agent-wglz9\" (UID: \"ba8a927d-42db-4d3f-b6d1-938655219360\") " pod="kube-system/konnectivity-agent-wglz9"
Apr 20 22:24:32.976205 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975964 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-registration-dir\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.976205 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-cnibin\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.976205 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.975997 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-hostroot\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.976205 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976004 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-node-log\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.976205 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976017 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-cni-bin\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.976205 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976036 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-cni-netd\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.976205 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976054 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lszxs\" (UniqueName: \"kubernetes.io/projected/2c237e12-2748-4be2-8f88-258e6064ea33-kube-api-access-lszxs\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.976205 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976096 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3e440b6a-d5a8-43fe-af3d-a999f8dce281-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.976205 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7k4vh\" (UniqueName: \"kubernetes.io/projected/f0c17cb1-e694-4fe6-8bfb-113e266578ab-kube-api-access-7k4vh\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.976205 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976132 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b6bf5cde-6108-4a7b-953c-d93acd974a19-iptables-alerter-script\") pod \"iptables-alerter-r8nbs\" (UID: \"b6bf5cde-6108-4a7b-953c-d93acd974a19\") " pod="openshift-network-operator/iptables-alerter-r8nbs"
Apr 20 22:24:32.976205 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976150 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-etc-selinux\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.976205 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976173 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-slash\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.977017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976192 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 22:24:32.977017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976215 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.977017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976221 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-device-dir\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.977017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-run-netns\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.977017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976261 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-device-dir\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.977017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-host-run-multus-certs\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.977017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976328 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-cni-netd\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.977017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976300 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-etc-kubernetes\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.977017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976331 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-slash\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.977017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976336 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-etc-selinux\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.977017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976388 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-cnibin\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.977017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976423 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-hostroot\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.977017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e440b6a-d5a8-43fe-af3d-a999f8dce281-cnibin\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.977017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976492 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-cni-bin\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.977017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976518 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-socket-dir\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.977017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976528 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-lib-modules\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.977017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-systemd-units\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.977017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976614 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c9389f21-c437-4990-a923-b0ff03e3ba21-tmp-dir\") pod \"node-resolver-sjwlz\" (UID: \"c9389f21-c437-4990-a923-b0ff03e3ba21\") " pod="openshift-dns/node-resolver-sjwlz"
Apr 20 22:24:32.977858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976643 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-host-run-netns\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.977858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976648 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-lib-modules\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.977858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976699 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName:
\"kubernetes.io/host-path/b6bf5cde-6108-4a7b-953c-d93acd974a19-host-slash\") pod \"iptables-alerter-r8nbs\" (UID: \"b6bf5cde-6108-4a7b-953c-d93acd974a19\") " pod="openshift-network-operator/iptables-alerter-r8nbs"
Apr 20 22:24:32.977858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976711 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-host-run-netns\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.977858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-modprobe-d\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.977858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976750 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-var-lib-kubelet\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.977858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976770 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-systemd-units\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.977858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976802 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/456ba91d-0822-42ce-a041-f73b13a803c5-serviceca\") pod \"node-ca-zf7wp\" (UID: \"456ba91d-0822-42ce-a041-f73b13a803c5\") " pod="openshift-image-registry/node-ca-zf7wp"
Apr 20 22:24:32.977858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976817 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-var-lib-kubelet\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.977858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976830 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c237e12-2748-4be2-8f88-258e6064ea33-ovn-node-metrics-cert\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.977858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976855 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c237e12-2748-4be2-8f88-258e6064ea33-ovnkube-script-lib\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.977858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nj7sb\" (UniqueName: \"kubernetes.io/projected/c9389f21-c437-4990-a923-b0ff03e3ba21-kube-api-access-nj7sb\") pod \"node-resolver-sjwlz\" (UID: \"c9389f21-c437-4990-a923-b0ff03e3ba21\") " pod="openshift-dns/node-resolver-sjwlz"
Apr 20 22:24:32.977858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976903 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-modprobe-d\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.977858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976927 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f0c17cb1-e694-4fe6-8bfb-113e266578ab-cni-binary-copy\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.977858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976939 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c9389f21-c437-4990-a923-b0ff03e3ba21-tmp-dir\") pod \"node-resolver-sjwlz\" (UID: \"c9389f21-c437-4990-a923-b0ff03e3ba21\") " pod="openshift-dns/node-resolver-sjwlz"
Apr 20 22:24:32.977858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976952 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-multus-conf-dir\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.977858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-systemd\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.977858 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.976978 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3e440b6a-d5a8-43fe-af3d-a999f8dce281-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.978662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977001 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-run\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.978662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977045 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-run\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.978662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977054 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-systemd\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.978662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977112 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-multus-conf-dir\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.978662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977276 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/456ba91d-0822-42ce-a041-f73b13a803c5-serviceca\") pod \"node-ca-zf7wp\" (UID: \"456ba91d-0822-42ce-a041-f73b13a803c5\") " pod="openshift-image-registry/node-ca-zf7wp"
Apr 20 22:24:32.978662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977347 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-run-openvswitch\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.978662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e440b6a-d5a8-43fe-af3d-a999f8dce281-cni-binary-copy\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.978662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977430 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e440b6a-d5a8-43fe-af3d-a999f8dce281-tuning-conf-dir\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.978662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977478 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-multus-cni-dir\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.978662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977505 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-host-run-k8s-cni-cncf-io\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.978662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977531 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l29r6\" (UniqueName: \"kubernetes.io/projected/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-kube-api-access-l29r6\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.978662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-sysctl-conf\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.978662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977589 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c237e12-2748-4be2-8f88-258e6064ea33-ovnkube-script-lib\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.978662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977616 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9269q\" (UniqueName: \"kubernetes.io/projected/2f074946-73f8-4c67-9fb8-95e03ae600e5-kube-api-access-9269q\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.978662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977649 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-447m5\" (UniqueName: \"kubernetes.io/projected/456ba91d-0822-42ce-a041-f73b13a803c5-kube-api-access-447m5\") pod \"node-ca-zf7wp\" (UID: \"456ba91d-0822-42ce-a041-f73b13a803c5\") " pod="openshift-image-registry/node-ca-zf7wp"
Apr 20 22:24:32.978662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977704 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-log-socket\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.978662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977732 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-run-ovn-kubernetes\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.979443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c237e12-2748-4be2-8f88-258e6064ea33-ovnkube-config\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.979443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977786 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-system-cni-dir\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.979443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c9389f21-c437-4990-a923-b0ff03e3ba21-hosts-file\") pod \"node-resolver-sjwlz\" (UID: \"c9389f21-c437-4990-a923-b0ff03e3ba21\") " pod="openshift-dns/node-resolver-sjwlz"
Apr 20 22:24:32.979443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977900 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-multus-socket-dir-parent\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.979443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977930 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f0c17cb1-e694-4fe6-8bfb-113e266578ab-multus-daemon-config\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.979443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977955 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-kubernetes\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.979443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.977980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-sysctl-d\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.979443 ip-10-0-132-177 kubenswrapper[2575]: I0420 
22:24:32.978006 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c237e12-2748-4be2-8f88-258e6064ea33-env-overrides\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.979443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978032 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f0c17cb1-e694-4fe6-8bfb-113e266578ab-cni-binary-copy\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.979443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978032 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e440b6a-d5a8-43fe-af3d-a999f8dce281-system-cni-dir\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.979443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978078 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e440b6a-d5a8-43fe-af3d-a999f8dce281-cni-binary-copy\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.979443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978095 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-run-ovn-kubernetes\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.979443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978135 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-log-socket\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.979443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978173 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-run-openvswitch\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.979443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978215 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-sysctl-conf\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.979443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978257 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c9389f21-c437-4990-a923-b0ff03e3ba21-hosts-file\") pod \"node-resolver-sjwlz\" (UID: \"c9389f21-c437-4990-a923-b0ff03e3ba21\") " pod="openshift-dns/node-resolver-sjwlz"
Apr 20 22:24:32.979443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978310 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e440b6a-d5a8-43fe-af3d-a999f8dce281-tuning-conf-dir\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.979443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978085 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-sys\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.980286 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978353 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-run-systemd\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.980286 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978380 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-multus-cni-dir\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.980286 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978385 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-host-var-lib-cni-bin\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.980286 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978419 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-host-var-lib-cni-bin\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.980286 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978463 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-sys\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.980286 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978475 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-host-run-k8s-cni-cncf-io\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.980286 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978457 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-run-systemd\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.980286 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978508 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-sysconfig\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.980286 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978513 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-multus-socket-dir-parent\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.980286 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978551 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-host\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.980286 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978564 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-sysconfig\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.980286 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e440b6a-d5a8-43fe-af3d-a999f8dce281-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.980286 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978604 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-kubernetes\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.980286 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978614 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-os-release\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.980286 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978728 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c237e12-2748-4be2-8f88-258e6064ea33-ovnkube-config\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.980286 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978741 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-sysctl-d\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.980286 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978761 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f074946-73f8-4c67-9fb8-95e03ae600e5-host\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.980286 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978782 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-host-var-lib-cni-multus\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.981002 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978814 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-host-var-lib-kubelet\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.981002 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978803 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e440b6a-d5a8-43fe-af3d-a999f8dce281-system-cni-dir\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.981002 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978843 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ba8a927d-42db-4d3f-b6d1-938655219360-agent-certs\") pod \"konnectivity-agent-wglz9\" (UID: \"ba8a927d-42db-4d3f-b6d1-938655219360\") " pod="kube-system/konnectivity-agent-wglz9"
Apr 20 22:24:32.981002 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.978880 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-host-var-lib-cni-multus\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.981002 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979090 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-host-var-lib-kubelet\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.981002 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979095 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c237e12-2748-4be2-8f88-258e6064ea33-env-overrides\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.981002 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979112 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e440b6a-d5a8-43fe-af3d-a999f8dce281-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc"
Apr 20 22:24:32.981002 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979139 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-sys-fs\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.981002 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979152 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-system-cni-dir\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.981002 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979191 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/456ba91d-0822-42ce-a041-f73b13a803c5-host\") pod \"node-ca-zf7wp\" (UID: \"456ba91d-0822-42ce-a041-f73b13a803c5\") " pod="openshift-image-registry/node-ca-zf7wp"
Apr 20 22:24:32.981002 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979219 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-var-lib-openvswitch\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.981002 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979223 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f0c17cb1-e694-4fe6-8bfb-113e266578ab-os-release\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf"
Apr 20 22:24:32.981002 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979267 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-etc-openvswitch\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:24:32.981002 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979281 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-sys-fs\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq"
Apr 20 22:24:32.981002 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979290 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2f074946-73f8-4c67-9fb8-95e03ae600e5-tmp\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.981002 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979299 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-tuned\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk"
Apr 20 22:24:32.981002 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979322 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-var-lib-openvswitch\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.981002 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979364 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-kubelet\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.981647 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979369 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/456ba91d-0822-42ce-a041-f73b13a803c5-host\") pod \"node-ca-zf7wp\" (UID: \"456ba91d-0822-42ce-a041-f73b13a803c5\") " pod="openshift-image-registry/node-ca-zf7wp" Apr 20 22:24:32.981647 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979418 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-host-kubelet\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.981647 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979468 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-run-ovn\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.981647 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979529 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-run-ovn\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.981647 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.979935 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c237e12-2748-4be2-8f88-258e6064ea33-etc-openvswitch\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.981647 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.980380 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f0c17cb1-e694-4fe6-8bfb-113e266578ab-multus-daemon-config\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf" Apr 20 22:24:32.981647 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.980865 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c237e12-2748-4be2-8f88-258e6064ea33-ovn-node-metrics-cert\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.981647 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.981592 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ba8a927d-42db-4d3f-b6d1-938655219360-agent-certs\") pod \"konnectivity-agent-wglz9\" (UID: \"ba8a927d-42db-4d3f-b6d1-938655219360\") " pod="kube-system/konnectivity-agent-wglz9" Apr 20 22:24:32.981909 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.981755 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/2f074946-73f8-4c67-9fb8-95e03ae600e5-etc-tuned\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk" Apr 20 22:24:32.993868 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.993837 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k4vh\" (UniqueName: \"kubernetes.io/projected/f0c17cb1-e694-4fe6-8bfb-113e266578ab-kube-api-access-7k4vh\") pod \"multus-5m9gf\" (UID: \"f0c17cb1-e694-4fe6-8bfb-113e266578ab\") " pod="openshift-multus/multus-5m9gf" Apr 20 22:24:32.995161 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:32.995141 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 22:24:32.995254 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:32.995167 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 22:24:32.995254 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:32.995178 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kb44g for pod openshift-network-diagnostics/network-check-target-vjqq9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:32.995254 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:32.995238 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g podName:e9c331e6-87b9-45b5-9c22-016575eec846 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:33.495218308 +0000 UTC m=+3.109374847 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kb44g" (UniqueName: "kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g") pod "network-check-target-vjqq9" (UID: "e9c331e6-87b9-45b5-9c22-016575eec846") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:32.998051 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.998026 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9269q\" (UniqueName: \"kubernetes.io/projected/2f074946-73f8-4c67-9fb8-95e03ae600e5-kube-api-access-9269q\") pod \"tuned-s8vzk\" (UID: \"2f074946-73f8-4c67-9fb8-95e03ae600e5\") " pod="openshift-cluster-node-tuning-operator/tuned-s8vzk" Apr 20 22:24:32.998584 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.998556 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lszxs\" (UniqueName: \"kubernetes.io/projected/2c237e12-2748-4be2-8f88-258e6064ea33-kube-api-access-lszxs\") pod \"ovnkube-node-rp7bw\" (UID: \"2c237e12-2748-4be2-8f88-258e6064ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:32.998759 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.998733 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l29r6\" (UniqueName: \"kubernetes.io/projected/b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231-kube-api-access-l29r6\") pod \"aws-ebs-csi-driver-node-4chjq\" (UID: \"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq" Apr 20 22:24:32.998947 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.998918 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj7sb\" (UniqueName: \"kubernetes.io/projected/c9389f21-c437-4990-a923-b0ff03e3ba21-kube-api-access-nj7sb\") pod \"node-resolver-sjwlz\" (UID: 
\"c9389f21-c437-4990-a923-b0ff03e3ba21\") " pod="openshift-dns/node-resolver-sjwlz" Apr 20 22:24:32.999150 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:32.999129 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfn65\" (UniqueName: \"kubernetes.io/projected/3e440b6a-d5a8-43fe-af3d-a999f8dce281-kube-api-access-wfn65\") pod \"multus-additional-cni-plugins-92xbc\" (UID: \"3e440b6a-d5a8-43fe-af3d-a999f8dce281\") " pod="openshift-multus/multus-additional-cni-plugins-92xbc" Apr 20 22:24:33.000126 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.000106 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-447m5\" (UniqueName: \"kubernetes.io/projected/456ba91d-0822-42ce-a041-f73b13a803c5-kube-api-access-447m5\") pod \"node-ca-zf7wp\" (UID: \"456ba91d-0822-42ce-a041-f73b13a803c5\") " pod="openshift-image-registry/node-ca-zf7wp" Apr 20 22:24:33.013990 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.013943 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-177.ec2.internal" event={"ID":"27bb6de254cf19a2989bb62d9580d525","Type":"ContainerStarted","Data":"8c2119d8ee99e336b64fb996469bba16c3b4658ee4b161f732396ffe2a787a71"} Apr 20 22:24:33.014979 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.014956 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal" event={"ID":"0bdb7009a5728b8ce9a7775b0d8bb75e","Type":"ContainerStarted","Data":"bda9fbbae0368c5473d08c34cad6bec0cc702b13e3158344954f16940c3b6c50"} Apr 20 22:24:33.080591 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.080550 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs\") pod \"network-metrics-daemon-qg2mj\" (UID: 
\"5add223c-497e-4cc3-863e-339b6f999506\") " pod="openshift-multus/network-metrics-daemon-qg2mj" Apr 20 22:24:33.080774 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.080601 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9xmf\" (UniqueName: \"kubernetes.io/projected/b6bf5cde-6108-4a7b-953c-d93acd974a19-kube-api-access-f9xmf\") pod \"iptables-alerter-r8nbs\" (UID: \"b6bf5cde-6108-4a7b-953c-d93acd974a19\") " pod="openshift-network-operator/iptables-alerter-r8nbs" Apr 20 22:24:33.080774 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.080634 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbcqj\" (UniqueName: \"kubernetes.io/projected/5add223c-497e-4cc3-863e-339b6f999506-kube-api-access-lbcqj\") pod \"network-metrics-daemon-qg2mj\" (UID: \"5add223c-497e-4cc3-863e-339b6f999506\") " pod="openshift-multus/network-metrics-daemon-qg2mj" Apr 20 22:24:33.080774 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.080697 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b6bf5cde-6108-4a7b-953c-d93acd974a19-iptables-alerter-script\") pod \"iptables-alerter-r8nbs\" (UID: \"b6bf5cde-6108-4a7b-953c-d93acd974a19\") " pod="openshift-network-operator/iptables-alerter-r8nbs" Apr 20 22:24:33.080774 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:33.080713 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:24:33.080774 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.080729 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b6bf5cde-6108-4a7b-953c-d93acd974a19-host-slash\") pod \"iptables-alerter-r8nbs\" (UID: \"b6bf5cde-6108-4a7b-953c-d93acd974a19\") " 
pod="openshift-network-operator/iptables-alerter-r8nbs" Apr 20 22:24:33.081061 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:33.080796 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs podName:5add223c-497e-4cc3-863e-339b6f999506 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:33.580775165 +0000 UTC m=+3.194931722 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs") pod "network-metrics-daemon-qg2mj" (UID: "5add223c-497e-4cc3-863e-339b6f999506") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:24:33.081061 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.080822 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b6bf5cde-6108-4a7b-953c-d93acd974a19-host-slash\") pod \"iptables-alerter-r8nbs\" (UID: \"b6bf5cde-6108-4a7b-953c-d93acd974a19\") " pod="openshift-network-operator/iptables-alerter-r8nbs" Apr 20 22:24:33.081349 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.081328 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b6bf5cde-6108-4a7b-953c-d93acd974a19-iptables-alerter-script\") pod \"iptables-alerter-r8nbs\" (UID: \"b6bf5cde-6108-4a7b-953c-d93acd974a19\") " pod="openshift-network-operator/iptables-alerter-r8nbs" Apr 20 22:24:33.093224 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.093146 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9xmf\" (UniqueName: \"kubernetes.io/projected/b6bf5cde-6108-4a7b-953c-d93acd974a19-kube-api-access-f9xmf\") pod \"iptables-alerter-r8nbs\" (UID: \"b6bf5cde-6108-4a7b-953c-d93acd974a19\") " pod="openshift-network-operator/iptables-alerter-r8nbs" Apr 20 
22:24:33.097011 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.096978 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbcqj\" (UniqueName: \"kubernetes.io/projected/5add223c-497e-4cc3-863e-339b6f999506-kube-api-access-lbcqj\") pod \"network-metrics-daemon-qg2mj\" (UID: \"5add223c-497e-4cc3-863e-339b6f999506\") " pod="openshift-multus/network-metrics-daemon-qg2mj" Apr 20 22:24:33.098933 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.098910 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-nh9q7"] Apr 20 22:24:33.101790 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.101769 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:24:33.101901 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:33.101845 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nh9q7" podUID="72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33" Apr 20 22:24:33.168308 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.168277 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq" Apr 20 22:24:33.179070 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.179042 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zf7wp" Apr 20 22:24:33.181856 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.181836 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-dbus\") pod \"global-pull-secret-syncer-nh9q7\" (UID: \"72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33\") " pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:24:33.181919 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.181869 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret\") pod \"global-pull-secret-syncer-nh9q7\" (UID: \"72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33\") " pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:24:33.181919 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.181909 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-kubelet-config\") pod \"global-pull-secret-syncer-nh9q7\" (UID: \"72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33\") " pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:24:33.188646 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.188622 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5m9gf" Apr 20 22:24:33.193306 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.193284 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s8vzk" Apr 20 22:24:33.201975 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.201953 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-wglz9" Apr 20 22:24:33.210723 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.210700 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:33.218404 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.218376 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sjwlz" Apr 20 22:24:33.225934 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.225901 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-92xbc" Apr 20 22:24:33.234546 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.234517 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-r8nbs" Apr 20 22:24:33.249540 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.249516 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 22:24:33.283076 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.283042 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret\") pod \"global-pull-secret-syncer-nh9q7\" (UID: \"72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33\") " pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:24:33.283246 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.283094 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-kubelet-config\") pod \"global-pull-secret-syncer-nh9q7\" (UID: \"72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33\") " pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:24:33.283246 ip-10-0-132-177 
kubenswrapper[2575]: I0420 22:24:33.283134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-dbus\") pod \"global-pull-secret-syncer-nh9q7\" (UID: \"72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33\") " pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:24:33.283246 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:33.283194 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 22:24:33.283246 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.283237 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-kubelet-config\") pod \"global-pull-secret-syncer-nh9q7\" (UID: \"72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33\") " pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:24:33.283449 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:33.283263 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret podName:72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:33.783248806 +0000 UTC m=+3.397405342 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret") pod "global-pull-secret-syncer-nh9q7" (UID: "72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33") : object "kube-system"/"original-pull-secret" not registered Apr 20 22:24:33.283449 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.283281 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-dbus\") pod \"global-pull-secret-syncer-nh9q7\" (UID: \"72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33\") " pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:24:33.585715 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.585662 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs\") pod \"network-metrics-daemon-qg2mj\" (UID: \"5add223c-497e-4cc3-863e-339b6f999506\") " pod="openshift-multus/network-metrics-daemon-qg2mj" Apr 20 22:24:33.585883 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.585726 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb44g\" (UniqueName: \"kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g\") pod \"network-check-target-vjqq9\" (UID: \"e9c331e6-87b9-45b5-9c22-016575eec846\") " pod="openshift-network-diagnostics/network-check-target-vjqq9" Apr 20 22:24:33.585883 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:33.585794 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:24:33.585883 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:33.585843 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 22:24:33.585883 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:33.585855 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 22:24:33.585883 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:33.585864 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kb44g for pod openshift-network-diagnostics/network-check-target-vjqq9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:33.585883 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:33.585875 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs podName:5add223c-497e-4cc3-863e-339b6f999506 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:34.585856136 +0000 UTC m=+4.200012677 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs") pod "network-metrics-daemon-qg2mj" (UID: "5add223c-497e-4cc3-863e-339b6f999506") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:24:33.586162 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:33.585900 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g podName:e9c331e6-87b9-45b5-9c22-016575eec846 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:34.585888237 +0000 UTC m=+4.200044773 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kb44g" (UniqueName: "kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g") pod "network-check-target-vjqq9" (UID: "e9c331e6-87b9-45b5-9c22-016575eec846") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:33.787523 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.787290 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret\") pod \"global-pull-secret-syncer-nh9q7\" (UID: \"72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33\") " pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:24:33.787630 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:33.787439 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 22:24:33.787710 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:33.787657 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret podName:72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:34.787639632 +0000 UTC m=+4.401796174 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret") pod "global-pull-secret-syncer-nh9q7" (UID: "72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33") : object "kube-system"/"original-pull-secret" not registered
Apr 20 22:24:33.796229 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:33.796201 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb80f6f39_9e21_4ca3_a8d5_cc8ee7b04231.slice/crio-a712b63835d4b96579ccf25183780a0a8a22435dc64e5faf8508d1f972fa419b WatchSource:0}: Error finding container a712b63835d4b96579ccf25183780a0a8a22435dc64e5faf8508d1f972fa419b: Status 404 returned error can't find the container with id a712b63835d4b96579ccf25183780a0a8a22435dc64e5faf8508d1f972fa419b
Apr 20 22:24:33.802130 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:33.802101 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c237e12_2748_4be2_8f88_258e6064ea33.slice/crio-6b5c0ab366cc591453179076426e124ac821e1a73b2c5e744ee5113474dc0ef1 WatchSource:0}: Error finding container 6b5c0ab366cc591453179076426e124ac821e1a73b2c5e744ee5113474dc0ef1: Status 404 returned error can't find the container with id 6b5c0ab366cc591453179076426e124ac821e1a73b2c5e744ee5113474dc0ef1
Apr 20 22:24:33.802997 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:33.802971 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f074946_73f8_4c67_9fb8_95e03ae600e5.slice/crio-edde5664fd6f2462c0774d017798bb613c4ec5f8e34ffb9b34b78bd7621c2353 WatchSource:0}: Error finding container edde5664fd6f2462c0774d017798bb613c4ec5f8e34ffb9b34b78bd7621c2353: Status 404 returned error can't find the container with id edde5664fd6f2462c0774d017798bb613c4ec5f8e34ffb9b34b78bd7621c2353
Apr 20 22:24:33.806077 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:33.804195 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba8a927d_42db_4d3f_b6d1_938655219360.slice/crio-4b37eb33d646a292c0391fc3dc33bb763d9c9d4bd71c4dfc53ca7f6decde901f WatchSource:0}: Error finding container 4b37eb33d646a292c0391fc3dc33bb763d9c9d4bd71c4dfc53ca7f6decde901f: Status 404 returned error can't find the container with id 4b37eb33d646a292c0391fc3dc33bb763d9c9d4bd71c4dfc53ca7f6decde901f
Apr 20 22:24:33.806077 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:33.804803 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e440b6a_d5a8_43fe_af3d_a999f8dce281.slice/crio-bc9a76e8d93beb3e65b083504b5b063ef7856d3997e2d8d7419e0f79e2269dc8 WatchSource:0}: Error finding container bc9a76e8d93beb3e65b083504b5b063ef7856d3997e2d8d7419e0f79e2269dc8: Status 404 returned error can't find the container with id bc9a76e8d93beb3e65b083504b5b063ef7856d3997e2d8d7419e0f79e2269dc8
Apr 20 22:24:33.806262 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:33.806194 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9389f21_c437_4990_a923_b0ff03e3ba21.slice/crio-21b9571cf228fd1ca9d7d941c138bb010c4f98c2e55e3e9400f44f11b8725aa6 WatchSource:0}: Error finding container 21b9571cf228fd1ca9d7d941c138bb010c4f98c2e55e3e9400f44f11b8725aa6: Status 404 returned error can't find the container with id 21b9571cf228fd1ca9d7d941c138bb010c4f98c2e55e3e9400f44f11b8725aa6
Apr 20 22:24:33.807406 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:33.807188 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod456ba91d_0822_42ce_a041_f73b13a803c5.slice/crio-9243b6aa5f30bd5cf87985b5f0b7647989c0fe81ac418dcf52c8aaa79d42ed76 WatchSource:0}: Error finding container 9243b6aa5f30bd5cf87985b5f0b7647989c0fe81ac418dcf52c8aaa79d42ed76: Status 404 returned error can't find the container with id 9243b6aa5f30bd5cf87985b5f0b7647989c0fe81ac418dcf52c8aaa79d42ed76
Apr 20 22:24:33.807851 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:24:33.807816 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6bf5cde_6108_4a7b_953c_d93acd974a19.slice/crio-d0dcb187d1279cba4f06f12a2a6e8bd76441ca812158d70dbeec34a0215f5e9b WatchSource:0}: Error finding container d0dcb187d1279cba4f06f12a2a6e8bd76441ca812158d70dbeec34a0215f5e9b: Status 404 returned error can't find the container with id d0dcb187d1279cba4f06f12a2a6e8bd76441ca812158d70dbeec34a0215f5e9b
Apr 20 22:24:33.901938 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.901903 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 22:19:31 +0000 UTC" deadline="2027-09-16 07:30:54.362159557 +0000 UTC"
Apr 20 22:24:33.901938 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:33.901937 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12321h6m20.460226401s"
Apr 20 22:24:34.008733 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:34.008707 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:24:34.008874 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:34.008804 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vjqq9" podUID="e9c331e6-87b9-45b5-9c22-016575eec846"
Apr 20 22:24:34.017197 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:34.017164 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sjwlz" event={"ID":"c9389f21-c437-4990-a923-b0ff03e3ba21","Type":"ContainerStarted","Data":"21b9571cf228fd1ca9d7d941c138bb010c4f98c2e55e3e9400f44f11b8725aa6"}
Apr 20 22:24:34.018233 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:34.018208 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92xbc" event={"ID":"3e440b6a-d5a8-43fe-af3d-a999f8dce281","Type":"ContainerStarted","Data":"bc9a76e8d93beb3e65b083504b5b063ef7856d3997e2d8d7419e0f79e2269dc8"}
Apr 20 22:24:34.019181 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:34.019149 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" event={"ID":"2c237e12-2748-4be2-8f88-258e6064ea33","Type":"ContainerStarted","Data":"6b5c0ab366cc591453179076426e124ac821e1a73b2c5e744ee5113474dc0ef1"}
Apr 20 22:24:34.020023 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:34.020005 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq" event={"ID":"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231","Type":"ContainerStarted","Data":"a712b63835d4b96579ccf25183780a0a8a22435dc64e5faf8508d1f972fa419b"}
Apr 20 22:24:34.021495 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:34.021474 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-177.ec2.internal" event={"ID":"27bb6de254cf19a2989bb62d9580d525","Type":"ContainerStarted","Data":"0fdb48a9fe3a4dacdd31fc2c8048072ed260234691076b196dd677301b81b0b5"}
Apr 20 22:24:34.022504 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:34.022486 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-r8nbs" event={"ID":"b6bf5cde-6108-4a7b-953c-d93acd974a19","Type":"ContainerStarted","Data":"d0dcb187d1279cba4f06f12a2a6e8bd76441ca812158d70dbeec34a0215f5e9b"}
Apr 20 22:24:34.023460 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:34.023439 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wglz9" event={"ID":"ba8a927d-42db-4d3f-b6d1-938655219360","Type":"ContainerStarted","Data":"4b37eb33d646a292c0391fc3dc33bb763d9c9d4bd71c4dfc53ca7f6decde901f"}
Apr 20 22:24:34.024350 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:34.024331 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s8vzk" event={"ID":"2f074946-73f8-4c67-9fb8-95e03ae600e5","Type":"ContainerStarted","Data":"edde5664fd6f2462c0774d017798bb613c4ec5f8e34ffb9b34b78bd7621c2353"}
Apr 20 22:24:34.025366 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:34.025343 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5m9gf" event={"ID":"f0c17cb1-e694-4fe6-8bfb-113e266578ab","Type":"ContainerStarted","Data":"9eafc797284f6c00d4aaf1f902823189961775aecf8666e864f76d0e5acaf03e"}
Apr 20 22:24:34.028809 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:34.026569 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zf7wp" event={"ID":"456ba91d-0822-42ce-a041-f73b13a803c5","Type":"ContainerStarted","Data":"9243b6aa5f30bd5cf87985b5f0b7647989c0fe81ac418dcf52c8aaa79d42ed76"}
Apr 20 22:24:34.034980 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:34.034936 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-177.ec2.internal" podStartSLOduration=2.034925642 podStartE2EDuration="2.034925642s" podCreationTimestamp="2026-04-20 22:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:24:34.034868409 +0000 UTC m=+3.649024979" watchObservedRunningTime="2026-04-20 22:24:34.034925642 +0000 UTC m=+3.649082201"
Apr 20 22:24:34.594300 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:34.594263 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs\") pod \"network-metrics-daemon-qg2mj\" (UID: \"5add223c-497e-4cc3-863e-339b6f999506\") " pod="openshift-multus/network-metrics-daemon-qg2mj"
Apr 20 22:24:34.594473 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:34.594313 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb44g\" (UniqueName: \"kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g\") pod \"network-check-target-vjqq9\" (UID: \"e9c331e6-87b9-45b5-9c22-016575eec846\") " pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:24:34.594473 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:34.594443 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 22:24:34.594473 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:34.594455 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 22:24:34.594473 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:34.594464 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kb44g for pod openshift-network-diagnostics/network-check-target-vjqq9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 22:24:34.594639 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:34.594509 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g podName:e9c331e6-87b9-45b5-9c22-016575eec846 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:36.594495626 +0000 UTC m=+6.208652163 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kb44g" (UniqueName: "kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g") pod "network-check-target-vjqq9" (UID: "e9c331e6-87b9-45b5-9c22-016575eec846") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 22:24:34.594864 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:34.594782 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:24:34.594864 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:34.594841 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs podName:5add223c-497e-4cc3-863e-339b6f999506 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:36.594823919 +0000 UTC m=+6.208980462 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs") pod "network-metrics-daemon-qg2mj" (UID: "5add223c-497e-4cc3-863e-339b6f999506") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:24:34.799531 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:34.799469 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret\") pod \"global-pull-secret-syncer-nh9q7\" (UID: \"72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33\") " pod="kube-system/global-pull-secret-syncer-nh9q7"
Apr 20 22:24:34.799737 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:34.799650 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 22:24:34.799737 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:34.799733 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret podName:72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:36.799714486 +0000 UTC m=+6.413871030 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret") pod "global-pull-secret-syncer-nh9q7" (UID: "72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33") : object "kube-system"/"original-pull-secret" not registered
Apr 20 22:24:35.009876 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:35.009841 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj"
Apr 20 22:24:35.010324 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:35.009986 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qg2mj" podUID="5add223c-497e-4cc3-863e-339b6f999506"
Apr 20 22:24:35.010423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:35.010405 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7"
Apr 20 22:24:35.010518 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:35.010498 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nh9q7" podUID="72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33"
Apr 20 22:24:35.044224 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:35.044186 2575 generic.go:358] "Generic (PLEG): container finished" podID="0bdb7009a5728b8ce9a7775b0d8bb75e" containerID="240b249211eb5cf7ded497f51121de3f2a6b3fc6802df1e85f320bc5fb4c1b66" exitCode=0
Apr 20 22:24:35.045134 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:35.045106 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal" event={"ID":"0bdb7009a5728b8ce9a7775b0d8bb75e","Type":"ContainerDied","Data":"240b249211eb5cf7ded497f51121de3f2a6b3fc6802df1e85f320bc5fb4c1b66"}
Apr 20 22:24:36.010136 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:36.009457 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:24:36.010136 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:36.009591 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vjqq9" podUID="e9c331e6-87b9-45b5-9c22-016575eec846"
Apr 20 22:24:36.085877 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:36.085083 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal" event={"ID":"0bdb7009a5728b8ce9a7775b0d8bb75e","Type":"ContainerStarted","Data":"002f7d59abde20139cceda59339e745707d0f110145eca1b2aa307538fb5e384"}
Apr 20 22:24:36.103429 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:36.103106 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-177.ec2.internal" podStartSLOduration=4.103085823 podStartE2EDuration="4.103085823s" podCreationTimestamp="2026-04-20 22:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:24:36.101988602 +0000 UTC m=+5.716145163" watchObservedRunningTime="2026-04-20 22:24:36.103085823 +0000 UTC m=+5.717242384"
Apr 20 22:24:36.616408 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:36.615593 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs\") pod \"network-metrics-daemon-qg2mj\" (UID: \"5add223c-497e-4cc3-863e-339b6f999506\") " pod="openshift-multus/network-metrics-daemon-qg2mj"
Apr 20 22:24:36.616408 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:36.615655 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb44g\" (UniqueName: \"kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g\") pod \"network-check-target-vjqq9\" (UID: \"e9c331e6-87b9-45b5-9c22-016575eec846\") " pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:24:36.616408 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:36.615829 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 22:24:36.616408 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:36.615849 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 22:24:36.616408 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:36.615869 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kb44g for pod openshift-network-diagnostics/network-check-target-vjqq9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 22:24:36.616408 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:36.615928 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g podName:e9c331e6-87b9-45b5-9c22-016575eec846 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:40.615908997 +0000 UTC m=+10.230065539 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-kb44g" (UniqueName: "kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g") pod "network-check-target-vjqq9" (UID: "e9c331e6-87b9-45b5-9c22-016575eec846") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 22:24:36.616408 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:36.616317 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:24:36.616408 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:36.616368 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs podName:5add223c-497e-4cc3-863e-339b6f999506 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:40.616351002 +0000 UTC m=+10.230507546 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs") pod "network-metrics-daemon-qg2mj" (UID: "5add223c-497e-4cc3-863e-339b6f999506") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:24:36.816958 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:36.816915 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret\") pod \"global-pull-secret-syncer-nh9q7\" (UID: \"72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33\") " pod="kube-system/global-pull-secret-syncer-nh9q7"
Apr 20 22:24:36.817131 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:36.817096 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 22:24:36.817184 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:36.817159 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret podName:72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:40.817141158 +0000 UTC m=+10.431297710 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret") pod "global-pull-secret-syncer-nh9q7" (UID: "72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33") : object "kube-system"/"original-pull-secret" not registered
Apr 20 22:24:37.012083 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:37.011433 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7"
Apr 20 22:24:37.012083 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:37.011543 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nh9q7" podUID="72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33"
Apr 20 22:24:37.012083 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:37.011606 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj"
Apr 20 22:24:37.012083 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:37.011709 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qg2mj" podUID="5add223c-497e-4cc3-863e-339b6f999506"
Apr 20 22:24:38.009321 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:38.008720 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:24:38.009321 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:38.008876 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vjqq9" podUID="e9c331e6-87b9-45b5-9c22-016575eec846"
Apr 20 22:24:39.011448 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:39.011411 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7"
Apr 20 22:24:39.011892 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:39.011535 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nh9q7" podUID="72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33"
Apr 20 22:24:39.011892 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:39.011640 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj"
Apr 20 22:24:39.011892 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:39.011734 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qg2mj" podUID="5add223c-497e-4cc3-863e-339b6f999506"
Apr 20 22:24:40.009441 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:40.009405 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:24:40.009641 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:40.009509 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vjqq9" podUID="e9c331e6-87b9-45b5-9c22-016575eec846"
Apr 20 22:24:40.650967 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:40.650922 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb44g\" (UniqueName: \"kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g\") pod \"network-check-target-vjqq9\" (UID: \"e9c331e6-87b9-45b5-9c22-016575eec846\") " pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:24:40.651426 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:40.650992 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs\") pod \"network-metrics-daemon-qg2mj\" (UID: \"5add223c-497e-4cc3-863e-339b6f999506\") " pod="openshift-multus/network-metrics-daemon-qg2mj"
Apr 20 22:24:40.651426 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:40.651081 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:24:40.651426 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:40.651128 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs podName:5add223c-497e-4cc3-863e-339b6f999506 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:48.651114531 +0000 UTC m=+18.265271068 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs") pod "network-metrics-daemon-qg2mj" (UID: "5add223c-497e-4cc3-863e-339b6f999506") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:24:40.651597 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:40.651447 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 22:24:40.651597 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:40.651458 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 22:24:40.651597 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:40.651467 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kb44g for pod openshift-network-diagnostics/network-check-target-vjqq9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 22:24:40.651597 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:40.651501 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g podName:e9c331e6-87b9-45b5-9c22-016575eec846 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:48.651489049 +0000 UTC m=+18.265645586 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-kb44g" (UniqueName: "kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g") pod "network-check-target-vjqq9" (UID: "e9c331e6-87b9-45b5-9c22-016575eec846") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 22:24:40.852159 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:40.852124 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret\") pod \"global-pull-secret-syncer-nh9q7\" (UID: \"72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33\") " pod="kube-system/global-pull-secret-syncer-nh9q7"
Apr 20 22:24:40.852337 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:40.852252 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 22:24:40.852337 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:40.852315 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret podName:72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:48.852298377 +0000 UTC m=+18.466454915 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret") pod "global-pull-secret-syncer-nh9q7" (UID: "72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33") : object "kube-system"/"original-pull-secret" not registered
Apr 20 22:24:41.010235 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:41.010155 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj"
Apr 20 22:24:41.010397 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:41.010300 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qg2mj" podUID="5add223c-497e-4cc3-863e-339b6f999506"
Apr 20 22:24:41.010706 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:41.010685 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7"
Apr 20 22:24:41.010812 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:41.010792 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nh9q7" podUID="72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33"
Apr 20 22:24:42.009330 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:42.009282 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:24:42.009810 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:42.009422 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vjqq9" podUID="e9c331e6-87b9-45b5-9c22-016575eec846"
Apr 20 22:24:43.009604 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:43.009029 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7"
Apr 20 22:24:43.009604 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:43.009073 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj"
Apr 20 22:24:43.009604 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:43.009160 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nh9q7" podUID="72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33"
Apr 20 22:24:43.009604 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:43.009547 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qg2mj" podUID="5add223c-497e-4cc3-863e-339b6f999506"
Apr 20 22:24:44.009086 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:44.009046 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:24:44.009266 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:44.009169 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vjqq9" podUID="e9c331e6-87b9-45b5-9c22-016575eec846"
Apr 20 22:24:45.009120 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:45.009083 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7"
Apr 20 22:24:45.009580 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:45.009212 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nh9q7" podUID="72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33"
Apr 20 22:24:45.009580 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:45.009266 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj"
Apr 20 22:24:45.009580 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:45.009381 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qg2mj" podUID="5add223c-497e-4cc3-863e-339b6f999506"
Apr 20 22:24:46.009533 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:46.009497 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:24:46.010002 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:46.009610 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vjqq9" podUID="e9c331e6-87b9-45b5-9c22-016575eec846"
Apr 20 22:24:47.009129 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:47.009087 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj"
Apr 20 22:24:47.009303 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:47.009087 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7"
Apr 20 22:24:47.009303 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:47.009231 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qg2mj" podUID="5add223c-497e-4cc3-863e-339b6f999506"
Apr 20 22:24:47.009435 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:47.009317 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nh9q7" podUID="72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33"
Apr 20 22:24:48.008757 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:48.008717 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:24:48.009216 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:48.008850 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vjqq9" podUID="e9c331e6-87b9-45b5-9c22-016575eec846" Apr 20 22:24:48.714490 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:48.714448 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs\") pod \"network-metrics-daemon-qg2mj\" (UID: \"5add223c-497e-4cc3-863e-339b6f999506\") " pod="openshift-multus/network-metrics-daemon-qg2mj" Apr 20 22:24:48.714490 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:48.714498 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb44g\" (UniqueName: \"kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g\") pod \"network-check-target-vjqq9\" (UID: \"e9c331e6-87b9-45b5-9c22-016575eec846\") " pod="openshift-network-diagnostics/network-check-target-vjqq9" Apr 20 22:24:48.714855 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:48.714620 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:24:48.714855 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:48.714661 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 22:24:48.714855 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:48.714690 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 22:24:48.714855 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:48.714700 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kb44g for pod openshift-network-diagnostics/network-check-target-vjqq9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:48.714855 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:48.714704 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs podName:5add223c-497e-4cc3-863e-339b6f999506 nodeName:}" failed. No retries permitted until 2026-04-20 22:25:04.714689029 +0000 UTC m=+34.328845581 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs") pod "network-metrics-daemon-qg2mj" (UID: "5add223c-497e-4cc3-863e-339b6f999506") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:24:48.714855 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:48.714742 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g podName:e9c331e6-87b9-45b5-9c22-016575eec846 nodeName:}" failed. No retries permitted until 2026-04-20 22:25:04.714727145 +0000 UTC m=+34.328883684 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kb44g" (UniqueName: "kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g") pod "network-check-target-vjqq9" (UID: "e9c331e6-87b9-45b5-9c22-016575eec846") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:48.916152 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:48.916101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret\") pod \"global-pull-secret-syncer-nh9q7\" (UID: \"72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33\") " pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:24:48.916302 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:48.916227 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 22:24:48.916302 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:48.916291 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret podName:72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33 nodeName:}" failed. No retries permitted until 2026-04-20 22:25:04.916277672 +0000 UTC m=+34.530434208 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret") pod "global-pull-secret-syncer-nh9q7" (UID: "72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33") : object "kube-system"/"original-pull-secret" not registered Apr 20 22:24:49.009888 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:49.009373 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:24:49.009888 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:49.009493 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nh9q7" podUID="72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33" Apr 20 22:24:49.009888 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:49.009604 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj" Apr 20 22:24:49.009888 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:49.009722 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qg2mj" podUID="5add223c-497e-4cc3-863e-339b6f999506" Apr 20 22:24:50.009003 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:50.008961 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9" Apr 20 22:24:50.009200 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:50.009090 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vjqq9" podUID="e9c331e6-87b9-45b5-9c22-016575eec846" Apr 20 22:24:51.010103 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:51.010062 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:24:51.010460 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:51.010179 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nh9q7" podUID="72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33" Apr 20 22:24:51.010460 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:51.010258 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj" Apr 20 22:24:51.010460 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:51.010404 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qg2mj" podUID="5add223c-497e-4cc3-863e-339b6f999506" Apr 20 22:24:52.009401 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.009079 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9" Apr 20 22:24:52.009560 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:52.009486 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vjqq9" podUID="e9c331e6-87b9-45b5-9c22-016575eec846" Apr 20 22:24:52.111742 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.111712 2575 generic.go:358] "Generic (PLEG): container finished" podID="3e440b6a-d5a8-43fe-af3d-a999f8dce281" containerID="96fa725a857485131855adcc34f966be2d97cf9025aa737905d830efa73e7064" exitCode=0 Apr 20 22:24:52.112459 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.111795 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92xbc" event={"ID":"3e440b6a-d5a8-43fe-af3d-a999f8dce281","Type":"ContainerDied","Data":"96fa725a857485131855adcc34f966be2d97cf9025aa737905d830efa73e7064"} Apr 20 22:24:52.114374 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.114353 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rp7bw_2c237e12-2748-4be2-8f88-258e6064ea33/ovn-acl-logging/0.log" Apr 20 22:24:52.114666 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.114648 2575 generic.go:358] "Generic (PLEG): container finished" podID="2c237e12-2748-4be2-8f88-258e6064ea33" containerID="7c78f0f77feac247218c371f8423177ad222b486a0899b9b5b96d29d4449aef8" exitCode=1 Apr 20 22:24:52.114755 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.114709 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" 
event={"ID":"2c237e12-2748-4be2-8f88-258e6064ea33","Type":"ContainerStarted","Data":"c3b6532a8519ee540a6d8996cd40ccb6c94ede0b9cd63e93ed2b2e287f011f4c"} Apr 20 22:24:52.114755 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.114747 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" event={"ID":"2c237e12-2748-4be2-8f88-258e6064ea33","Type":"ContainerStarted","Data":"f20c9a7fab0ed2a15851e9992247ef27749d3daeeb048259b948e2ad6a9fe3d5"} Apr 20 22:24:52.114848 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.114763 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" event={"ID":"2c237e12-2748-4be2-8f88-258e6064ea33","Type":"ContainerStarted","Data":"3c294ba61a6468a5a941ae4fd3611ed6dac728ef5de6cb28d930434bcb41a59a"} Apr 20 22:24:52.114848 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.114776 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" event={"ID":"2c237e12-2748-4be2-8f88-258e6064ea33","Type":"ContainerDied","Data":"7c78f0f77feac247218c371f8423177ad222b486a0899b9b5b96d29d4449aef8"} Apr 20 22:24:52.114848 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.114795 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" event={"ID":"2c237e12-2748-4be2-8f88-258e6064ea33","Type":"ContainerStarted","Data":"3896bbd3a3367ba92ee0b0d9257c5b1c8f4c5414e673d7ae0ca96f3bf496819a"} Apr 20 22:24:52.115978 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.115949 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq" event={"ID":"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231","Type":"ContainerStarted","Data":"3422b6e92e413dd525ab1acd33ab312a0b8b3c20dc5cd982cf82267eca766b2b"} Apr 20 22:24:52.117011 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.116985 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/konnectivity-agent-wglz9" event={"ID":"ba8a927d-42db-4d3f-b6d1-938655219360","Type":"ContainerStarted","Data":"dbd48df9dc5806c65edcbdfc5fd017d5cdad87b2532ca9c2dc8f219e723ac5fd"} Apr 20 22:24:52.118036 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.118006 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s8vzk" event={"ID":"2f074946-73f8-4c67-9fb8-95e03ae600e5","Type":"ContainerStarted","Data":"879cb4ba6ca0392ddb6a963ac7a3c1bc60b5e50c50ceaeebd8d1c9a0d2d34eef"} Apr 20 22:24:52.119252 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.119233 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5m9gf" event={"ID":"f0c17cb1-e694-4fe6-8bfb-113e266578ab","Type":"ContainerStarted","Data":"d8bd19566725866e04f532521147d8035b954df18a85eca69944cf10b88fa027"} Apr 20 22:24:52.120445 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.120425 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zf7wp" event={"ID":"456ba91d-0822-42ce-a041-f73b13a803c5","Type":"ContainerStarted","Data":"89d7af5f8eec110bfa2055d1d51cde8d9cc04cad0f59a9195e8cf3bdc035f6ae"} Apr 20 22:24:52.121480 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.121463 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sjwlz" event={"ID":"c9389f21-c437-4990-a923-b0ff03e3ba21","Type":"ContainerStarted","Data":"be47a7f8141535b43b350586dba10b5d9afab899beba62bdcfe9e35d123b2a82"} Apr 20 22:24:52.153052 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.153004 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5m9gf" podStartSLOduration=3.678785536 podStartE2EDuration="21.152987545s" podCreationTimestamp="2026-04-20 22:24:31 +0000 UTC" firstStartedPulling="2026-04-20 22:24:33.801240753 +0000 UTC m=+3.415397289" lastFinishedPulling="2026-04-20 22:24:51.275442757 +0000 UTC m=+20.889599298" 
observedRunningTime="2026-04-20 22:24:52.152690184 +0000 UTC m=+21.766846743" watchObservedRunningTime="2026-04-20 22:24:52.152987545 +0000 UTC m=+21.767144104" Apr 20 22:24:52.168048 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.168000 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-s8vzk" podStartSLOduration=3.870431238 podStartE2EDuration="21.167979767s" podCreationTimestamp="2026-04-20 22:24:31 +0000 UTC" firstStartedPulling="2026-04-20 22:24:33.80647645 +0000 UTC m=+3.420632987" lastFinishedPulling="2026-04-20 22:24:51.104024963 +0000 UTC m=+20.718181516" observedRunningTime="2026-04-20 22:24:52.167696488 +0000 UTC m=+21.781853040" watchObservedRunningTime="2026-04-20 22:24:52.167979767 +0000 UTC m=+21.782136338" Apr 20 22:24:52.170936 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.170908 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-wglz9" Apr 20 22:24:52.171538 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.171515 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-wglz9" Apr 20 22:24:52.181087 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.181028 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-wglz9" podStartSLOduration=3.883407964 podStartE2EDuration="21.181012438s" podCreationTimestamp="2026-04-20 22:24:31 +0000 UTC" firstStartedPulling="2026-04-20 22:24:33.806424041 +0000 UTC m=+3.420580584" lastFinishedPulling="2026-04-20 22:24:51.104028521 +0000 UTC m=+20.718185058" observedRunningTime="2026-04-20 22:24:52.180888888 +0000 UTC m=+21.795045448" watchObservedRunningTime="2026-04-20 22:24:52.181012438 +0000 UTC m=+21.795169001" Apr 20 22:24:52.193592 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.193541 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/node-ca-zf7wp" podStartSLOduration=3.892544646 podStartE2EDuration="21.193523744s" podCreationTimestamp="2026-04-20 22:24:31 +0000 UTC" firstStartedPulling="2026-04-20 22:24:33.830772267 +0000 UTC m=+3.444928820" lastFinishedPulling="2026-04-20 22:24:51.131751378 +0000 UTC m=+20.745907918" observedRunningTime="2026-04-20 22:24:52.193264517 +0000 UTC m=+21.807421080" watchObservedRunningTime="2026-04-20 22:24:52.193523744 +0000 UTC m=+21.807680304" Apr 20 22:24:52.207335 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.207286 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-sjwlz" podStartSLOduration=3.779308126 podStartE2EDuration="21.207266705s" podCreationTimestamp="2026-04-20 22:24:31 +0000 UTC" firstStartedPulling="2026-04-20 22:24:33.83141093 +0000 UTC m=+3.445567470" lastFinishedPulling="2026-04-20 22:24:51.259369512 +0000 UTC m=+20.873526049" observedRunningTime="2026-04-20 22:24:52.206590109 +0000 UTC m=+21.820746669" watchObservedRunningTime="2026-04-20 22:24:52.207266705 +0000 UTC m=+21.821423264" Apr 20 22:24:52.878393 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.878145 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 22:24:52.938654 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.938530 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T22:24:52.878383643Z","UUID":"adeddc57-c8c4-4b9f-a0cc-cd09ab82358e","Handler":null,"Name":"","Endpoint":""} Apr 20 22:24:52.941890 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.941793 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 
22:24:52.941890 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:52.941827 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 22:24:53.009061 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:53.008969 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj" Apr 20 22:24:53.009239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:53.008969 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:24:53.009239 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:53.009099 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qg2mj" podUID="5add223c-497e-4cc3-863e-339b6f999506" Apr 20 22:24:53.009239 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:53.009189 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-nh9q7" podUID="72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33" Apr 20 22:24:53.126328 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:53.126303 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rp7bw_2c237e12-2748-4be2-8f88-258e6064ea33/ovn-acl-logging/0.log" Apr 20 22:24:53.126822 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:53.126709 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" event={"ID":"2c237e12-2748-4be2-8f88-258e6064ea33","Type":"ContainerStarted","Data":"7fa4238f4a908b93d922a919218377dab9f6a1fee134f8f5035dd28f1a8c4c80"} Apr 20 22:24:53.128528 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:53.128504 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq" event={"ID":"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231","Type":"ContainerStarted","Data":"4aa18f62e6f249aed8f23b45f013cdb10a6c639bd88a49e9d669cea5765637f0"} Apr 20 22:24:53.129981 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:53.129955 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-r8nbs" event={"ID":"b6bf5cde-6108-4a7b-953c-d93acd974a19","Type":"ContainerStarted","Data":"66bad4ebe2cae112220887e28541f4c30b959a984c044a8898b0532d1be26146"} Apr 20 22:24:53.131159 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:53.131125 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-wglz9" Apr 20 22:24:53.131247 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:53.131178 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-wglz9" Apr 20 22:24:53.144166 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:53.144116 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-r8nbs" 
podStartSLOduration=4.699688395 podStartE2EDuration="22.144097844s" podCreationTimestamp="2026-04-20 22:24:31 +0000 UTC" firstStartedPulling="2026-04-20 22:24:33.831098533 +0000 UTC m=+3.445255072" lastFinishedPulling="2026-04-20 22:24:51.275507972 +0000 UTC m=+20.889664521" observedRunningTime="2026-04-20 22:24:53.144084688 +0000 UTC m=+22.758241247" watchObservedRunningTime="2026-04-20 22:24:53.144097844 +0000 UTC m=+22.758254404" Apr 20 22:24:54.008989 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:54.008956 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9" Apr 20 22:24:54.009200 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:54.009089 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vjqq9" podUID="e9c331e6-87b9-45b5-9c22-016575eec846" Apr 20 22:24:55.009514 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:55.009478 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:24:55.010276 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:55.009615 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nh9q7" podUID="72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33" Apr 20 22:24:55.010276 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:55.009701 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj" Apr 20 22:24:55.010276 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:55.009812 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qg2mj" podUID="5add223c-497e-4cc3-863e-339b6f999506" Apr 20 22:24:55.136539 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:55.136512 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rp7bw_2c237e12-2748-4be2-8f88-258e6064ea33/ovn-acl-logging/0.log" Apr 20 22:24:55.136917 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:55.136892 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" event={"ID":"2c237e12-2748-4be2-8f88-258e6064ea33","Type":"ContainerStarted","Data":"dd5f9e57368b8699f7a00c534cd0e085fe5d13b2f4f82b9994f9cfdb641ed463"} Apr 20 22:24:55.139143 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:55.139110 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq" event={"ID":"b80f6f39-9e21-4ca3-a8d5-cc8ee7b04231","Type":"ContainerStarted","Data":"65fc0c463bd3bc949fe45a64e76303c525e2052822288b9fe4d9dc1077e02214"} Apr 20 22:24:55.156977 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:55.156925 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4chjq" podStartSLOduration=3.751133942 podStartE2EDuration="24.156907553s" podCreationTimestamp="2026-04-20 22:24:31 +0000 UTC" firstStartedPulling="2026-04-20 22:24:33.798123325 +0000 UTC m=+3.412279862" lastFinishedPulling="2026-04-20 22:24:54.203896925 +0000 UTC m=+23.818053473" 
observedRunningTime="2026-04-20 22:24:55.155074825 +0000 UTC m=+24.769231384" watchObservedRunningTime="2026-04-20 22:24:55.156907553 +0000 UTC m=+24.771064111" Apr 20 22:24:56.008733 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:56.008691 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9" Apr 20 22:24:56.008925 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:56.008816 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vjqq9" podUID="e9c331e6-87b9-45b5-9c22-016575eec846" Apr 20 22:24:57.009267 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:57.009029 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:24:57.009698 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:57.009088 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj" Apr 20 22:24:57.009698 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:57.009295 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-nh9q7" podUID="72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33" Apr 20 22:24:57.009698 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:57.009415 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qg2mj" podUID="5add223c-497e-4cc3-863e-339b6f999506" Apr 20 22:24:57.144299 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:57.144264 2575 generic.go:358] "Generic (PLEG): container finished" podID="3e440b6a-d5a8-43fe-af3d-a999f8dce281" containerID="dbb7749245b2cfdda8a3feba8c84969edb7811cc066feec29a5d68587a5d6d1e" exitCode=0 Apr 20 22:24:57.144490 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:57.144358 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92xbc" event={"ID":"3e440b6a-d5a8-43fe-af3d-a999f8dce281","Type":"ContainerDied","Data":"dbb7749245b2cfdda8a3feba8c84969edb7811cc066feec29a5d68587a5d6d1e"} Apr 20 22:24:57.147484 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:57.147467 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rp7bw_2c237e12-2748-4be2-8f88-258e6064ea33/ovn-acl-logging/0.log" Apr 20 22:24:57.147862 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:57.147843 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" event={"ID":"2c237e12-2748-4be2-8f88-258e6064ea33","Type":"ContainerStarted","Data":"3811cd4c21da143ffe449aa287790903e788de7bc8c34d9c74e40157c0f7630d"} Apr 20 22:24:57.148181 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:57.148167 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 
20 22:24:57.148414 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:57.148396 2575 scope.go:117] "RemoveContainer" containerID="7c78f0f77feac247218c371f8423177ad222b486a0899b9b5b96d29d4449aef8" Apr 20 22:24:57.166042 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:57.166019 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:58.009446 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:58.009412 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9" Apr 20 22:24:58.009823 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:58.009547 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vjqq9" podUID="e9c331e6-87b9-45b5-9c22-016575eec846" Apr 20 22:24:58.152412 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:58.152370 2575 generic.go:358] "Generic (PLEG): container finished" podID="3e440b6a-d5a8-43fe-af3d-a999f8dce281" containerID="dd7e53412649d3fee93e102291cfa42ac61dd57398ef39afc6c7b279b709c7ca" exitCode=0 Apr 20 22:24:58.152583 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:58.152432 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92xbc" event={"ID":"3e440b6a-d5a8-43fe-af3d-a999f8dce281","Type":"ContainerDied","Data":"dd7e53412649d3fee93e102291cfa42ac61dd57398ef39afc6c7b279b709c7ca"} Apr 20 22:24:58.156184 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:58.156159 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rp7bw_2c237e12-2748-4be2-8f88-258e6064ea33/ovn-acl-logging/0.log" Apr 20 
22:24:58.156583 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:58.156545 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" event={"ID":"2c237e12-2748-4be2-8f88-258e6064ea33","Type":"ContainerStarted","Data":"2acb2ec864c35ed961a9a4f518be692e3f3b337ab799812517ca7c6a976f5d5e"} Apr 20 22:24:58.156720 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:58.156707 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 22:24:58.156979 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:58.156954 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:58.173424 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:58.173395 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:24:58.200216 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:58.200150 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" podStartSLOduration=9.686583468 podStartE2EDuration="27.200129187s" podCreationTimestamp="2026-04-20 22:24:31 +0000 UTC" firstStartedPulling="2026-04-20 22:24:33.804420403 +0000 UTC m=+3.418576941" lastFinishedPulling="2026-04-20 22:24:51.317966121 +0000 UTC m=+20.932122660" observedRunningTime="2026-04-20 22:24:58.1985615 +0000 UTC m=+27.812718060" watchObservedRunningTime="2026-04-20 22:24:58.200129187 +0000 UTC m=+27.814285747" Apr 20 22:24:58.480688 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:58.480446 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nh9q7"] Apr 20 22:24:58.480875 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:58.480820 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:24:58.480978 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:58.480937 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nh9q7" podUID="72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33" Apr 20 22:24:58.483549 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:58.483525 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qg2mj"] Apr 20 22:24:58.483651 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:58.483633 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj" Apr 20 22:24:58.483757 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:58.483740 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qg2mj" podUID="5add223c-497e-4cc3-863e-339b6f999506" Apr 20 22:24:58.486565 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:58.486543 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vjqq9"] Apr 20 22:24:58.486666 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:58.486629 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9" Apr 20 22:24:58.486769 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:24:58.486749 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vjqq9" podUID="e9c331e6-87b9-45b5-9c22-016575eec846" Apr 20 22:24:59.160612 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:59.160583 2575 generic.go:358] "Generic (PLEG): container finished" podID="3e440b6a-d5a8-43fe-af3d-a999f8dce281" containerID="d76f4b47a20b2f0c2e650fce1616c9dd8da186ce666cc20eadc98b454743e34d" exitCode=0 Apr 20 22:24:59.161082 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:59.160653 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92xbc" event={"ID":"3e440b6a-d5a8-43fe-af3d-a999f8dce281","Type":"ContainerDied","Data":"d76f4b47a20b2f0c2e650fce1616c9dd8da186ce666cc20eadc98b454743e34d"} Apr 20 22:24:59.161082 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:59.160820 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 22:24:59.251284 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:24:59.251244 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" Apr 20 22:25:00.008986 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:00.008930 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:25:00.009193 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:00.008948 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9" Apr 20 22:25:00.009193 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:00.009058 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj" Apr 20 22:25:00.009193 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:00.009059 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nh9q7" podUID="72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33" Apr 20 22:25:00.009193 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:00.009163 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vjqq9" podUID="e9c331e6-87b9-45b5-9c22-016575eec846" Apr 20 22:25:00.009365 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:00.009259 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qg2mj" podUID="5add223c-497e-4cc3-863e-339b6f999506" Apr 20 22:25:01.178284 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:01.178232 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw" podUID="2c237e12-2748-4be2-8f88-258e6064ea33" containerName="ovnkube-controller" probeResult="failure" output="" Apr 20 22:25:02.009374 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:02.009331 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9" Apr 20 22:25:02.009374 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:02.009361 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj" Apr 20 22:25:02.009639 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:02.009332 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:25:02.009639 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:02.009456 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vjqq9" podUID="e9c331e6-87b9-45b5-9c22-016575eec846" Apr 20 22:25:02.009639 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:02.009539 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-nh9q7" podUID="72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33" Apr 20 22:25:02.009849 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:02.009666 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qg2mj" podUID="5add223c-497e-4cc3-863e-339b6f999506" Apr 20 22:25:04.009353 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.009319 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj" Apr 20 22:25:04.009353 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.009360 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7" Apr 20 22:25:04.009897 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.009376 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9" Apr 20 22:25:04.009897 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:04.009448 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vjqq9" podUID="e9c331e6-87b9-45b5-9c22-016575eec846" Apr 20 22:25:04.009897 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:04.009576 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qg2mj" podUID="5add223c-497e-4cc3-863e-339b6f999506" Apr 20 22:25:04.009897 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:04.009654 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nh9q7" podUID="72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33" Apr 20 22:25:04.211421 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.211021 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-177.ec2.internal" event="NodeReady" Apr 20 22:25:04.211421 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.211197 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 22:25:04.245070 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.245030 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-598d4bbdbc-hc4q7"] Apr 20 22:25:04.275995 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.275916 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7bc89"] Apr 20 22:25:04.276170 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.276116 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7" Apr 20 22:25:04.279206 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.278991 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6fksq\"" Apr 20 22:25:04.279364 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.279345 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 22:25:04.279715 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.279696 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 22:25:04.279715 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.279697 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 22:25:04.284583 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.284561 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 22:25:04.291022 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.290999 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-598d4bbdbc-hc4q7"] Apr 20 22:25:04.291022 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.291025 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-87rj9"] Apr 20 22:25:04.291190 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.291163 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7bc89" Apr 20 22:25:04.293613 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.293592 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 22:25:04.293749 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.293614 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bl4mg\"" Apr 20 22:25:04.293749 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.293633 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 22:25:04.307321 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.307295 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7bc89"] Apr 20 22:25:04.307321 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.307330 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-87rj9"] Apr 20 22:25:04.307532 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.307343 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-lbljv"] Apr 20 22:25:04.308006 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.307983 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-87rj9" Apr 20 22:25:04.310446 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.310426 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 22:25:04.310555 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.310452 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bv5w5\"" Apr 20 22:25:04.310782 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.310759 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 22:25:04.311234 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.311215 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 22:25:04.333016 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.332988 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lbljv"] Apr 20 22:25:04.333180 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.333157 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lbljv" Apr 20 22:25:04.336319 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.335909 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 22:25:04.336319 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.336014 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 22:25:04.336319 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.336152 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 22:25:04.336319 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.335918 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 22:25:04.336319 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.336311 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rsxbh\"" Apr 20 22:25:04.432809 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.432756 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/814cbee0-89a6-4755-8d2c-bb2ca9cb16d0-crio-socket\") pod \"insights-runtime-extractor-lbljv\" (UID: \"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0\") " pod="openshift-insights/insights-runtime-extractor-lbljv" Apr 20 22:25:04.432809 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.432816 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/238c0ea5-4742-4d5c-b685-8c4aab704f3c-tmp-dir\") pod \"dns-default-7bc89\" (UID: \"238c0ea5-4742-4d5c-b685-8c4aab704f3c\") " pod="openshift-dns/dns-default-7bc89" 
Apr 20 22:25:04.433049 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.432841 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lqnv\" (UniqueName: \"kubernetes.io/projected/238c0ea5-4742-4d5c-b685-8c4aab704f3c-kube-api-access-5lqnv\") pod \"dns-default-7bc89\" (UID: \"238c0ea5-4742-4d5c-b685-8c4aab704f3c\") " pod="openshift-dns/dns-default-7bc89" Apr 20 22:25:04.433049 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.432900 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-registry-certificates\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7" Apr 20 22:25:04.433049 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.432926 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/238c0ea5-4742-4d5c-b685-8c4aab704f3c-metrics-tls\") pod \"dns-default-7bc89\" (UID: \"238c0ea5-4742-4d5c-b685-8c4aab704f3c\") " pod="openshift-dns/dns-default-7bc89" Apr 20 22:25:04.433049 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.432954 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-image-registry-private-configuration\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7" Apr 20 22:25:04.433049 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.432974 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-bound-sa-token\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7" Apr 20 22:25:04.433049 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.433006 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-trusted-ca\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7" Apr 20 22:25:04.433049 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.433036 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq6qt\" (UniqueName: \"kubernetes.io/projected/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-kube-api-access-sq6qt\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7" Apr 20 22:25:04.433400 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.433060 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/814cbee0-89a6-4755-8d2c-bb2ca9cb16d0-data-volume\") pod \"insights-runtime-extractor-lbljv\" (UID: \"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0\") " pod="openshift-insights/insights-runtime-extractor-lbljv" Apr 20 22:25:04.433400 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.433086 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frpp6\" (UniqueName: \"kubernetes.io/projected/814cbee0-89a6-4755-8d2c-bb2ca9cb16d0-kube-api-access-frpp6\") pod \"insights-runtime-extractor-lbljv\" (UID: \"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0\") " 
pod="openshift-insights/insights-runtime-extractor-lbljv" Apr 20 22:25:04.433400 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.433117 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-installation-pull-secrets\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7" Apr 20 22:25:04.433400 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.433142 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/814cbee0-89a6-4755-8d2c-bb2ca9cb16d0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lbljv\" (UID: \"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0\") " pod="openshift-insights/insights-runtime-extractor-lbljv" Apr 20 22:25:04.433400 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.433171 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-registry-tls\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7" Apr 20 22:25:04.433400 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.433200 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3473a30-a4b9-4d21-9b2f-83594665ed99-cert\") pod \"ingress-canary-87rj9\" (UID: \"c3473a30-a4b9-4d21-9b2f-83594665ed99\") " pod="openshift-ingress-canary/ingress-canary-87rj9" Apr 20 22:25:04.433400 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.433217 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/238c0ea5-4742-4d5c-b685-8c4aab704f3c-config-volume\") pod \"dns-default-7bc89\" (UID: \"238c0ea5-4742-4d5c-b685-8c4aab704f3c\") " pod="openshift-dns/dns-default-7bc89" Apr 20 22:25:04.433400 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.433242 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wmzj\" (UniqueName: \"kubernetes.io/projected/c3473a30-a4b9-4d21-9b2f-83594665ed99-kube-api-access-4wmzj\") pod \"ingress-canary-87rj9\" (UID: \"c3473a30-a4b9-4d21-9b2f-83594665ed99\") " pod="openshift-ingress-canary/ingress-canary-87rj9" Apr 20 22:25:04.433400 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.433279 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-ca-trust-extracted\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7" Apr 20 22:25:04.433400 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.433297 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/814cbee0-89a6-4755-8d2c-bb2ca9cb16d0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lbljv\" (UID: \"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0\") " pod="openshift-insights/insights-runtime-extractor-lbljv" Apr 20 22:25:04.534742 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.534594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-registry-tls\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " 
pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:04.534742 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.534652 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3473a30-a4b9-4d21-9b2f-83594665ed99-cert\") pod \"ingress-canary-87rj9\" (UID: \"c3473a30-a4b9-4d21-9b2f-83594665ed99\") " pod="openshift-ingress-canary/ingress-canary-87rj9"
Apr 20 22:25:04.534742 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.534693 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/238c0ea5-4742-4d5c-b685-8c4aab704f3c-config-volume\") pod \"dns-default-7bc89\" (UID: \"238c0ea5-4742-4d5c-b685-8c4aab704f3c\") " pod="openshift-dns/dns-default-7bc89"
Apr 20 22:25:04.534742 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.534733 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wmzj\" (UniqueName: \"kubernetes.io/projected/c3473a30-a4b9-4d21-9b2f-83594665ed99-kube-api-access-4wmzj\") pod \"ingress-canary-87rj9\" (UID: \"c3473a30-a4b9-4d21-9b2f-83594665ed99\") " pod="openshift-ingress-canary/ingress-canary-87rj9"
Apr 20 22:25:04.535069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.534765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-ca-trust-extracted\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:04.535069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.534792 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/814cbee0-89a6-4755-8d2c-bb2ca9cb16d0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lbljv\" (UID: \"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0\") " pod="openshift-insights/insights-runtime-extractor-lbljv"
Apr 20 22:25:04.535069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.534820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/814cbee0-89a6-4755-8d2c-bb2ca9cb16d0-crio-socket\") pod \"insights-runtime-extractor-lbljv\" (UID: \"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0\") " pod="openshift-insights/insights-runtime-extractor-lbljv"
Apr 20 22:25:04.535069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.534856 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/238c0ea5-4742-4d5c-b685-8c4aab704f3c-tmp-dir\") pod \"dns-default-7bc89\" (UID: \"238c0ea5-4742-4d5c-b685-8c4aab704f3c\") " pod="openshift-dns/dns-default-7bc89"
Apr 20 22:25:04.535069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.534879 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lqnv\" (UniqueName: \"kubernetes.io/projected/238c0ea5-4742-4d5c-b685-8c4aab704f3c-kube-api-access-5lqnv\") pod \"dns-default-7bc89\" (UID: \"238c0ea5-4742-4d5c-b685-8c4aab704f3c\") " pod="openshift-dns/dns-default-7bc89"
Apr 20 22:25:04.535069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.534909 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-registry-certificates\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:04.535069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.534930 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/238c0ea5-4742-4d5c-b685-8c4aab704f3c-metrics-tls\") pod \"dns-default-7bc89\" (UID: \"238c0ea5-4742-4d5c-b685-8c4aab704f3c\") " pod="openshift-dns/dns-default-7bc89"
Apr 20 22:25:04.535069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.534956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-image-registry-private-configuration\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:04.535069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.534992 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-bound-sa-token\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:04.535069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.535024 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-trusted-ca\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:04.535069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.535049 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sq6qt\" (UniqueName: \"kubernetes.io/projected/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-kube-api-access-sq6qt\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:04.535069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.535074 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/814cbee0-89a6-4755-8d2c-bb2ca9cb16d0-data-volume\") pod \"insights-runtime-extractor-lbljv\" (UID: \"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0\") " pod="openshift-insights/insights-runtime-extractor-lbljv"
Apr 20 22:25:04.535637 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.535102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frpp6\" (UniqueName: \"kubernetes.io/projected/814cbee0-89a6-4755-8d2c-bb2ca9cb16d0-kube-api-access-frpp6\") pod \"insights-runtime-extractor-lbljv\" (UID: \"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0\") " pod="openshift-insights/insights-runtime-extractor-lbljv"
Apr 20 22:25:04.535637 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.535152 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-installation-pull-secrets\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:04.535637 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.535180 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/814cbee0-89a6-4755-8d2c-bb2ca9cb16d0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lbljv\" (UID: \"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0\") " pod="openshift-insights/insights-runtime-extractor-lbljv"
Apr 20 22:25:04.535637 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.535387 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/238c0ea5-4742-4d5c-b685-8c4aab704f3c-config-volume\") pod \"dns-default-7bc89\" (UID: \"238c0ea5-4742-4d5c-b685-8c4aab704f3c\") " pod="openshift-dns/dns-default-7bc89"
Apr 20 22:25:04.535637 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.535571 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/238c0ea5-4742-4d5c-b685-8c4aab704f3c-tmp-dir\") pod \"dns-default-7bc89\" (UID: \"238c0ea5-4742-4d5c-b685-8c4aab704f3c\") " pod="openshift-dns/dns-default-7bc89"
Apr 20 22:25:04.535877 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.535639 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-ca-trust-extracted\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:04.535877 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.535740 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/814cbee0-89a6-4755-8d2c-bb2ca9cb16d0-crio-socket\") pod \"insights-runtime-extractor-lbljv\" (UID: \"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0\") " pod="openshift-insights/insights-runtime-extractor-lbljv"
Apr 20 22:25:04.535877 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.535809 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/814cbee0-89a6-4755-8d2c-bb2ca9cb16d0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lbljv\" (UID: \"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0\") " pod="openshift-insights/insights-runtime-extractor-lbljv"
Apr 20 22:25:04.536443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.536417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/814cbee0-89a6-4755-8d2c-bb2ca9cb16d0-data-volume\") pod \"insights-runtime-extractor-lbljv\" (UID: \"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0\") " pod="openshift-insights/insights-runtime-extractor-lbljv"
Apr 20 22:25:04.536569 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.536533 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-registry-certificates\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:04.537117 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.537046 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-trusted-ca\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:04.539642 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.539616 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/814cbee0-89a6-4755-8d2c-bb2ca9cb16d0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lbljv\" (UID: \"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0\") " pod="openshift-insights/insights-runtime-extractor-lbljv"
Apr 20 22:25:04.539642 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.539629 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/238c0ea5-4742-4d5c-b685-8c4aab704f3c-metrics-tls\") pod \"dns-default-7bc89\" (UID: \"238c0ea5-4742-4d5c-b685-8c4aab704f3c\") " pod="openshift-dns/dns-default-7bc89"
Apr 20 22:25:04.541363 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.541342 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-registry-tls\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:04.543413 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.542988 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3473a30-a4b9-4d21-9b2f-83594665ed99-cert\") pod \"ingress-canary-87rj9\" (UID: \"c3473a30-a4b9-4d21-9b2f-83594665ed99\") " pod="openshift-ingress-canary/ingress-canary-87rj9"
Apr 20 22:25:04.543413 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.543326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-installation-pull-secrets\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:04.543566 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.543546 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-image-registry-private-configuration\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:04.544087 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.544063 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wmzj\" (UniqueName: \"kubernetes.io/projected/c3473a30-a4b9-4d21-9b2f-83594665ed99-kube-api-access-4wmzj\") pod \"ingress-canary-87rj9\" (UID: \"c3473a30-a4b9-4d21-9b2f-83594665ed99\") " pod="openshift-ingress-canary/ingress-canary-87rj9"
Apr 20 22:25:04.544909 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.544887 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-bound-sa-token\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:04.545102 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.545086 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lqnv\" (UniqueName: \"kubernetes.io/projected/238c0ea5-4742-4d5c-b685-8c4aab704f3c-kube-api-access-5lqnv\") pod \"dns-default-7bc89\" (UID: \"238c0ea5-4742-4d5c-b685-8c4aab704f3c\") " pod="openshift-dns/dns-default-7bc89"
Apr 20 22:25:04.545563 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.545520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq6qt\" (UniqueName: \"kubernetes.io/projected/eadeda0e-6eb7-49ca-aa1c-dd1002554f51-kube-api-access-sq6qt\") pod \"image-registry-598d4bbdbc-hc4q7\" (UID: \"eadeda0e-6eb7-49ca-aa1c-dd1002554f51\") " pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:04.546740 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.546714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frpp6\" (UniqueName: \"kubernetes.io/projected/814cbee0-89a6-4755-8d2c-bb2ca9cb16d0-kube-api-access-frpp6\") pod \"insights-runtime-extractor-lbljv\" (UID: \"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0\") " pod="openshift-insights/insights-runtime-extractor-lbljv"
Apr 20 22:25:04.590824 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.590786 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:04.601777 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.601745 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7bc89"
Apr 20 22:25:04.618520 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.618487 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-87rj9"
Apr 20 22:25:04.644484 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.644453 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lbljv"
Apr 20 22:25:04.737642 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.737604 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs\") pod \"network-metrics-daemon-qg2mj\" (UID: \"5add223c-497e-4cc3-863e-339b6f999506\") " pod="openshift-multus/network-metrics-daemon-qg2mj"
Apr 20 22:25:04.737642 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.737649 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb44g\" (UniqueName: \"kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g\") pod \"network-check-target-vjqq9\" (UID: \"e9c331e6-87b9-45b5-9c22-016575eec846\") " pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:25:04.737888 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:04.737782 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:25:04.737888 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:04.737863 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs podName:5add223c-497e-4cc3-863e-339b6f999506 nodeName:}" failed. No retries permitted until 2026-04-20 22:25:36.737847987 +0000 UTC m=+66.352004523 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs") pod "network-metrics-daemon-qg2mj" (UID: "5add223c-497e-4cc3-863e-339b6f999506") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:25:04.737888 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:04.737786 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 22:25:04.737991 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:04.737890 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 22:25:04.737991 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:04.737900 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kb44g for pod openshift-network-diagnostics/network-check-target-vjqq9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 22:25:04.737991 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:04.737953 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g podName:e9c331e6-87b9-45b5-9c22-016575eec846 nodeName:}" failed. No retries permitted until 2026-04-20 22:25:36.737939453 +0000 UTC m=+66.352095990 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-kb44g" (UniqueName: "kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g") pod "network-check-target-vjqq9" (UID: "e9c331e6-87b9-45b5-9c22-016575eec846") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 22:25:04.939207 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:04.939173 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret\") pod \"global-pull-secret-syncer-nh9q7\" (UID: \"72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33\") " pod="kube-system/global-pull-secret-syncer-nh9q7"
Apr 20 22:25:04.939395 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:04.939335 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 22:25:04.939395 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:04.939384 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret podName:72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33 nodeName:}" failed. No retries permitted until 2026-04-20 22:25:36.939371489 +0000 UTC m=+66.553528026 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret") pod "global-pull-secret-syncer-nh9q7" (UID: "72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33") : object "kube-system"/"original-pull-secret" not registered
Apr 20 22:25:05.002045 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:05.002009 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-598d4bbdbc-hc4q7"]
Apr 20 22:25:05.004379 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:05.004356 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7bc89"]
Apr 20 22:25:05.007234 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:05.007209 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lbljv"]
Apr 20 22:25:05.008224 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:25:05.008197 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeadeda0e_6eb7_49ca_aa1c_dd1002554f51.slice/crio-6f9ac534f9930a6015c2b1c63edffc84b850bd24decd14be2f1e10edd29bb3c3 WatchSource:0}: Error finding container 6f9ac534f9930a6015c2b1c63edffc84b850bd24decd14be2f1e10edd29bb3c3: Status 404 returned error can't find the container with id 6f9ac534f9930a6015c2b1c63edffc84b850bd24decd14be2f1e10edd29bb3c3
Apr 20 22:25:05.008829 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:25:05.008793 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod238c0ea5_4742_4d5c_b685_8c4aab704f3c.slice/crio-98644efe78007246b6aa1a6cdd31482c72eea8af860027ac6575b2289922c4d5 WatchSource:0}: Error finding container 98644efe78007246b6aa1a6cdd31482c72eea8af860027ac6575b2289922c4d5: Status 404 returned error can't find the container with id 98644efe78007246b6aa1a6cdd31482c72eea8af860027ac6575b2289922c4d5
Apr 20 22:25:05.010846 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:25:05.010823 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod814cbee0_89a6_4755_8d2c_bb2ca9cb16d0.slice/crio-dab13995fc114e5145b6e90d5f7e3af131bd1cedb56c28af3350311b4240e149 WatchSource:0}: Error finding container dab13995fc114e5145b6e90d5f7e3af131bd1cedb56c28af3350311b4240e149: Status 404 returned error can't find the container with id dab13995fc114e5145b6e90d5f7e3af131bd1cedb56c28af3350311b4240e149
Apr 20 22:25:05.015390 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:05.015372 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-87rj9"]
Apr 20 22:25:05.017830 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:25:05.017808 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3473a30_a4b9_4d21_9b2f_83594665ed99.slice/crio-a584c9ddb2514f7a5050910c5f5ff2c2c0e5ee7a9c1bfa43be36856cd054c369 WatchSource:0}: Error finding container a584c9ddb2514f7a5050910c5f5ff2c2c0e5ee7a9c1bfa43be36856cd054c369: Status 404 returned error can't find the container with id a584c9ddb2514f7a5050910c5f5ff2c2c0e5ee7a9c1bfa43be36856cd054c369
Apr 20 22:25:05.176740 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:05.176480 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92xbc" event={"ID":"3e440b6a-d5a8-43fe-af3d-a999f8dce281","Type":"ContainerStarted","Data":"7e2ff2ab4b6e9113130e07ac0e991da264d773ab464536ce6a7a4f754d627889"}
Apr 20 22:25:05.177524 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:05.177497 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7bc89" event={"ID":"238c0ea5-4742-4d5c-b685-8c4aab704f3c","Type":"ContainerStarted","Data":"98644efe78007246b6aa1a6cdd31482c72eea8af860027ac6575b2289922c4d5"}
Apr 20 22:25:05.179160 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:05.179135 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lbljv" event={"ID":"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0","Type":"ContainerStarted","Data":"ca5473b39bb7901b73a3d9eb05d8cd9e0dd8f2f3e9df59d8f3e4c8ad07903d99"}
Apr 20 22:25:05.179272 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:05.179164 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lbljv" event={"ID":"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0","Type":"ContainerStarted","Data":"dab13995fc114e5145b6e90d5f7e3af131bd1cedb56c28af3350311b4240e149"}
Apr 20 22:25:05.180328 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:05.180307 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7" event={"ID":"eadeda0e-6eb7-49ca-aa1c-dd1002554f51","Type":"ContainerStarted","Data":"74d68151edfb63c058f0a512b86ad77b9a932fb8e9d8be6c2ce73b8eb5d7ecd5"}
Apr 20 22:25:05.180420 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:05.180335 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7" event={"ID":"eadeda0e-6eb7-49ca-aa1c-dd1002554f51","Type":"ContainerStarted","Data":"6f9ac534f9930a6015c2b1c63edffc84b850bd24decd14be2f1e10edd29bb3c3"}
Apr 20 22:25:05.180772 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:05.180753 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:05.181854 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:05.181829 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-87rj9" event={"ID":"c3473a30-a4b9-4d21-9b2f-83594665ed99","Type":"ContainerStarted","Data":"a584c9ddb2514f7a5050910c5f5ff2c2c0e5ee7a9c1bfa43be36856cd054c369"}
Apr 20 22:25:05.198634 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:05.198589 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7" podStartSLOduration=9.198573055 podStartE2EDuration="9.198573055s" podCreationTimestamp="2026-04-20 22:24:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:25:05.19826154 +0000 UTC m=+34.812418123" watchObservedRunningTime="2026-04-20 22:25:05.198573055 +0000 UTC m=+34.812729614"
Apr 20 22:25:06.009155 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.009113 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:25:06.009155 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.009145 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7"
Apr 20 22:25:06.009520 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.009228 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj"
Apr 20 22:25:06.013508 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.013485 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 22:25:06.013508 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.013504 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 22:25:06.014216 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.013553 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 22:25:06.014216 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.013483 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 22:25:06.014216 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.013697 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ddtmq\""
Apr 20 22:25:06.014216 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.013892 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-66fzs\""
Apr 20 22:25:06.186754 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.186719 2575 generic.go:358] "Generic (PLEG): container finished" podID="3e440b6a-d5a8-43fe-af3d-a999f8dce281" containerID="7e2ff2ab4b6e9113130e07ac0e991da264d773ab464536ce6a7a4f754d627889" exitCode=0
Apr 20 22:25:06.186925 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.186800 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92xbc" event={"ID":"3e440b6a-d5a8-43fe-af3d-a999f8dce281","Type":"ContainerDied","Data":"7e2ff2ab4b6e9113130e07ac0e991da264d773ab464536ce6a7a4f754d627889"}
Apr 20 22:25:06.486519 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.486493 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-xhqjt"]
Apr 20 22:25:06.510884 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.510855 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.513592 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.513544 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 22:25:06.513768 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.513588 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 22:25:06.513768 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.513600 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-5mrl8\""
Apr 20 22:25:06.514936 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.514715 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 22:25:06.514936 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.514734 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 22:25:06.514936 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.514758 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 22:25:06.514936 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.514715 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 22:25:06.655471 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.655439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/eaddf140-0247-4d4a-8283-7ad9403b4507-node-exporter-textfile\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.655471 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.655473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwh6d\" (UniqueName: \"kubernetes.io/projected/eaddf140-0247-4d4a-8283-7ad9403b4507-kube-api-access-bwh6d\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.655663 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.655540 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eaddf140-0247-4d4a-8283-7ad9403b4507-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.655663 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.655581 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eaddf140-0247-4d4a-8283-7ad9403b4507-metrics-client-ca\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.655663 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.655610 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/eaddf140-0247-4d4a-8283-7ad9403b4507-node-exporter-accelerators-collector-config\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.655663 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.655631 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/eaddf140-0247-4d4a-8283-7ad9403b4507-node-exporter-tls\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.655816 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.655689 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/eaddf140-0247-4d4a-8283-7ad9403b4507-node-exporter-wtmp\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.655816 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.655736 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/eaddf140-0247-4d4a-8283-7ad9403b4507-root\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.655816 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.655773 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eaddf140-0247-4d4a-8283-7ad9403b4507-sys\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.756998 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.756911 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/eaddf140-0247-4d4a-8283-7ad9403b4507-node-exporter-wtmp\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.756998 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.756946 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/eaddf140-0247-4d4a-8283-7ad9403b4507-root\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.756998 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.756977 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eaddf140-0247-4d4a-8283-7ad9403b4507-sys\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.757248 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.757002 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/eaddf140-0247-4d4a-8283-7ad9403b4507-node-exporter-textfile\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.757248 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.757028 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwh6d\" (UniqueName: \"kubernetes.io/projected/eaddf140-0247-4d4a-8283-7ad9403b4507-kube-api-access-bwh6d\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.757248 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.757039 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/eaddf140-0247-4d4a-8283-7ad9403b4507-root\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.757248 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.757062 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eaddf140-0247-4d4a-8283-7ad9403b4507-sys\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.757248 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.757104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eaddf140-0247-4d4a-8283-7ad9403b4507-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.757248 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.757150 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eaddf140-0247-4d4a-8283-7ad9403b4507-metrics-client-ca\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.757248 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.757183 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/eaddf140-0247-4d4a-8283-7ad9403b4507-node-exporter-wtmp\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.757248 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.757190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/eaddf140-0247-4d4a-8283-7ad9403b4507-node-exporter-accelerators-collector-config\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.757248 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.757245 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/eaddf140-0247-4d4a-8283-7ad9403b4507-node-exporter-tls\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.757660 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.757323 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/eaddf140-0247-4d4a-8283-7ad9403b4507-node-exporter-textfile\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.757797 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.757777 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eaddf140-0247-4d4a-8283-7ad9403b4507-metrics-client-ca\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt"
Apr 20 22:25:06.757980 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.757808 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/eaddf140-0247-4d4a-8283-7ad9403b4507-node-exporter-accelerators-collector-config\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") "
pod="openshift-monitoring/node-exporter-xhqjt" Apr 20 22:25:06.761106 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.761083 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eaddf140-0247-4d4a-8283-7ad9403b4507-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt" Apr 20 22:25:06.761213 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.761111 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/eaddf140-0247-4d4a-8283-7ad9403b4507-node-exporter-tls\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt" Apr 20 22:25:06.770238 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.770210 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwh6d\" (UniqueName: \"kubernetes.io/projected/eaddf140-0247-4d4a-8283-7ad9403b4507-kube-api-access-bwh6d\") pod \"node-exporter-xhqjt\" (UID: \"eaddf140-0247-4d4a-8283-7ad9403b4507\") " pod="openshift-monitoring/node-exporter-xhqjt" Apr 20 22:25:06.822369 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:06.822332 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-xhqjt" Apr 20 22:25:07.500318 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.500268 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 22:25:07.517257 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.517230 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.518990 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.518963 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 22:25:07.519945 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.519926 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 22:25:07.520048 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.519957 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-jrvbn\"" Apr 20 22:25:07.520048 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.519957 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 22:25:07.520170 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.520153 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 22:25:07.520438 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.520423 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 22:25:07.520508 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.520461 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 22:25:07.521218 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.521200 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 22:25:07.521332 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.521243 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 22:25:07.521332 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.521305 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 22:25:07.521332 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.521316 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 22:25:07.544249 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:25:07.544225 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaddf140_0247_4d4a_8283_7ad9403b4507.slice/crio-e5478f6c254c4c93bf3e36d8bc31683c455056a1be043b3b5e55821322168aca WatchSource:0}: Error finding container e5478f6c254c4c93bf3e36d8bc31683c455056a1be043b3b5e55821322168aca: Status 404 returned error can't find the container with id e5478f6c254c4c93bf3e36d8bc31683c455056a1be043b3b5e55821322168aca Apr 20 22:25:07.667334 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.666269 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/beabc605-6cf4-451c-86cd-7292fa88598a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.667334 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.666546 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.667334 ip-10-0-132-177 
kubenswrapper[2575]: I0420 22:25:07.666773 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/beabc605-6cf4-451c-86cd-7292fa88598a-config-out\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.667334 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.666832 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-config-volume\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.667334 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.666871 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/beabc605-6cf4-451c-86cd-7292fa88598a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.667334 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.666914 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.667334 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.666948 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-cluster-tls-config\") pod 
\"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.667334 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.666997 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmbcx\" (UniqueName: \"kubernetes.io/projected/beabc605-6cf4-451c-86cd-7292fa88598a-kube-api-access-fmbcx\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.667334 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.667026 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.667334 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.667067 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/beabc605-6cf4-451c-86cd-7292fa88598a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.667950 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.667131 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.667950 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.667732 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-web-config\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.667950 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.667777 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beabc605-6cf4-451c-86cd-7292fa88598a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.768590 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.768551 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-web-config\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.768792 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.768601 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beabc605-6cf4-451c-86cd-7292fa88598a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.768792 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.768631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/beabc605-6cf4-451c-86cd-7292fa88598a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 
22:25:07.768792 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.768654 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.768792 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.768733 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/beabc605-6cf4-451c-86cd-7292fa88598a-config-out\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.768792 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.768758 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-config-volume\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.768792 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.768783 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/beabc605-6cf4-451c-86cd-7292fa88598a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.769101 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.768811 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.769101 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.768837 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.769101 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.768874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmbcx\" (UniqueName: \"kubernetes.io/projected/beabc605-6cf4-451c-86cd-7292fa88598a-kube-api-access-fmbcx\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.769101 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.768897 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.769101 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.768923 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/beabc605-6cf4-451c-86cd-7292fa88598a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.769101 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.768967 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.771231 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:07.770360 2575 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 20 22:25:07.771231 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:07.770471 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-main-tls podName:beabc605-6cf4-451c-86cd-7292fa88598a nodeName:}" failed. No retries permitted until 2026-04-20 22:25:08.27045159 +0000 UTC m=+37.884608132 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "beabc605-6cf4-451c-86cd-7292fa88598a") : secret "alertmanager-main-tls" not found Apr 20 22:25:07.772386 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.771542 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/beabc605-6cf4-451c-86cd-7292fa88598a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.772386 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.772333 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/beabc605-6cf4-451c-86cd-7292fa88598a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.774024 ip-10-0-132-177 kubenswrapper[2575]: 
I0420 22:25:07.773467 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/beabc605-6cf4-451c-86cd-7292fa88598a-config-out\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.774024 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.773984 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-web-config\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.774024 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.774011 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/beabc605-6cf4-451c-86cd-7292fa88598a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.774215 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.774138 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-config-volume\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.777087 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.775619 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.777087 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.777046 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.778563 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.777748 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.779423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.779396 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beabc605-6cf4-451c-86cd-7292fa88598a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.779630 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.779597 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmbcx\" (UniqueName: \"kubernetes.io/projected/beabc605-6cf4-451c-86cd-7292fa88598a-kube-api-access-fmbcx\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:07.782783 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:07.782761 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:08.193474 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.193432 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xhqjt" event={"ID":"eaddf140-0247-4d4a-8283-7ad9403b4507","Type":"ContainerStarted","Data":"e5478f6c254c4c93bf3e36d8bc31683c455056a1be043b3b5e55821322168aca"} Apr 20 22:25:08.194596 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.194574 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-87rj9" event={"ID":"c3473a30-a4b9-4d21-9b2f-83594665ed99","Type":"ContainerStarted","Data":"bcd7a554290ba562de9b3bea1ff6aa9c46a02abe3402e5c2c06d41b9f3e69e89"} Apr 20 22:25:08.197074 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.197049 2575 generic.go:358] "Generic (PLEG): container finished" podID="3e440b6a-d5a8-43fe-af3d-a999f8dce281" containerID="a7da88a938de76d6b319ddee409f5cc2b98ac04616542be185ab4821c45eebf5" exitCode=0 Apr 20 22:25:08.197163 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.197127 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92xbc" event={"ID":"3e440b6a-d5a8-43fe-af3d-a999f8dce281","Type":"ContainerDied","Data":"a7da88a938de76d6b319ddee409f5cc2b98ac04616542be185ab4821c45eebf5"} Apr 20 22:25:08.198793 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.198771 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7bc89" event={"ID":"238c0ea5-4742-4d5c-b685-8c4aab704f3c","Type":"ContainerStarted","Data":"1cb31fed24299dd4c25cbc3006d46b9b1fd3a9f5019d0418007ede79755a3c1c"} Apr 20 22:25:08.198793 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.198795 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7bc89" event={"ID":"238c0ea5-4742-4d5c-b685-8c4aab704f3c","Type":"ContainerStarted","Data":"90e0ec4a26cab31869dd68bb3f4bed04e0a90a616fbb708704b6589eca9abd09"} 
Apr 20 22:25:08.198959 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.198895 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-7bc89" Apr 20 22:25:08.200236 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.200216 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lbljv" event={"ID":"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0","Type":"ContainerStarted","Data":"60f3932e4857098e5c3d9029c06cceb0680760c13b57d41e0aea619bc6ec1641"} Apr 20 22:25:08.209338 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.209287 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-87rj9" podStartSLOduration=1.681926453 podStartE2EDuration="4.209270676s" podCreationTimestamp="2026-04-20 22:25:04 +0000 UTC" firstStartedPulling="2026-04-20 22:25:05.019441753 +0000 UTC m=+34.633598290" lastFinishedPulling="2026-04-20 22:25:07.546785977 +0000 UTC m=+37.160942513" observedRunningTime="2026-04-20 22:25:08.208227747 +0000 UTC m=+37.822384303" watchObservedRunningTime="2026-04-20 22:25:08.209270676 +0000 UTC m=+37.823427234" Apr 20 22:25:08.228049 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.227993 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7bc89" podStartSLOduration=1.6953729480000002 podStartE2EDuration="4.227977368s" podCreationTimestamp="2026-04-20 22:25:04 +0000 UTC" firstStartedPulling="2026-04-20 22:25:05.010842042 +0000 UTC m=+34.624998585" lastFinishedPulling="2026-04-20 22:25:07.543446457 +0000 UTC m=+37.157603005" observedRunningTime="2026-04-20 22:25:08.227331549 +0000 UTC m=+37.841488110" watchObservedRunningTime="2026-04-20 22:25:08.227977368 +0000 UTC m=+37.842133927" Apr 20 22:25:08.272787 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.272702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:08.276006 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.275978 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:08.428635 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.428608 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:08.523655 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.523390 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f98f74c7-hrxtq"] Apr 20 22:25:08.540892 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.540865 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.541998 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.541738 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f98f74c7-hrxtq"] Apr 20 22:25:08.545991 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.545308 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 22:25:08.545991 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.545376 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 22:25:08.545991 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.545388 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 22:25:08.545991 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.545309 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 22:25:08.545991 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.545588 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-dn9dg\"" Apr 20 22:25:08.545991 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.545738 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 22:25:08.545991 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.545787 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 22:25:08.545991 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.545738 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 22:25:08.593318 ip-10-0-132-177 kubenswrapper[2575]: I0420 
22:25:08.593288 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 22:25:08.653918 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:25:08.653881 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeabc605_6cf4_451c_86cd_7292fa88598a.slice/crio-abda5429871f10fcf6f7ffefc88823c36be0de40590790edd2a427f4ce850de7 WatchSource:0}: Error finding container abda5429871f10fcf6f7ffefc88823c36be0de40590790edd2a427f4ce850de7: Status 404 returned error can't find the container with id abda5429871f10fcf6f7ffefc88823c36be0de40590790edd2a427f4ce850de7 Apr 20 22:25:08.679957 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.679932 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a2982e8-c491-4937-9fa8-45c127a464a6-service-ca\") pod \"console-5f98f74c7-hrxtq\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") " pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.680065 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.679999 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a2982e8-c491-4937-9fa8-45c127a464a6-console-config\") pod \"console-5f98f74c7-hrxtq\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") " pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.680167 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.680131 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a2982e8-c491-4937-9fa8-45c127a464a6-console-serving-cert\") pod \"console-5f98f74c7-hrxtq\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") " pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.680224 ip-10-0-132-177 
kubenswrapper[2575]: I0420 22:25:08.680191 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a2982e8-c491-4937-9fa8-45c127a464a6-console-oauth-config\") pod \"console-5f98f74c7-hrxtq\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") " pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.680224 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.680213 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a2982e8-c491-4937-9fa8-45c127a464a6-oauth-serving-cert\") pod \"console-5f98f74c7-hrxtq\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") " pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.680305 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.680227 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2dxb\" (UniqueName: \"kubernetes.io/projected/4a2982e8-c491-4937-9fa8-45c127a464a6-kube-api-access-x2dxb\") pod \"console-5f98f74c7-hrxtq\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") " pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.780749 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.780656 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a2982e8-c491-4937-9fa8-45c127a464a6-console-serving-cert\") pod \"console-5f98f74c7-hrxtq\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") " pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.780915 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.780749 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a2982e8-c491-4937-9fa8-45c127a464a6-console-oauth-config\") pod \"console-5f98f74c7-hrxtq\" 
(UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") " pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.780915 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.780802 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a2982e8-c491-4937-9fa8-45c127a464a6-oauth-serving-cert\") pod \"console-5f98f74c7-hrxtq\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") " pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.780915 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.780826 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2dxb\" (UniqueName: \"kubernetes.io/projected/4a2982e8-c491-4937-9fa8-45c127a464a6-kube-api-access-x2dxb\") pod \"console-5f98f74c7-hrxtq\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") " pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.780915 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.780852 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a2982e8-c491-4937-9fa8-45c127a464a6-service-ca\") pod \"console-5f98f74c7-hrxtq\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") " pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.780915 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.780907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a2982e8-c491-4937-9fa8-45c127a464a6-console-config\") pod \"console-5f98f74c7-hrxtq\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") " pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.781602 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.781573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/4a2982e8-c491-4937-9fa8-45c127a464a6-oauth-serving-cert\") pod \"console-5f98f74c7-hrxtq\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") " pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.781602 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.781588 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a2982e8-c491-4937-9fa8-45c127a464a6-service-ca\") pod \"console-5f98f74c7-hrxtq\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") " pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.781779 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.781591 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a2982e8-c491-4937-9fa8-45c127a464a6-console-config\") pod \"console-5f98f74c7-hrxtq\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") " pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.784226 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.784204 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a2982e8-c491-4937-9fa8-45c127a464a6-console-oauth-config\") pod \"console-5f98f74c7-hrxtq\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") " pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.784305 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.784251 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a2982e8-c491-4937-9fa8-45c127a464a6-console-serving-cert\") pod \"console-5f98f74c7-hrxtq\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") " pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.788412 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.788392 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2dxb\" 
(UniqueName: \"kubernetes.io/projected/4a2982e8-c491-4937-9fa8-45c127a464a6-kube-api-access-x2dxb\") pod \"console-5f98f74c7-hrxtq\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") " pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:08.853877 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:08.853839 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:09.208614 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:09.208572 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92xbc" event={"ID":"3e440b6a-d5a8-43fe-af3d-a999f8dce281","Type":"ContainerStarted","Data":"4aad3e033d81124c2dbc1d252ffb513b21bb7d4d3f3d3a6d1055c4178db45ba5"} Apr 20 22:25:09.210242 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:09.210210 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xhqjt" event={"ID":"eaddf140-0247-4d4a-8283-7ad9403b4507","Type":"ContainerStarted","Data":"b87623c043dbb77e33483ce2d2a49881ea70d983cc214144cc7f7a2cfa3915b3"} Apr 20 22:25:09.211387 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:09.211363 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"beabc605-6cf4-451c-86cd-7292fa88598a","Type":"ContainerStarted","Data":"abda5429871f10fcf6f7ffefc88823c36be0de40590790edd2a427f4ce850de7"} Apr 20 22:25:09.230508 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:09.230457 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-92xbc" podStartSLOduration=7.037390252 podStartE2EDuration="38.230441521s" podCreationTimestamp="2026-04-20 22:24:31 +0000 UTC" firstStartedPulling="2026-04-20 22:24:33.830986537 +0000 UTC m=+3.445143079" lastFinishedPulling="2026-04-20 22:25:05.024037796 +0000 UTC m=+34.638194348" observedRunningTime="2026-04-20 22:25:09.22886071 
+0000 UTC m=+38.843017270" watchObservedRunningTime="2026-04-20 22:25:09.230441521 +0000 UTC m=+38.844598081" Apr 20 22:25:09.445981 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:09.445802 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f98f74c7-hrxtq"] Apr 20 22:25:09.449127 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:25:09.449095 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a2982e8_c491_4937_9fa8_45c127a464a6.slice/crio-3f5bb7507126c5474cff77d676501910f8af1838c99f5c13776c162e663fa3bc WatchSource:0}: Error finding container 3f5bb7507126c5474cff77d676501910f8af1838c99f5c13776c162e663fa3bc: Status 404 returned error can't find the container with id 3f5bb7507126c5474cff77d676501910f8af1838c99f5c13776c162e663fa3bc Apr 20 22:25:10.218182 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:10.218120 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lbljv" event={"ID":"814cbee0-89a6-4755-8d2c-bb2ca9cb16d0","Type":"ContainerStarted","Data":"0c0535b5397fea258e23a88a8c40e5109dea0dcbedc632a5faa3f7b14f35fd1b"} Apr 20 22:25:10.220053 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:10.220015 2575 generic.go:358] "Generic (PLEG): container finished" podID="eaddf140-0247-4d4a-8283-7ad9403b4507" containerID="b87623c043dbb77e33483ce2d2a49881ea70d983cc214144cc7f7a2cfa3915b3" exitCode=0 Apr 20 22:25:10.220342 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:10.220310 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xhqjt" event={"ID":"eaddf140-0247-4d4a-8283-7ad9403b4507","Type":"ContainerDied","Data":"b87623c043dbb77e33483ce2d2a49881ea70d983cc214144cc7f7a2cfa3915b3"} Apr 20 22:25:10.221717 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:10.221608 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f98f74c7-hrxtq" 
event={"ID":"4a2982e8-c491-4937-9fa8-45c127a464a6","Type":"ContainerStarted","Data":"3f5bb7507126c5474cff77d676501910f8af1838c99f5c13776c162e663fa3bc"} Apr 20 22:25:10.239618 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:10.239558 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-lbljv" podStartSLOduration=1.708011624 podStartE2EDuration="6.239538904s" podCreationTimestamp="2026-04-20 22:25:04 +0000 UTC" firstStartedPulling="2026-04-20 22:25:05.10728338 +0000 UTC m=+34.721439925" lastFinishedPulling="2026-04-20 22:25:09.638810668 +0000 UTC m=+39.252967205" observedRunningTime="2026-04-20 22:25:10.238426175 +0000 UTC m=+39.852582739" watchObservedRunningTime="2026-04-20 22:25:10.239538904 +0000 UTC m=+39.853695461" Apr 20 22:25:11.226944 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:11.226909 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xhqjt" event={"ID":"eaddf140-0247-4d4a-8283-7ad9403b4507","Type":"ContainerStarted","Data":"5f6938dccb739461c491b51ac92c9cbfc3b80bb8d1ba86e3472466e19349518b"} Apr 20 22:25:11.226944 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:11.226951 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xhqjt" event={"ID":"eaddf140-0247-4d4a-8283-7ad9403b4507","Type":"ContainerStarted","Data":"e10323a073fda60e985cec3c1d0ecfe7ad30622d3174e2e803196db4150497b4"} Apr 20 22:25:11.228637 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:11.228606 2575 generic.go:358] "Generic (PLEG): container finished" podID="beabc605-6cf4-451c-86cd-7292fa88598a" containerID="fd34273e1b980cd570ee4c27385dd69db91d1eaadc86493bea2097c6dfb8e5be" exitCode=0 Apr 20 22:25:11.228796 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:11.228707 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"beabc605-6cf4-451c-86cd-7292fa88598a","Type":"ContainerDied","Data":"fd34273e1b980cd570ee4c27385dd69db91d1eaadc86493bea2097c6dfb8e5be"} Apr 20 22:25:11.280239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:11.280189 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-xhqjt" podStartSLOduration=4.152388683 podStartE2EDuration="5.280171226s" podCreationTimestamp="2026-04-20 22:25:06 +0000 UTC" firstStartedPulling="2026-04-20 22:25:07.545979432 +0000 UTC m=+37.160135984" lastFinishedPulling="2026-04-20 22:25:08.673761989 +0000 UTC m=+38.287918527" observedRunningTime="2026-04-20 22:25:11.251884783 +0000 UTC m=+40.866041347" watchObservedRunningTime="2026-04-20 22:25:11.280171226 +0000 UTC m=+40.894327786" Apr 20 22:25:13.240263 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:13.240217 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f98f74c7-hrxtq" event={"ID":"4a2982e8-c491-4937-9fa8-45c127a464a6","Type":"ContainerStarted","Data":"06e4020e3243dc7b79ffcb5bd4df58496d8c366ae9a4f957a063595bda125ad0"} Apr 20 22:25:13.256402 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:13.256353 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f98f74c7-hrxtq" podStartSLOduration=1.604448514 podStartE2EDuration="5.256338618s" podCreationTimestamp="2026-04-20 22:25:08 +0000 UTC" firstStartedPulling="2026-04-20 22:25:09.451070118 +0000 UTC m=+39.065226655" lastFinishedPulling="2026-04-20 22:25:13.102960222 +0000 UTC m=+42.717116759" observedRunningTime="2026-04-20 22:25:13.255657328 +0000 UTC m=+42.869813888" watchObservedRunningTime="2026-04-20 22:25:13.256338618 +0000 UTC m=+42.870495177" Apr 20 22:25:14.246803 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:14.246769 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"beabc605-6cf4-451c-86cd-7292fa88598a","Type":"ContainerStarted","Data":"27ef74e6596e858af22d3b10736439e62a60c6be431cd425a5c76e314e7f2e91"} Apr 20 22:25:14.247121 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:14.246811 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"beabc605-6cf4-451c-86cd-7292fa88598a","Type":"ContainerStarted","Data":"9c78bfa6dc48f8cbd0ce861830f6d4a6042e2a793b3d269b09420104200450e0"} Apr 20 22:25:14.247121 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:14.246826 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"beabc605-6cf4-451c-86cd-7292fa88598a","Type":"ContainerStarted","Data":"40765d261b9de35496a8db9ad0b1b589c0260ed3d74c04220e42db0d73b3fb41"} Apr 20 22:25:14.247121 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:14.246839 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"beabc605-6cf4-451c-86cd-7292fa88598a","Type":"ContainerStarted","Data":"db962996d195a4c789bc7db8070e9496a92da18000618bd77d7f32f7b70841c9"} Apr 20 22:25:15.252380 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:15.252338 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"beabc605-6cf4-451c-86cd-7292fa88598a","Type":"ContainerStarted","Data":"1a16dd153514eca8baba81b3d53b0ec48edc061198914d0602275eb58d5ac933"} Apr 20 22:25:15.252380 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:15.252375 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"beabc605-6cf4-451c-86cd-7292fa88598a","Type":"ContainerStarted","Data":"9276d1240587724f1b8121064db04ade7a9c8aed4c3a2ddd51a88138b8601d93"} Apr 20 22:25:15.280151 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:15.280048 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.855563353 podStartE2EDuration="8.280027826s" podCreationTimestamp="2026-04-20 22:25:07 +0000 UTC" firstStartedPulling="2026-04-20 22:25:08.670507043 +0000 UTC m=+38.284663580" lastFinishedPulling="2026-04-20 22:25:15.094971501 +0000 UTC m=+44.709128053" observedRunningTime="2026-04-20 22:25:15.277949247 +0000 UTC m=+44.892105807" watchObservedRunningTime="2026-04-20 22:25:15.280027826 +0000 UTC m=+44.894184387" Apr 20 22:25:18.214222 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:18.214189 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7bc89" Apr 20 22:25:18.854863 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:18.854824 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:18.855115 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:18.854910 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:18.859567 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:18.859545 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:19.267500 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.267424 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f98f74c7-hrxtq" Apr 20 22:25:19.548431 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.548347 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fcf78567d-c4bs7"] Apr 20 22:25:19.583909 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.583861 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fcf78567d-c4bs7"] Apr 20 22:25:19.584064 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.583991 2575 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.592236 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.592209 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 22:25:19.681563 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.681520 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhtbl\" (UniqueName: \"kubernetes.io/projected/0ac0422c-e979-48ff-90cc-ffc86f66b903-kube-api-access-lhtbl\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.681563 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.681570 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-oauth-serving-cert\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.681792 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.681643 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ac0422c-e979-48ff-90cc-ffc86f66b903-console-oauth-config\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.681792 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.681741 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac0422c-e979-48ff-90cc-ffc86f66b903-console-serving-cert\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " 
pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.681904 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.681796 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-console-config\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.681904 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.681813 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-service-ca\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.681904 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.681829 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-trusted-ca-bundle\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.782323 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.782279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac0422c-e979-48ff-90cc-ffc86f66b903-console-serving-cert\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.782520 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.782344 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-console-config\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.782520 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.782361 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-service-ca\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.782520 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.782377 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-trusted-ca-bundle\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.782520 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.782432 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhtbl\" (UniqueName: \"kubernetes.io/projected/0ac0422c-e979-48ff-90cc-ffc86f66b903-kube-api-access-lhtbl\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.782520 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.782455 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-oauth-serving-cert\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.782771 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.782552 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ac0422c-e979-48ff-90cc-ffc86f66b903-console-oauth-config\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.783231 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.783209 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-console-config\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.783321 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.783256 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-oauth-serving-cert\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.783413 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.783396 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-trusted-ca-bundle\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.783454 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.783397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-service-ca\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.785058 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.785033 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac0422c-e979-48ff-90cc-ffc86f66b903-console-serving-cert\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.785146 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.785074 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ac0422c-e979-48ff-90cc-ffc86f66b903-console-oauth-config\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.790618 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.790597 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhtbl\" (UniqueName: \"kubernetes.io/projected/0ac0422c-e979-48ff-90cc-ffc86f66b903-kube-api-access-lhtbl\") pod \"console-6fcf78567d-c4bs7\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:25:19.893387 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:19.893347 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fcf78567d-c4bs7"
Apr 20 22:25:20.017007 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:20.016976 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fcf78567d-c4bs7"]
Apr 20 22:25:20.020259 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:25:20.020216 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ac0422c_e979_48ff_90cc_ffc86f66b903.slice/crio-da4bd1808d0ec33015392f39badcb01d24856ed83cc8b1fc7c956324da27d83b WatchSource:0}: Error finding container da4bd1808d0ec33015392f39badcb01d24856ed83cc8b1fc7c956324da27d83b: Status 404 returned error can't find the container with id da4bd1808d0ec33015392f39badcb01d24856ed83cc8b1fc7c956324da27d83b
Apr 20 22:25:20.268397 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:20.268359 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fcf78567d-c4bs7" event={"ID":"0ac0422c-e979-48ff-90cc-ffc86f66b903","Type":"ContainerStarted","Data":"c38112bba12969f4eadc2c16ad912f341446ce40fb5f1f4839f3c8e3bc86b805"}
Apr 20 22:25:20.268397 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:20.268395 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fcf78567d-c4bs7" event={"ID":"0ac0422c-e979-48ff-90cc-ffc86f66b903","Type":"ContainerStarted","Data":"da4bd1808d0ec33015392f39badcb01d24856ed83cc8b1fc7c956324da27d83b"}
Apr 20 22:25:20.284158 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:20.284111 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fcf78567d-c4bs7" podStartSLOduration=1.284094421 podStartE2EDuration="1.284094421s" podCreationTimestamp="2026-04-20 22:25:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:25:20.283320647 +0000 UTC m=+49.897477209" watchObservedRunningTime="2026-04-20 22:25:20.284094421 +0000 UTC m=+49.898250980"
Apr 20 22:25:26.190899 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:26.190869 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-598d4bbdbc-hc4q7"
Apr 20 22:25:29.894398 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:29.894356 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6fcf78567d-c4bs7"
Apr 20 22:25:29.894398 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:29.894408 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fcf78567d-c4bs7"
Apr 20 22:25:29.899511 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:29.899483 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6fcf78567d-c4bs7"
Apr 20 22:25:30.301948 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:30.301866 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6fcf78567d-c4bs7"
Apr 20 22:25:30.351941 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:30.351901 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f98f74c7-hrxtq"]
Apr 20 22:25:31.176398 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:31.176368 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rp7bw"
Apr 20 22:25:36.833347 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:36.833299 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb44g\" (UniqueName: \"kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g\") pod \"network-check-target-vjqq9\" (UID: \"e9c331e6-87b9-45b5-9c22-016575eec846\") " pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:25:36.833827 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:36.833392 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs\") pod \"network-metrics-daemon-qg2mj\" (UID: \"5add223c-497e-4cc3-863e-339b6f999506\") " pod="openshift-multus/network-metrics-daemon-qg2mj"
Apr 20 22:25:36.836073 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:36.836053 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 22:25:36.837028 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:36.837012 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 22:25:36.846369 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:36.846334 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 22:25:36.846520 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:36.846498 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5add223c-497e-4cc3-863e-339b6f999506-metrics-certs\") pod \"network-metrics-daemon-qg2mj\" (UID: \"5add223c-497e-4cc3-863e-339b6f999506\") " pod="openshift-multus/network-metrics-daemon-qg2mj"
Apr 20 22:25:36.856808 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:36.856775 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb44g\" (UniqueName: \"kubernetes.io/projected/e9c331e6-87b9-45b5-9c22-016575eec846-kube-api-access-kb44g\") pod \"network-check-target-vjqq9\" (UID: \"e9c331e6-87b9-45b5-9c22-016575eec846\") " pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:25:36.926257 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:36.926225 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-66fzs\""
Apr 20 22:25:36.933106 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:36.933080 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:25:36.933367 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:36.933348 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ddtmq\""
Apr 20 22:25:36.940656 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:36.940631 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qg2mj"
Apr 20 22:25:37.034289 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:37.034223 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret\") pod \"global-pull-secret-syncer-nh9q7\" (UID: \"72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33\") " pod="kube-system/global-pull-secret-syncer-nh9q7"
Apr 20 22:25:37.036948 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:37.036926 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 22:25:37.047942 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:37.047904 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33-original-pull-secret\") pod \"global-pull-secret-syncer-nh9q7\" (UID: \"72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33\") " pod="kube-system/global-pull-secret-syncer-nh9q7"
Apr 20 22:25:37.071142 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:37.071113 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vjqq9"]
Apr 20 22:25:37.086793 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:37.086724 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qg2mj"]
Apr 20 22:25:37.089764 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:25:37.089738 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5add223c_497e_4cc3_863e_339b6f999506.slice/crio-6b462cd4451f7266bc3a3dd3286e1c1215e33e8878ecc27d685d8d9b490f22a9 WatchSource:0}: Error finding container 6b462cd4451f7266bc3a3dd3286e1c1215e33e8878ecc27d685d8d9b490f22a9: Status 404 returned error can't find the container with id 6b462cd4451f7266bc3a3dd3286e1c1215e33e8878ecc27d685d8d9b490f22a9
Apr 20 22:25:37.237994 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:37.237954 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nh9q7"
Apr 20 22:25:37.320822 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:37.320785 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qg2mj" event={"ID":"5add223c-497e-4cc3-863e-339b6f999506","Type":"ContainerStarted","Data":"6b462cd4451f7266bc3a3dd3286e1c1215e33e8878ecc27d685d8d9b490f22a9"}
Apr 20 22:25:37.321984 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:37.321957 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vjqq9" event={"ID":"e9c331e6-87b9-45b5-9c22-016575eec846","Type":"ContainerStarted","Data":"c355cf1831cce25447f8a1b4ed3c7fc1626222482b2cbb9b16fa1e5240286410"}
Apr 20 22:25:37.374504 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:37.374468 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nh9q7"]
Apr 20 22:25:37.377634 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:25:37.377605 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72dc52a4_8f63_4b8f_be9f_1e2b2cf7ab33.slice/crio-4c35d434a74c19d8c815419b6466d12a5d52047c4260646f807a580a85b6e76a WatchSource:0}: Error finding container 4c35d434a74c19d8c815419b6466d12a5d52047c4260646f807a580a85b6e76a: Status 404 returned error can't find the container with id 4c35d434a74c19d8c815419b6466d12a5d52047c4260646f807a580a85b6e76a
Apr 20 22:25:38.329008 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:38.328961 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nh9q7" event={"ID":"72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33","Type":"ContainerStarted","Data":"4c35d434a74c19d8c815419b6466d12a5d52047c4260646f807a580a85b6e76a"}
Apr 20 22:25:39.334447 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:39.334398 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qg2mj" event={"ID":"5add223c-497e-4cc3-863e-339b6f999506","Type":"ContainerStarted","Data":"2a07cbdbb71c07b092a7ae324ed058634692eec9fc07b682d2a33b62e73db539"}
Apr 20 22:25:39.334447 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:39.334445 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qg2mj" event={"ID":"5add223c-497e-4cc3-863e-339b6f999506","Type":"ContainerStarted","Data":"584a28c3aa051ee3edffa621ddd45ae9ea5acae83fb3c7953de929636b2bfe4b"}
Apr 20 22:25:39.351102 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:39.351042 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qg2mj" podStartSLOduration=66.962980707 podStartE2EDuration="1m8.351020371s" podCreationTimestamp="2026-04-20 22:24:31 +0000 UTC" firstStartedPulling="2026-04-20 22:25:37.09147708 +0000 UTC m=+66.705633616" lastFinishedPulling="2026-04-20 22:25:38.479516732 +0000 UTC m=+68.093673280" observedRunningTime="2026-04-20 22:25:39.350042497 +0000 UTC m=+68.964199109" watchObservedRunningTime="2026-04-20 22:25:39.351020371 +0000 UTC m=+68.965176932"
Apr 20 22:25:41.343027 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:41.342978 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vjqq9" event={"ID":"e9c331e6-87b9-45b5-9c22-016575eec846","Type":"ContainerStarted","Data":"28438a63d40844c59826041c95642c98c3479e98c2b0476e9cb90d04ee2afe7a"}
Apr 20 22:25:41.343463 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:41.343253 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vjqq9"
Apr 20 22:25:41.358930 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:41.358865 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vjqq9" podStartSLOduration=67.034759285 podStartE2EDuration="1m10.358846092s" podCreationTimestamp="2026-04-20 22:24:31 +0000 UTC" firstStartedPulling="2026-04-20 22:25:37.076299395 +0000 UTC m=+66.690455946" lastFinishedPulling="2026-04-20 22:25:40.400386213 +0000 UTC m=+70.014542753" observedRunningTime="2026-04-20 22:25:41.357319582 +0000 UTC m=+70.971476177" watchObservedRunningTime="2026-04-20 22:25:41.358846092 +0000 UTC m=+70.973002650"
Apr 20 22:25:42.347038 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:42.346944 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nh9q7" event={"ID":"72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33","Type":"ContainerStarted","Data":"51c39e7623a3814b4e9b0a82088b7f0c0641a4776e0cfe0ce1563d2cd3effba9"}
Apr 20 22:25:42.361870 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:42.361819 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-nh9q7" podStartSLOduration=64.770892058 podStartE2EDuration="1m9.361801366s" podCreationTimestamp="2026-04-20 22:24:33 +0000 UTC" firstStartedPulling="2026-04-20 22:25:37.379422583 +0000 UTC m=+66.993579121" lastFinishedPulling="2026-04-20 22:25:41.970331888 +0000 UTC m=+71.584488429" observedRunningTime="2026-04-20 22:25:42.360637272 +0000 UTC m=+71.974793831" watchObservedRunningTime="2026-04-20 22:25:42.361801366 +0000 UTC m=+71.975957924"
Apr 20 22:25:55.372653 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.372471 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5f98f74c7-hrxtq" podUID="4a2982e8-c491-4937-9fa8-45c127a464a6" containerName="console" containerID="cri-o://06e4020e3243dc7b79ffcb5bd4df58496d8c366ae9a4f957a063595bda125ad0" gracePeriod=15
Apr 20 22:25:55.615453 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.615430 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f98f74c7-hrxtq_4a2982e8-c491-4937-9fa8-45c127a464a6/console/0.log"
Apr 20 22:25:55.615594 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.615507 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f98f74c7-hrxtq"
Apr 20 22:25:55.790692 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.790578 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a2982e8-c491-4937-9fa8-45c127a464a6-console-serving-cert\") pod \"4a2982e8-c491-4937-9fa8-45c127a464a6\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") "
Apr 20 22:25:55.790692 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.790622 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a2982e8-c491-4937-9fa8-45c127a464a6-oauth-serving-cert\") pod \"4a2982e8-c491-4937-9fa8-45c127a464a6\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") "
Apr 20 22:25:55.790921 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.790713 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a2982e8-c491-4937-9fa8-45c127a464a6-service-ca\") pod \"4a2982e8-c491-4937-9fa8-45c127a464a6\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") "
Apr 20 22:25:55.790921 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.790750 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a2982e8-c491-4937-9fa8-45c127a464a6-console-config\") pod \"4a2982e8-c491-4937-9fa8-45c127a464a6\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") "
Apr 20 22:25:55.790921 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.790769 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a2982e8-c491-4937-9fa8-45c127a464a6-console-oauth-config\") pod \"4a2982e8-c491-4937-9fa8-45c127a464a6\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") "
Apr 20 22:25:55.790921 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.790805 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2dxb\" (UniqueName: \"kubernetes.io/projected/4a2982e8-c491-4937-9fa8-45c127a464a6-kube-api-access-x2dxb\") pod \"4a2982e8-c491-4937-9fa8-45c127a464a6\" (UID: \"4a2982e8-c491-4937-9fa8-45c127a464a6\") "
Apr 20 22:25:55.791170 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.791140 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2982e8-c491-4937-9fa8-45c127a464a6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4a2982e8-c491-4937-9fa8-45c127a464a6" (UID: "4a2982e8-c491-4937-9fa8-45c127a464a6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 22:25:55.791251 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.791222 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2982e8-c491-4937-9fa8-45c127a464a6-service-ca" (OuterVolumeSpecName: "service-ca") pod "4a2982e8-c491-4937-9fa8-45c127a464a6" (UID: "4a2982e8-c491-4937-9fa8-45c127a464a6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 22:25:55.791299 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.791229 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2982e8-c491-4937-9fa8-45c127a464a6-console-config" (OuterVolumeSpecName: "console-config") pod "4a2982e8-c491-4937-9fa8-45c127a464a6" (UID: "4a2982e8-c491-4937-9fa8-45c127a464a6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 22:25:55.793185 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.793154 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2982e8-c491-4937-9fa8-45c127a464a6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4a2982e8-c491-4937-9fa8-45c127a464a6" (UID: "4a2982e8-c491-4937-9fa8-45c127a464a6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 22:25:55.793559 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.793531 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2982e8-c491-4937-9fa8-45c127a464a6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4a2982e8-c491-4937-9fa8-45c127a464a6" (UID: "4a2982e8-c491-4937-9fa8-45c127a464a6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 22:25:55.793645 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.793579 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2982e8-c491-4937-9fa8-45c127a464a6-kube-api-access-x2dxb" (OuterVolumeSpecName: "kube-api-access-x2dxb") pod "4a2982e8-c491-4937-9fa8-45c127a464a6" (UID: "4a2982e8-c491-4937-9fa8-45c127a464a6"). InnerVolumeSpecName "kube-api-access-x2dxb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 22:25:55.892221 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.892164 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a2982e8-c491-4937-9fa8-45c127a464a6-console-config\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:25:55.892221 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.892214 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a2982e8-c491-4937-9fa8-45c127a464a6-console-oauth-config\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:25:55.892221 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.892226 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x2dxb\" (UniqueName: \"kubernetes.io/projected/4a2982e8-c491-4937-9fa8-45c127a464a6-kube-api-access-x2dxb\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:25:55.892221 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.892236 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a2982e8-c491-4937-9fa8-45c127a464a6-console-serving-cert\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:25:55.892483 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.892247 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a2982e8-c491-4937-9fa8-45c127a464a6-oauth-serving-cert\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:25:55.892483 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:55.892255 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a2982e8-c491-4937-9fa8-45c127a464a6-service-ca\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:25:56.386574 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:56.386545 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f98f74c7-hrxtq_4a2982e8-c491-4937-9fa8-45c127a464a6/console/0.log"
Apr 20 22:25:56.387001 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:56.386586 2575 generic.go:358] "Generic (PLEG): container finished" podID="4a2982e8-c491-4937-9fa8-45c127a464a6" containerID="06e4020e3243dc7b79ffcb5bd4df58496d8c366ae9a4f957a063595bda125ad0" exitCode=2
Apr 20 22:25:56.387001 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:56.386656 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f98f74c7-hrxtq"
Apr 20 22:25:56.387001 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:56.386659 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f98f74c7-hrxtq" event={"ID":"4a2982e8-c491-4937-9fa8-45c127a464a6","Type":"ContainerDied","Data":"06e4020e3243dc7b79ffcb5bd4df58496d8c366ae9a4f957a063595bda125ad0"}
Apr 20 22:25:56.387001 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:56.386756 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f98f74c7-hrxtq" event={"ID":"4a2982e8-c491-4937-9fa8-45c127a464a6","Type":"ContainerDied","Data":"3f5bb7507126c5474cff77d676501910f8af1838c99f5c13776c162e663fa3bc"}
Apr 20 22:25:56.387001 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:56.386771 2575 scope.go:117] "RemoveContainer" containerID="06e4020e3243dc7b79ffcb5bd4df58496d8c366ae9a4f957a063595bda125ad0"
Apr 20 22:25:56.395503 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:56.395484 2575 scope.go:117] "RemoveContainer" containerID="06e4020e3243dc7b79ffcb5bd4df58496d8c366ae9a4f957a063595bda125ad0"
Apr 20 22:25:56.395821 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:56.395800 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06e4020e3243dc7b79ffcb5bd4df58496d8c366ae9a4f957a063595bda125ad0\": container with ID starting with 06e4020e3243dc7b79ffcb5bd4df58496d8c366ae9a4f957a063595bda125ad0 not found: ID does not exist" containerID="06e4020e3243dc7b79ffcb5bd4df58496d8c366ae9a4f957a063595bda125ad0"
Apr 20 22:25:56.395887 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:56.395829 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e4020e3243dc7b79ffcb5bd4df58496d8c366ae9a4f957a063595bda125ad0"} err="failed to get container status \"06e4020e3243dc7b79ffcb5bd4df58496d8c366ae9a4f957a063595bda125ad0\": rpc error: code = NotFound desc = could not find container \"06e4020e3243dc7b79ffcb5bd4df58496d8c366ae9a4f957a063595bda125ad0\": container with ID starting with 06e4020e3243dc7b79ffcb5bd4df58496d8c366ae9a4f957a063595bda125ad0 not found: ID does not exist"
Apr 20 22:25:56.406981 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:56.406944 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f98f74c7-hrxtq"]
Apr 20 22:25:56.411662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:56.411638 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5f98f74c7-hrxtq"]
Apr 20 22:25:56.662182 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:56.662086 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 22:25:56.662617 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:56.662591 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="alertmanager" containerID="cri-o://db962996d195a4c789bc7db8070e9496a92da18000618bd77d7f32f7b70841c9" gracePeriod=120
Apr 20 22:25:56.662718 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:56.662688 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="kube-rbac-proxy-metric" containerID="cri-o://9276d1240587724f1b8121064db04ade7a9c8aed4c3a2ddd51a88138b8601d93" gracePeriod=120
Apr 20 22:25:56.662783 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:56.662723 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="kube-rbac-proxy" containerID="cri-o://27ef74e6596e858af22d3b10736439e62a60c6be431cd425a5c76e314e7f2e91" gracePeriod=120
Apr 20 22:25:56.662783 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:56.662714 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="config-reloader" containerID="cri-o://40765d261b9de35496a8db9ad0b1b589c0260ed3d74c04220e42db0d73b3fb41" gracePeriod=120
Apr 20 22:25:56.662783 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:56.662750 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="prom-label-proxy" containerID="cri-o://1a16dd153514eca8baba81b3d53b0ec48edc061198914d0602275eb58d5ac933" gracePeriod=120
Apr 20 22:25:56.662783 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:56.662700 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="kube-rbac-proxy-web" containerID="cri-o://9c78bfa6dc48f8cbd0ce861830f6d4a6042e2a793b3d269b09420104200450e0" gracePeriod=120
Apr 20 22:25:57.013172 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:57.013080 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a2982e8-c491-4937-9fa8-45c127a464a6" path="/var/lib/kubelet/pods/4a2982e8-c491-4937-9fa8-45c127a464a6/volumes"
Apr 20 22:25:57.394069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:57.394035 2575 generic.go:358] "Generic (PLEG): container finished" podID="beabc605-6cf4-451c-86cd-7292fa88598a" containerID="1a16dd153514eca8baba81b3d53b0ec48edc061198914d0602275eb58d5ac933" exitCode=0
Apr 20 22:25:57.394069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:57.394060 2575 generic.go:358] "Generic (PLEG): container finished" podID="beabc605-6cf4-451c-86cd-7292fa88598a" containerID="9276d1240587724f1b8121064db04ade7a9c8aed4c3a2ddd51a88138b8601d93" exitCode=0
Apr 20 22:25:57.394069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:57.394069 2575 generic.go:358] "Generic (PLEG): container finished" podID="beabc605-6cf4-451c-86cd-7292fa88598a" containerID="27ef74e6596e858af22d3b10736439e62a60c6be431cd425a5c76e314e7f2e91" exitCode=0
Apr 20 22:25:57.394069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:57.394077 2575 generic.go:358] "Generic (PLEG): container finished" podID="beabc605-6cf4-451c-86cd-7292fa88598a" containerID="40765d261b9de35496a8db9ad0b1b589c0260ed3d74c04220e42db0d73b3fb41" exitCode=0
Apr 20 22:25:57.394603 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:57.394085 2575 generic.go:358] "Generic (PLEG): container finished" podID="beabc605-6cf4-451c-86cd-7292fa88598a" containerID="db962996d195a4c789bc7db8070e9496a92da18000618bd77d7f32f7b70841c9" exitCode=0
Apr 20 22:25:57.394603 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:57.394103 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"beabc605-6cf4-451c-86cd-7292fa88598a","Type":"ContainerDied","Data":"1a16dd153514eca8baba81b3d53b0ec48edc061198914d0602275eb58d5ac933"}
Apr 20 22:25:57.394603 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:57.394133 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"beabc605-6cf4-451c-86cd-7292fa88598a","Type":"ContainerDied","Data":"9276d1240587724f1b8121064db04ade7a9c8aed4c3a2ddd51a88138b8601d93"}
Apr 20 22:25:57.394603 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:57.394152 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"beabc605-6cf4-451c-86cd-7292fa88598a","Type":"ContainerDied","Data":"27ef74e6596e858af22d3b10736439e62a60c6be431cd425a5c76e314e7f2e91"}
Apr 20 22:25:57.394603 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:57.394162 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"beabc605-6cf4-451c-86cd-7292fa88598a","Type":"ContainerDied","Data":"40765d261b9de35496a8db9ad0b1b589c0260ed3d74c04220e42db0d73b3fb41"}
Apr 20 22:25:57.394603 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:57.394170 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"beabc605-6cf4-451c-86cd-7292fa88598a","Type":"ContainerDied","Data":"db962996d195a4c789bc7db8070e9496a92da18000618bd77d7f32f7b70841c9"}
Apr 20 22:25:57.898483 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:57.898456 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 22:25:58.007737 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.007630 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beabc605-6cf4-451c-86cd-7292fa88598a-alertmanager-trusted-ca-bundle\") pod \"beabc605-6cf4-451c-86cd-7292fa88598a\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") "
Apr 20 22:25:58.007737 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.007695 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-kube-rbac-proxy\") pod \"beabc605-6cf4-451c-86cd-7292fa88598a\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") "
Apr 20 22:25:58.007933 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.007747 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmbcx\" (UniqueName: \"kubernetes.io/projected/beabc605-6cf4-451c-86cd-7292fa88598a-kube-api-access-fmbcx\") pod \"beabc605-6cf4-451c-86cd-7292fa88598a\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") "
Apr 20 22:25:58.007933 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.007793 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-config-volume\") pod \"beabc605-6cf4-451c-86cd-7292fa88598a\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") "
Apr 20 22:25:58.007933 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.007911 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-main-tls\") pod \"beabc605-6cf4-451c-86cd-7292fa88598a\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") "
Apr 20 22:25:58.008086 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.007967 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/beabc605-6cf4-451c-86cd-7292fa88598a-tls-assets\") pod \"beabc605-6cf4-451c-86cd-7292fa88598a\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") "
Apr 20 22:25:58.008086 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.008001 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/beabc605-6cf4-451c-86cd-7292fa88598a-alertmanager-main-db\") pod \"beabc605-6cf4-451c-86cd-7292fa88598a\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") "
Apr 20 22:25:58.008086 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.008032 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-web-config\") pod \"beabc605-6cf4-451c-86cd-7292fa88598a\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") "
Apr 20 22:25:58.008086 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.008035 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beabc605-6cf4-451c-86cd-7292fa88598a-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "beabc605-6cf4-451c-86cd-7292fa88598a" (UID: "beabc605-6cf4-451c-86cd-7292fa88598a"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 22:25:58.008086 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.008068 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-cluster-tls-config\") pod \"beabc605-6cf4-451c-86cd-7292fa88598a\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") "
Apr 20 22:25:58.008323 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.008112 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/beabc605-6cf4-451c-86cd-7292fa88598a-metrics-client-ca\") pod \"beabc605-6cf4-451c-86cd-7292fa88598a\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") "
Apr 20 22:25:58.008323 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.008141 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-kube-rbac-proxy-web\") pod \"beabc605-6cf4-451c-86cd-7292fa88598a\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") "
Apr 20 22:25:58.008323 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.008177 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/beabc605-6cf4-451c-86cd-7292fa88598a-config-out\") pod \"beabc605-6cf4-451c-86cd-7292fa88598a\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") "
Apr 20 22:25:58.008323 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.008212 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"beabc605-6cf4-451c-86cd-7292fa88598a\" (UID: \"beabc605-6cf4-451c-86cd-7292fa88598a\") "
Apr 20 22:25:58.008509 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.008392 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beabc605-6cf4-451c-86cd-7292fa88598a-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "beabc605-6cf4-451c-86cd-7292fa88598a" (UID: "beabc605-6cf4-451c-86cd-7292fa88598a"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 22:25:58.008509 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.008417 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beabc605-6cf4-451c-86cd-7292fa88598a-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:25:58.008968 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.008940 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beabc605-6cf4-451c-86cd-7292fa88598a-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "beabc605-6cf4-451c-86cd-7292fa88598a" (UID: "beabc605-6cf4-451c-86cd-7292fa88598a"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 22:25:58.011443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.011400 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "beabc605-6cf4-451c-86cd-7292fa88598a" (UID: "beabc605-6cf4-451c-86cd-7292fa88598a"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 22:25:58.012081 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.012043 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beabc605-6cf4-451c-86cd-7292fa88598a-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "beabc605-6cf4-451c-86cd-7292fa88598a" (UID: "beabc605-6cf4-451c-86cd-7292fa88598a"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 22:25:58.012191 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.012073 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "beabc605-6cf4-451c-86cd-7292fa88598a" (UID: "beabc605-6cf4-451c-86cd-7292fa88598a"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 22:25:58.012243 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.012202 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "beabc605-6cf4-451c-86cd-7292fa88598a" (UID: "beabc605-6cf4-451c-86cd-7292fa88598a"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 22:25:58.012304 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.012285 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beabc605-6cf4-451c-86cd-7292fa88598a-kube-api-access-fmbcx" (OuterVolumeSpecName: "kube-api-access-fmbcx") pod "beabc605-6cf4-451c-86cd-7292fa88598a" (UID: "beabc605-6cf4-451c-86cd-7292fa88598a"). InnerVolumeSpecName "kube-api-access-fmbcx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:25:58.012378 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.012360 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "beabc605-6cf4-451c-86cd-7292fa88598a" (UID: "beabc605-6cf4-451c-86cd-7292fa88598a"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:25:58.012536 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.012512 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-config-volume" (OuterVolumeSpecName: "config-volume") pod "beabc605-6cf4-451c-86cd-7292fa88598a" (UID: "beabc605-6cf4-451c-86cd-7292fa88598a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:25:58.013042 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.013018 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beabc605-6cf4-451c-86cd-7292fa88598a-config-out" (OuterVolumeSpecName: "config-out") pod "beabc605-6cf4-451c-86cd-7292fa88598a" (UID: "beabc605-6cf4-451c-86cd-7292fa88598a"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:25:58.016289 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.016263 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "beabc605-6cf4-451c-86cd-7292fa88598a" (UID: "beabc605-6cf4-451c-86cd-7292fa88598a"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:25:58.021359 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.021209 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-web-config" (OuterVolumeSpecName: "web-config") pod "beabc605-6cf4-451c-86cd-7292fa88598a" (UID: "beabc605-6cf4-451c-86cd-7292fa88598a"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:25:58.109627 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.109569 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-config-volume\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:25:58.109627 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.109619 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-main-tls\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:25:58.109627 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.109630 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/beabc605-6cf4-451c-86cd-7292fa88598a-tls-assets\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:25:58.109627 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.109640 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/beabc605-6cf4-451c-86cd-7292fa88598a-alertmanager-main-db\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:25:58.109627 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.109649 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-web-config\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:25:58.109937 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.109657 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-cluster-tls-config\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:25:58.109937 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.109666 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/beabc605-6cf4-451c-86cd-7292fa88598a-metrics-client-ca\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:25:58.109937 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.109703 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:25:58.109937 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.109717 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/beabc605-6cf4-451c-86cd-7292fa88598a-config-out\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:25:58.109937 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.109731 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:25:58.109937 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.109740 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/beabc605-6cf4-451c-86cd-7292fa88598a-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:25:58.109937 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.109749 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fmbcx\" (UniqueName: \"kubernetes.io/projected/beabc605-6cf4-451c-86cd-7292fa88598a-kube-api-access-fmbcx\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:25:58.400822 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.400784 2575 generic.go:358] "Generic (PLEG): container finished" podID="beabc605-6cf4-451c-86cd-7292fa88598a" containerID="9c78bfa6dc48f8cbd0ce861830f6d4a6042e2a793b3d269b09420104200450e0" exitCode=0 Apr 20 22:25:58.401241 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.400871 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"beabc605-6cf4-451c-86cd-7292fa88598a","Type":"ContainerDied","Data":"9c78bfa6dc48f8cbd0ce861830f6d4a6042e2a793b3d269b09420104200450e0"} Apr 20 22:25:58.401241 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.400916 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"beabc605-6cf4-451c-86cd-7292fa88598a","Type":"ContainerDied","Data":"abda5429871f10fcf6f7ffefc88823c36be0de40590790edd2a427f4ce850de7"} Apr 20 22:25:58.401241 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.400933 2575 scope.go:117] "RemoveContainer" containerID="1a16dd153514eca8baba81b3d53b0ec48edc061198914d0602275eb58d5ac933" Apr 20 22:25:58.401241 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.400941 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.408598 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.408577 2575 scope.go:117] "RemoveContainer" containerID="9276d1240587724f1b8121064db04ade7a9c8aed4c3a2ddd51a88138b8601d93" Apr 20 22:25:58.415503 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.415486 2575 scope.go:117] "RemoveContainer" containerID="27ef74e6596e858af22d3b10736439e62a60c6be431cd425a5c76e314e7f2e91" Apr 20 22:25:58.421951 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.421927 2575 scope.go:117] "RemoveContainer" containerID="9c78bfa6dc48f8cbd0ce861830f6d4a6042e2a793b3d269b09420104200450e0" Apr 20 22:25:58.423855 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.423833 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 22:25:58.428453 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.428430 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 22:25:58.429808 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.429795 2575 scope.go:117] "RemoveContainer" containerID="40765d261b9de35496a8db9ad0b1b589c0260ed3d74c04220e42db0d73b3fb41" Apr 20 22:25:58.436655 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.436634 2575 scope.go:117] "RemoveContainer" containerID="db962996d195a4c789bc7db8070e9496a92da18000618bd77d7f32f7b70841c9" Apr 20 22:25:58.443565 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.443536 2575 scope.go:117] "RemoveContainer" containerID="fd34273e1b980cd570ee4c27385dd69db91d1eaadc86493bea2097c6dfb8e5be" Apr 20 22:25:58.450503 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.450477 2575 scope.go:117] "RemoveContainer" containerID="1a16dd153514eca8baba81b3d53b0ec48edc061198914d0602275eb58d5ac933" Apr 20 22:25:58.450874 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:58.450847 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"1a16dd153514eca8baba81b3d53b0ec48edc061198914d0602275eb58d5ac933\": container with ID starting with 1a16dd153514eca8baba81b3d53b0ec48edc061198914d0602275eb58d5ac933 not found: ID does not exist" containerID="1a16dd153514eca8baba81b3d53b0ec48edc061198914d0602275eb58d5ac933" Apr 20 22:25:58.450974 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.450887 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a16dd153514eca8baba81b3d53b0ec48edc061198914d0602275eb58d5ac933"} err="failed to get container status \"1a16dd153514eca8baba81b3d53b0ec48edc061198914d0602275eb58d5ac933\": rpc error: code = NotFound desc = could not find container \"1a16dd153514eca8baba81b3d53b0ec48edc061198914d0602275eb58d5ac933\": container with ID starting with 1a16dd153514eca8baba81b3d53b0ec48edc061198914d0602275eb58d5ac933 not found: ID does not exist" Apr 20 22:25:58.450974 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.450914 2575 scope.go:117] "RemoveContainer" containerID="9276d1240587724f1b8121064db04ade7a9c8aed4c3a2ddd51a88138b8601d93" Apr 20 22:25:58.451085 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451065 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 22:25:58.451182 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:58.451162 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9276d1240587724f1b8121064db04ade7a9c8aed4c3a2ddd51a88138b8601d93\": container with ID starting with 9276d1240587724f1b8121064db04ade7a9c8aed4c3a2ddd51a88138b8601d93 not found: ID does not exist" containerID="9276d1240587724f1b8121064db04ade7a9c8aed4c3a2ddd51a88138b8601d93" Apr 20 22:25:58.451226 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451189 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9276d1240587724f1b8121064db04ade7a9c8aed4c3a2ddd51a88138b8601d93"} err="failed to get container status \"9276d1240587724f1b8121064db04ade7a9c8aed4c3a2ddd51a88138b8601d93\": rpc error: code = NotFound desc = could not find container \"9276d1240587724f1b8121064db04ade7a9c8aed4c3a2ddd51a88138b8601d93\": container with ID starting with 9276d1240587724f1b8121064db04ade7a9c8aed4c3a2ddd51a88138b8601d93 not found: ID does not exist" Apr 20 22:25:58.451226 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451210 2575 scope.go:117] "RemoveContainer" containerID="27ef74e6596e858af22d3b10736439e62a60c6be431cd425a5c76e314e7f2e91" Apr 20 22:25:58.451339 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451323 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="kube-rbac-proxy" Apr 20 22:25:58.451392 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451342 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="kube-rbac-proxy" Apr 20 22:25:58.451392 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451353 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="init-config-reloader" Apr 20 22:25:58.451392 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451361 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="init-config-reloader" Apr 20 22:25:58.451392 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451371 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="alertmanager" Apr 20 22:25:58.451392 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451379 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" 
containerName="alertmanager" Apr 20 22:25:58.451392 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451387 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="kube-rbac-proxy-web" Apr 20 22:25:58.451580 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451396 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="kube-rbac-proxy-web" Apr 20 22:25:58.451580 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451409 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="kube-rbac-proxy-metric" Apr 20 22:25:58.451580 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451419 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="kube-rbac-proxy-metric" Apr 20 22:25:58.451580 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451428 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="prom-label-proxy" Apr 20 22:25:58.451580 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451436 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="prom-label-proxy" Apr 20 22:25:58.451580 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451448 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a2982e8-c491-4937-9fa8-45c127a464a6" containerName="console" Apr 20 22:25:58.451580 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451454 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2982e8-c491-4937-9fa8-45c127a464a6" containerName="console" Apr 20 22:25:58.451580 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451465 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="config-reloader" Apr 20 22:25:58.451580 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:58.451460 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27ef74e6596e858af22d3b10736439e62a60c6be431cd425a5c76e314e7f2e91\": container with ID starting with 27ef74e6596e858af22d3b10736439e62a60c6be431cd425a5c76e314e7f2e91 not found: ID does not exist" containerID="27ef74e6596e858af22d3b10736439e62a60c6be431cd425a5c76e314e7f2e91" Apr 20 22:25:58.451580 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451485 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ef74e6596e858af22d3b10736439e62a60c6be431cd425a5c76e314e7f2e91"} err="failed to get container status \"27ef74e6596e858af22d3b10736439e62a60c6be431cd425a5c76e314e7f2e91\": rpc error: code = NotFound desc = could not find container \"27ef74e6596e858af22d3b10736439e62a60c6be431cd425a5c76e314e7f2e91\": container with ID starting with 27ef74e6596e858af22d3b10736439e62a60c6be431cd425a5c76e314e7f2e91 not found: ID does not exist" Apr 20 22:25:58.451580 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451503 2575 scope.go:117] "RemoveContainer" containerID="9c78bfa6dc48f8cbd0ce861830f6d4a6042e2a793b3d269b09420104200450e0" Apr 20 22:25:58.451580 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451469 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="config-reloader" Apr 20 22:25:58.452092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451608 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="alertmanager" Apr 20 22:25:58.452092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451619 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" 
containerName="config-reloader" Apr 20 22:25:58.452092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451630 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="kube-rbac-proxy" Apr 20 22:25:58.452092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451640 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a2982e8-c491-4937-9fa8-45c127a464a6" containerName="console" Apr 20 22:25:58.452092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451651 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="kube-rbac-proxy-web" Apr 20 22:25:58.452092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451661 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="kube-rbac-proxy-metric" Apr 20 22:25:58.452092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451695 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" containerName="prom-label-proxy" Apr 20 22:25:58.452092 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:58.451775 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c78bfa6dc48f8cbd0ce861830f6d4a6042e2a793b3d269b09420104200450e0\": container with ID starting with 9c78bfa6dc48f8cbd0ce861830f6d4a6042e2a793b3d269b09420104200450e0 not found: ID does not exist" containerID="9c78bfa6dc48f8cbd0ce861830f6d4a6042e2a793b3d269b09420104200450e0" Apr 20 22:25:58.452092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451801 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c78bfa6dc48f8cbd0ce861830f6d4a6042e2a793b3d269b09420104200450e0"} err="failed to get container status \"9c78bfa6dc48f8cbd0ce861830f6d4a6042e2a793b3d269b09420104200450e0\": rpc error: 
code = NotFound desc = could not find container \"9c78bfa6dc48f8cbd0ce861830f6d4a6042e2a793b3d269b09420104200450e0\": container with ID starting with 9c78bfa6dc48f8cbd0ce861830f6d4a6042e2a793b3d269b09420104200450e0 not found: ID does not exist" Apr 20 22:25:58.452092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.451818 2575 scope.go:117] "RemoveContainer" containerID="40765d261b9de35496a8db9ad0b1b589c0260ed3d74c04220e42db0d73b3fb41" Apr 20 22:25:58.452092 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:58.452059 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40765d261b9de35496a8db9ad0b1b589c0260ed3d74c04220e42db0d73b3fb41\": container with ID starting with 40765d261b9de35496a8db9ad0b1b589c0260ed3d74c04220e42db0d73b3fb41 not found: ID does not exist" containerID="40765d261b9de35496a8db9ad0b1b589c0260ed3d74c04220e42db0d73b3fb41" Apr 20 22:25:58.452092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.452079 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40765d261b9de35496a8db9ad0b1b589c0260ed3d74c04220e42db0d73b3fb41"} err="failed to get container status \"40765d261b9de35496a8db9ad0b1b589c0260ed3d74c04220e42db0d73b3fb41\": rpc error: code = NotFound desc = could not find container \"40765d261b9de35496a8db9ad0b1b589c0260ed3d74c04220e42db0d73b3fb41\": container with ID starting with 40765d261b9de35496a8db9ad0b1b589c0260ed3d74c04220e42db0d73b3fb41 not found: ID does not exist" Apr 20 22:25:58.452092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.452098 2575 scope.go:117] "RemoveContainer" containerID="db962996d195a4c789bc7db8070e9496a92da18000618bd77d7f32f7b70841c9" Apr 20 22:25:58.452577 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:58.452324 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"db962996d195a4c789bc7db8070e9496a92da18000618bd77d7f32f7b70841c9\": container with ID starting with db962996d195a4c789bc7db8070e9496a92da18000618bd77d7f32f7b70841c9 not found: ID does not exist" containerID="db962996d195a4c789bc7db8070e9496a92da18000618bd77d7f32f7b70841c9" Apr 20 22:25:58.452577 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.452343 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db962996d195a4c789bc7db8070e9496a92da18000618bd77d7f32f7b70841c9"} err="failed to get container status \"db962996d195a4c789bc7db8070e9496a92da18000618bd77d7f32f7b70841c9\": rpc error: code = NotFound desc = could not find container \"db962996d195a4c789bc7db8070e9496a92da18000618bd77d7f32f7b70841c9\": container with ID starting with db962996d195a4c789bc7db8070e9496a92da18000618bd77d7f32f7b70841c9 not found: ID does not exist" Apr 20 22:25:58.452577 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.452358 2575 scope.go:117] "RemoveContainer" containerID="fd34273e1b980cd570ee4c27385dd69db91d1eaadc86493bea2097c6dfb8e5be" Apr 20 22:25:58.452577 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:25:58.452553 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd34273e1b980cd570ee4c27385dd69db91d1eaadc86493bea2097c6dfb8e5be\": container with ID starting with fd34273e1b980cd570ee4c27385dd69db91d1eaadc86493bea2097c6dfb8e5be not found: ID does not exist" containerID="fd34273e1b980cd570ee4c27385dd69db91d1eaadc86493bea2097c6dfb8e5be" Apr 20 22:25:58.452577 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.452571 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd34273e1b980cd570ee4c27385dd69db91d1eaadc86493bea2097c6dfb8e5be"} err="failed to get container status \"fd34273e1b980cd570ee4c27385dd69db91d1eaadc86493bea2097c6dfb8e5be\": rpc error: code = NotFound desc = could not find container 
\"fd34273e1b980cd570ee4c27385dd69db91d1eaadc86493bea2097c6dfb8e5be\": container with ID starting with fd34273e1b980cd570ee4c27385dd69db91d1eaadc86493bea2097c6dfb8e5be not found: ID does not exist" Apr 20 22:25:58.456620 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.456605 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.459001 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.458981 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 22:25:58.459173 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.459156 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 22:25:58.459244 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.459159 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 22:25:58.459403 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.459383 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 22:25:58.459507 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.459441 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 22:25:58.459507 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.459447 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 22:25:58.459507 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.459484 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 22:25:58.459507 ip-10-0-132-177 
kubenswrapper[2575]: I0420 22:25:58.459475 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 22:25:58.459744 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.459603 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-jrvbn\"" Apr 20 22:25:58.464270 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.464252 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 22:25:58.466637 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.466615 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 22:25:58.614589 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.614538 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.614589 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.614593 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.614839 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.614614 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-config-volume\") pod 
\"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.614839 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.614637 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-config-out\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.614839 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.614702 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.614839 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.614738 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.614839 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.614755 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.614839 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.614800 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.614839 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.614827 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-web-config\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.615113 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.614860 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.615113 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.614880 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pll5t\" (UniqueName: \"kubernetes.io/projected/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-kube-api-access-pll5t\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.615113 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.614910 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.615113 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.614982 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.715954 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.715870 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.715954 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.715912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.715954 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.715938 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.716178 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.715965 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-config-volume\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.716178 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.715991 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-config-out\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.716178 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.716023 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.716178 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.716051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.716178 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.716122 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.716178 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.716152 2575 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.716178 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.716178 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-web-config\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.716459 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.716206 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.716459 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.716240 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pll5t\" (UniqueName: \"kubernetes.io/projected/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-kube-api-access-pll5t\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.716459 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.716293 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.716459 ip-10-0-132-177 
kubenswrapper[2575]: I0420 22:25:58.716313 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 22:25:58.716782 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.716752 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 22:25:58.717424 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.717399 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 22:25:58.719750 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.719381 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 22:25:58.719750 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.719553 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " 
pod="openshift-monitoring/alertmanager-main-0"
Apr 20 22:25:58.719750 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.719610 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 22:25:58.719750 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.719709 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 22:25:58.719750 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.719720 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 22:25:58.719949 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.719901 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-config-volume\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 22:25:58.720141 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.720120 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-web-config\") pod \"alertmanager-main-0\" (UID: 
\"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.720633 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.720612 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.720731 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.720653 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-config-out\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.724853 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.724834 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pll5t\" (UniqueName: \"kubernetes.io/projected/54ee3a71-33df-46b8-9cf2-3ba929fdd80b-kube-api-access-pll5t\") pod \"alertmanager-main-0\" (UID: \"54ee3a71-33df-46b8-9cf2-3ba929fdd80b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 22:25:58.766802 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.766768 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 22:25:58.891711 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:58.891653 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 22:25:58.893856 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:25:58.893827 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54ee3a71_33df_46b8_9cf2_3ba929fdd80b.slice/crio-31312ec9c17fc4a99ffcb90cfc5715e280d81e44e991785eaca4147c7944ee10 WatchSource:0}: Error finding container 31312ec9c17fc4a99ffcb90cfc5715e280d81e44e991785eaca4147c7944ee10: Status 404 returned error can't find the container with id 31312ec9c17fc4a99ffcb90cfc5715e280d81e44e991785eaca4147c7944ee10
Apr 20 22:25:59.013063 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:59.013025 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beabc605-6cf4-451c-86cd-7292fa88598a" path="/var/lib/kubelet/pods/beabc605-6cf4-451c-86cd-7292fa88598a/volumes"
Apr 20 22:25:59.404598 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:59.404562 2575 generic.go:358] "Generic (PLEG): container finished" podID="54ee3a71-33df-46b8-9cf2-3ba929fdd80b" containerID="ce33e22d7ad54515b30ff32cb116e896100b07499a8a7dc0ade5db18f102d008" exitCode=0
Apr 20 22:25:59.405068 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:59.404650 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"54ee3a71-33df-46b8-9cf2-3ba929fdd80b","Type":"ContainerDied","Data":"ce33e22d7ad54515b30ff32cb116e896100b07499a8a7dc0ade5db18f102d008"}
Apr 20 22:25:59.405068 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:25:59.404712 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"54ee3a71-33df-46b8-9cf2-3ba929fdd80b","Type":"ContainerStarted","Data":"31312ec9c17fc4a99ffcb90cfc5715e280d81e44e991785eaca4147c7944ee10"}
Apr 20 22:26:00.411168 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.411131 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"54ee3a71-33df-46b8-9cf2-3ba929fdd80b","Type":"ContainerStarted","Data":"8adf75bf9e070a6d058960842da2e3c98fe1d1e40706177854492ed5930fa2f8"}
Apr 20 22:26:00.411168 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.411171 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"54ee3a71-33df-46b8-9cf2-3ba929fdd80b","Type":"ContainerStarted","Data":"16dc336eae0830312137620e1f2307d9ac423740b0120ae1c28bb70d50760724"}
Apr 20 22:26:00.411660 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.411187 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"54ee3a71-33df-46b8-9cf2-3ba929fdd80b","Type":"ContainerStarted","Data":"6d3b66bf6eaf3e91950339acd55099332bc6bff0f92d98ec8f86ddce2c2da6b0"}
Apr 20 22:26:00.411660 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.411198 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"54ee3a71-33df-46b8-9cf2-3ba929fdd80b","Type":"ContainerStarted","Data":"8f7a086ec4d6bb9fd75fa71d470b57bc68db53869ebf169aa6eac73b4ac75ddd"}
Apr 20 22:26:00.411660 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.411207 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"54ee3a71-33df-46b8-9cf2-3ba929fdd80b","Type":"ContainerStarted","Data":"8ac6181271a58d0282ba5a55e7303bea3b3a074959769b8d3438e9dad0028b57"}
Apr 20 22:26:00.411660 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.411217 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"54ee3a71-33df-46b8-9cf2-3ba929fdd80b","Type":"ContainerStarted","Data":"f825db637f02498d66107aeb3a8e5b9d6253e504940242a1e7b2550230f1c161"}
Apr 20 22:26:00.439279 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.439236 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.439220996 podStartE2EDuration="2.439220996s" podCreationTimestamp="2026-04-20 22:25:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:26:00.437344976 +0000 UTC m=+90.051501535" watchObservedRunningTime="2026-04-20 22:26:00.439220996 +0000 UTC m=+90.053377554"
Apr 20 22:26:00.699242 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.699161 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f"]
Apr 20 22:26:00.702574 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.702558 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f"
Apr 20 22:26:00.705133 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.705103 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-28kjt\""
Apr 20 22:26:00.705262 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.705152 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 20 22:26:00.705262 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.705170 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 20 22:26:00.705386 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.705274 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 20 22:26:00.705386 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.705339 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 20 22:26:00.705701 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.705666 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 20 22:26:00.709529 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.709508 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 20 22:26:00.717042 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.717017 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f"]
Apr 20 22:26:00.732330 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.732304 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8b85a832-b4e7-438a-bc62-1d0c115d6467-secret-telemeter-client\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" Apr 20 22:26:00.732609 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.732348 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8b85a832-b4e7-438a-bc62-1d0c115d6467-metrics-client-ca\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" Apr 20 22:26:00.732609 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.732389 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8b85a832-b4e7-438a-bc62-1d0c115d6467-telemeter-client-tls\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" Apr 20 22:26:00.732609 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.732423 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8b85a832-b4e7-438a-bc62-1d0c115d6467-federate-client-tls\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" Apr 20 22:26:00.732609 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.732448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b85a832-b4e7-438a-bc62-1d0c115d6467-telemeter-trusted-ca-bundle\") pod 
\"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" Apr 20 22:26:00.732609 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.732502 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8b85a832-b4e7-438a-bc62-1d0c115d6467-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" Apr 20 22:26:00.732609 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.732527 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b85a832-b4e7-438a-bc62-1d0c115d6467-serving-certs-ca-bundle\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" Apr 20 22:26:00.732609 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.732551 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djv9t\" (UniqueName: \"kubernetes.io/projected/8b85a832-b4e7-438a-bc62-1d0c115d6467-kube-api-access-djv9t\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" Apr 20 22:26:00.833173 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.833137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8b85a832-b4e7-438a-bc62-1d0c115d6467-secret-telemeter-client\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " 
pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f"
Apr 20 22:26:00.833173 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.833177 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8b85a832-b4e7-438a-bc62-1d0c115d6467-metrics-client-ca\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f"
Apr 20 22:26:00.833396 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.833205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8b85a832-b4e7-438a-bc62-1d0c115d6467-telemeter-client-tls\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f"
Apr 20 22:26:00.833396 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.833240 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8b85a832-b4e7-438a-bc62-1d0c115d6467-federate-client-tls\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f"
Apr 20 22:26:00.833396 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.833267 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b85a832-b4e7-438a-bc62-1d0c115d6467-telemeter-trusted-ca-bundle\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f"
Apr 20 22:26:00.833396 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.833324 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8b85a832-b4e7-438a-bc62-1d0c115d6467-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" Apr 20 22:26:00.833396 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.833352 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b85a832-b4e7-438a-bc62-1d0c115d6467-serving-certs-ca-bundle\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" Apr 20 22:26:00.833396 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.833375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djv9t\" (UniqueName: \"kubernetes.io/projected/8b85a832-b4e7-438a-bc62-1d0c115d6467-kube-api-access-djv9t\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" Apr 20 22:26:00.834088 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.833987 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8b85a832-b4e7-438a-bc62-1d0c115d6467-metrics-client-ca\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" Apr 20 22:26:00.834260 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.834230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b85a832-b4e7-438a-bc62-1d0c115d6467-serving-certs-ca-bundle\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: 
\"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" Apr 20 22:26:00.834434 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.834311 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b85a832-b4e7-438a-bc62-1d0c115d6467-telemeter-trusted-ca-bundle\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" Apr 20 22:26:00.836062 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.836036 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8b85a832-b4e7-438a-bc62-1d0c115d6467-telemeter-client-tls\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" Apr 20 22:26:00.836156 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.836044 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8b85a832-b4e7-438a-bc62-1d0c115d6467-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" Apr 20 22:26:00.836156 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.836082 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8b85a832-b4e7-438a-bc62-1d0c115d6467-federate-client-tls\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" Apr 20 22:26:00.836156 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.836113 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8b85a832-b4e7-438a-bc62-1d0c115d6467-secret-telemeter-client\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f"
Apr 20 22:26:00.846211 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:00.846187 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djv9t\" (UniqueName: \"kubernetes.io/projected/8b85a832-b4e7-438a-bc62-1d0c115d6467-kube-api-access-djv9t\") pod \"telemeter-client-85b7f58c6c-p5f2f\" (UID: \"8b85a832-b4e7-438a-bc62-1d0c115d6467\") " pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f"
Apr 20 22:26:01.012162 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:01.012066 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f"
Apr 20 22:26:01.141325 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:01.141294 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f"]
Apr 20 22:26:01.144815 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:26:01.144784 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b85a832_b4e7_438a_bc62_1d0c115d6467.slice/crio-ae906e9ec4d7402cbf6ed2c41c5feaedda04161623e154c6cc2b50b6af239e6c WatchSource:0}: Error finding container ae906e9ec4d7402cbf6ed2c41c5feaedda04161623e154c6cc2b50b6af239e6c: Status 404 returned error can't find the container with id ae906e9ec4d7402cbf6ed2c41c5feaedda04161623e154c6cc2b50b6af239e6c
Apr 20 22:26:01.414608 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:01.414573 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" 
event={"ID":"8b85a832-b4e7-438a-bc62-1d0c115d6467","Type":"ContainerStarted","Data":"ae906e9ec4d7402cbf6ed2c41c5feaedda04161623e154c6cc2b50b6af239e6c"} Apr 20 22:26:03.422833 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:03.422802 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" event={"ID":"8b85a832-b4e7-438a-bc62-1d0c115d6467","Type":"ContainerStarted","Data":"f0e4efeda6f0fb00fb0d48e24851c0cb68aa27d12171a0d6e0497c38d9ff8543"} Apr 20 22:26:03.423134 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:03.422843 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" event={"ID":"8b85a832-b4e7-438a-bc62-1d0c115d6467","Type":"ContainerStarted","Data":"d933bee6285598db7eb844cd84ed6d5e65f8bbee24f228b8063be14c178728fb"} Apr 20 22:26:04.427986 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:04.427948 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" event={"ID":"8b85a832-b4e7-438a-bc62-1d0c115d6467","Type":"ContainerStarted","Data":"82a1506aa58042943f256db8e312c2162e9dc745ccf3ec5abd1c4e36f731278f"} Apr 20 22:26:04.449712 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:04.449631 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-85b7f58c6c-p5f2f" podStartSLOduration=2.301524701 podStartE2EDuration="4.449611206s" podCreationTimestamp="2026-04-20 22:26:00 +0000 UTC" firstStartedPulling="2026-04-20 22:26:01.146465243 +0000 UTC m=+90.760621779" lastFinishedPulling="2026-04-20 22:26:03.294551731 +0000 UTC m=+92.908708284" observedRunningTime="2026-04-20 22:26:04.447599731 +0000 UTC m=+94.061756289" watchObservedRunningTime="2026-04-20 22:26:04.449611206 +0000 UTC m=+94.063767765" Apr 20 22:26:05.036151 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.036117 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-59fbc94577-dsk6f"] Apr 20 22:26:05.039311 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.039293 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.049729 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.049701 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59fbc94577-dsk6f"] Apr 20 22:26:05.069355 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.069313 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1aca88d7-e560-47cf-9109-629a93da89b7-console-serving-cert\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.069355 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.069354 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-service-ca\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.069557 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.069385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-trusted-ca-bundle\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.069557 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.069508 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/1aca88d7-e560-47cf-9109-629a93da89b7-console-oauth-config\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.069557 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.069542 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-console-config\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.069716 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.069581 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqp8g\" (UniqueName: \"kubernetes.io/projected/1aca88d7-e560-47cf-9109-629a93da89b7-kube-api-access-qqp8g\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.069716 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.069609 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-oauth-serving-cert\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.171111 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.171072 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1aca88d7-e560-47cf-9109-629a93da89b7-console-oauth-config\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.171111 ip-10-0-132-177 kubenswrapper[2575]: 
I0420 22:26:05.171112 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-console-config\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.171308 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.171137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqp8g\" (UniqueName: \"kubernetes.io/projected/1aca88d7-e560-47cf-9109-629a93da89b7-kube-api-access-qqp8g\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.171351 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.171317 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-oauth-serving-cert\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.171420 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.171409 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1aca88d7-e560-47cf-9109-629a93da89b7-console-serving-cert\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.171465 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.171429 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-service-ca\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " 
pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.171465 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.171447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-trusted-ca-bundle\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.172494 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.172452 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-oauth-serving-cert\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.172715 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.172480 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-console-config\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.172823 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.172502 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-service-ca\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.172930 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.172873 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-trusted-ca-bundle\") pod \"console-59fbc94577-dsk6f\" (UID: 
\"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.176968 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.176942 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1aca88d7-e560-47cf-9109-629a93da89b7-console-serving-cert\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.177088 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.177066 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1aca88d7-e560-47cf-9109-629a93da89b7-console-oauth-config\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.181414 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.181389 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqp8g\" (UniqueName: \"kubernetes.io/projected/1aca88d7-e560-47cf-9109-629a93da89b7-kube-api-access-qqp8g\") pod \"console-59fbc94577-dsk6f\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.348354 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.348316 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:05.479696 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:05.479642 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59fbc94577-dsk6f"] Apr 20 22:26:05.483299 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:26:05.483271 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aca88d7_e560_47cf_9109_629a93da89b7.slice/crio-7dc09ef5d727ec5df81dae32e6d11a1a668a476bd5b710e06d19a19aecac1189 WatchSource:0}: Error finding container 7dc09ef5d727ec5df81dae32e6d11a1a668a476bd5b710e06d19a19aecac1189: Status 404 returned error can't find the container with id 7dc09ef5d727ec5df81dae32e6d11a1a668a476bd5b710e06d19a19aecac1189 Apr 20 22:26:06.435563 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:06.435529 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59fbc94577-dsk6f" event={"ID":"1aca88d7-e560-47cf-9109-629a93da89b7","Type":"ContainerStarted","Data":"93cdd66196dbe508cedab79de3d211fbb46f1aa5e25bdc9dc13456cbfa2348d1"} Apr 20 22:26:06.435563 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:06.435567 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59fbc94577-dsk6f" event={"ID":"1aca88d7-e560-47cf-9109-629a93da89b7","Type":"ContainerStarted","Data":"7dc09ef5d727ec5df81dae32e6d11a1a668a476bd5b710e06d19a19aecac1189"} Apr 20 22:26:06.464887 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:06.464832 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59fbc94577-dsk6f" podStartSLOduration=1.464815958 podStartE2EDuration="1.464815958s" podCreationTimestamp="2026-04-20 22:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:26:06.462728876 +0000 UTC 
m=+96.076885436" watchObservedRunningTime="2026-04-20 22:26:06.464815958 +0000 UTC m=+96.078972517" Apr 20 22:26:06.919031 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:06.918994 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_54ee3a71-33df-46b8-9cf2-3ba929fdd80b/init-config-reloader/0.log" Apr 20 22:26:07.118586 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:07.118555 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_54ee3a71-33df-46b8-9cf2-3ba929fdd80b/alertmanager/0.log" Apr 20 22:26:07.318238 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:07.318164 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_54ee3a71-33df-46b8-9cf2-3ba929fdd80b/config-reloader/0.log" Apr 20 22:26:07.518500 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:07.518467 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_54ee3a71-33df-46b8-9cf2-3ba929fdd80b/kube-rbac-proxy-web/0.log" Apr 20 22:26:07.718799 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:07.718766 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_54ee3a71-33df-46b8-9cf2-3ba929fdd80b/kube-rbac-proxy/0.log" Apr 20 22:26:07.919138 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:07.919108 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_54ee3a71-33df-46b8-9cf2-3ba929fdd80b/kube-rbac-proxy-metric/0.log" Apr 20 22:26:08.118352 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:08.118316 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_54ee3a71-33df-46b8-9cf2-3ba929fdd80b/prom-label-proxy/0.log" Apr 20 22:26:10.717875 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:10.717839 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-xhqjt_eaddf140-0247-4d4a-8283-7ad9403b4507/init-textfile/0.log" Apr 20 22:26:10.918652 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:10.918623 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xhqjt_eaddf140-0247-4d4a-8283-7ad9403b4507/node-exporter/0.log" Apr 20 22:26:11.120883 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:11.120846 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xhqjt_eaddf140-0247-4d4a-8283-7ad9403b4507/kube-rbac-proxy/0.log" Apr 20 22:26:12.349692 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:12.349640 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vjqq9" Apr 20 22:26:13.919348 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:13.919321 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-85b7f58c6c-p5f2f_8b85a832-b4e7-438a-bc62-1d0c115d6467/telemeter-client/0.log" Apr 20 22:26:14.122283 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:14.122255 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-85b7f58c6c-p5f2f_8b85a832-b4e7-438a-bc62-1d0c115d6467/reload/0.log" Apr 20 22:26:14.318529 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:14.318444 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-85b7f58c6c-p5f2f_8b85a832-b4e7-438a-bc62-1d0c115d6467/kube-rbac-proxy/0.log" Apr 20 22:26:15.348844 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:15.348804 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:15.348844 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:15.348848 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:15.353695 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:15.353650 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:15.464274 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:15.464247 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:26:15.505647 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:15.505608 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fcf78567d-c4bs7"] Apr 20 22:26:16.318974 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:16.318945 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59fbc94577-dsk6f_1aca88d7-e560-47cf-9109-629a93da89b7/console/0.log" Apr 20 22:26:16.519034 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:16.519009 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fcf78567d-c4bs7_0ac0422c-e979-48ff-90cc-ffc86f66b903/console/0.log" Apr 20 22:26:17.318695 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:17.318645 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-87rj9_c3473a30-a4b9-4d21-9b2f-83594665ed99/serve-healthcheck-canary/0.log" Apr 20 22:26:40.525751 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.525691 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6fcf78567d-c4bs7" podUID="0ac0422c-e979-48ff-90cc-ffc86f66b903" containerName="console" containerID="cri-o://c38112bba12969f4eadc2c16ad912f341446ce40fb5f1f4839f3c8e3bc86b805" gracePeriod=15 Apr 20 22:26:40.761993 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.761962 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-6fcf78567d-c4bs7_0ac0422c-e979-48ff-90cc-ffc86f66b903/console/0.log" Apr 20 22:26:40.762126 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.762025 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:26:40.855399 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.855368 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-oauth-serving-cert\") pod \"0ac0422c-e979-48ff-90cc-ffc86f66b903\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " Apr 20 22:26:40.855399 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.855414 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ac0422c-e979-48ff-90cc-ffc86f66b903-console-oauth-config\") pod \"0ac0422c-e979-48ff-90cc-ffc86f66b903\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " Apr 20 22:26:40.855632 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.855457 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac0422c-e979-48ff-90cc-ffc86f66b903-console-serving-cert\") pod \"0ac0422c-e979-48ff-90cc-ffc86f66b903\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " Apr 20 22:26:40.855632 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.855492 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhtbl\" (UniqueName: \"kubernetes.io/projected/0ac0422c-e979-48ff-90cc-ffc86f66b903-kube-api-access-lhtbl\") pod \"0ac0422c-e979-48ff-90cc-ffc86f66b903\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " Apr 20 22:26:40.855632 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.855511 2575 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-trusted-ca-bundle\") pod \"0ac0422c-e979-48ff-90cc-ffc86f66b903\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " Apr 20 22:26:40.855632 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.855530 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-console-config\") pod \"0ac0422c-e979-48ff-90cc-ffc86f66b903\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " Apr 20 22:26:40.855632 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.855548 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-service-ca\") pod \"0ac0422c-e979-48ff-90cc-ffc86f66b903\" (UID: \"0ac0422c-e979-48ff-90cc-ffc86f66b903\") " Apr 20 22:26:40.855928 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.855897 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0ac0422c-e979-48ff-90cc-ffc86f66b903" (UID: "0ac0422c-e979-48ff-90cc-ffc86f66b903"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:26:40.856030 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.856003 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0ac0422c-e979-48ff-90cc-ffc86f66b903" (UID: "0ac0422c-e979-48ff-90cc-ffc86f66b903"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:26:40.856030 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.856017 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-service-ca" (OuterVolumeSpecName: "service-ca") pod "0ac0422c-e979-48ff-90cc-ffc86f66b903" (UID: "0ac0422c-e979-48ff-90cc-ffc86f66b903"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:26:40.856110 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.856030 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-console-config" (OuterVolumeSpecName: "console-config") pod "0ac0422c-e979-48ff-90cc-ffc86f66b903" (UID: "0ac0422c-e979-48ff-90cc-ffc86f66b903"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:26:40.857934 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.857906 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac0422c-e979-48ff-90cc-ffc86f66b903-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0ac0422c-e979-48ff-90cc-ffc86f66b903" (UID: "0ac0422c-e979-48ff-90cc-ffc86f66b903"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:26:40.858046 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.857965 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac0422c-e979-48ff-90cc-ffc86f66b903-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0ac0422c-e979-48ff-90cc-ffc86f66b903" (UID: "0ac0422c-e979-48ff-90cc-ffc86f66b903"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:26:40.858046 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.857970 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac0422c-e979-48ff-90cc-ffc86f66b903-kube-api-access-lhtbl" (OuterVolumeSpecName: "kube-api-access-lhtbl") pod "0ac0422c-e979-48ff-90cc-ffc86f66b903" (UID: "0ac0422c-e979-48ff-90cc-ffc86f66b903"). InnerVolumeSpecName "kube-api-access-lhtbl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:26:40.956107 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.956066 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lhtbl\" (UniqueName: \"kubernetes.io/projected/0ac0422c-e979-48ff-90cc-ffc86f66b903-kube-api-access-lhtbl\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:26:40.956107 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.956098 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-trusted-ca-bundle\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:26:40.956107 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.956110 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-console-config\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:26:40.956107 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.956119 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-service-ca\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:26:40.956385 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.956127 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/0ac0422c-e979-48ff-90cc-ffc86f66b903-oauth-serving-cert\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:26:40.956385 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.956136 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ac0422c-e979-48ff-90cc-ffc86f66b903-console-oauth-config\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:26:40.956385 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:40.956146 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac0422c-e979-48ff-90cc-ffc86f66b903-console-serving-cert\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:26:41.530432 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:41.530398 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fcf78567d-c4bs7_0ac0422c-e979-48ff-90cc-ffc86f66b903/console/0.log" Apr 20 22:26:41.530432 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:41.530437 2575 generic.go:358] "Generic (PLEG): container finished" podID="0ac0422c-e979-48ff-90cc-ffc86f66b903" containerID="c38112bba12969f4eadc2c16ad912f341446ce40fb5f1f4839f3c8e3bc86b805" exitCode=2 Apr 20 22:26:41.530996 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:41.530470 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fcf78567d-c4bs7" event={"ID":"0ac0422c-e979-48ff-90cc-ffc86f66b903","Type":"ContainerDied","Data":"c38112bba12969f4eadc2c16ad912f341446ce40fb5f1f4839f3c8e3bc86b805"} Apr 20 22:26:41.530996 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:41.530507 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fcf78567d-c4bs7" event={"ID":"0ac0422c-e979-48ff-90cc-ffc86f66b903","Type":"ContainerDied","Data":"da4bd1808d0ec33015392f39badcb01d24856ed83cc8b1fc7c956324da27d83b"} Apr 20 22:26:41.530996 
ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:41.530522 2575 scope.go:117] "RemoveContainer" containerID="c38112bba12969f4eadc2c16ad912f341446ce40fb5f1f4839f3c8e3bc86b805" Apr 20 22:26:41.530996 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:41.530525 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fcf78567d-c4bs7" Apr 20 22:26:41.538106 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:41.538087 2575 scope.go:117] "RemoveContainer" containerID="c38112bba12969f4eadc2c16ad912f341446ce40fb5f1f4839f3c8e3bc86b805" Apr 20 22:26:41.538415 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:26:41.538394 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c38112bba12969f4eadc2c16ad912f341446ce40fb5f1f4839f3c8e3bc86b805\": container with ID starting with c38112bba12969f4eadc2c16ad912f341446ce40fb5f1f4839f3c8e3bc86b805 not found: ID does not exist" containerID="c38112bba12969f4eadc2c16ad912f341446ce40fb5f1f4839f3c8e3bc86b805" Apr 20 22:26:41.538469 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:41.538426 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c38112bba12969f4eadc2c16ad912f341446ce40fb5f1f4839f3c8e3bc86b805"} err="failed to get container status \"c38112bba12969f4eadc2c16ad912f341446ce40fb5f1f4839f3c8e3bc86b805\": rpc error: code = NotFound desc = could not find container \"c38112bba12969f4eadc2c16ad912f341446ce40fb5f1f4839f3c8e3bc86b805\": container with ID starting with c38112bba12969f4eadc2c16ad912f341446ce40fb5f1f4839f3c8e3bc86b805 not found: ID does not exist" Apr 20 22:26:41.549364 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:41.549327 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fcf78567d-c4bs7"] Apr 20 22:26:41.553005 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:41.552973 2575 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-6fcf78567d-c4bs7"] Apr 20 22:26:43.013051 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:26:43.013012 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ac0422c-e979-48ff-90cc-ffc86f66b903" path="/var/lib/kubelet/pods/0ac0422c-e979-48ff-90cc-ffc86f66b903/volumes" Apr 20 22:27:54.286235 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.286198 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6774bd9776-8g8db"] Apr 20 22:27:54.286753 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.286470 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ac0422c-e979-48ff-90cc-ffc86f66b903" containerName="console" Apr 20 22:27:54.286753 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.286482 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac0422c-e979-48ff-90cc-ffc86f66b903" containerName="console" Apr 20 22:27:54.286753 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.286531 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ac0422c-e979-48ff-90cc-ffc86f66b903" containerName="console" Apr 20 22:27:54.289523 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.289498 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.299268 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.299241 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6774bd9776-8g8db"] Apr 20 22:27:54.395752 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.395714 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-trusted-ca-bundle\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.395752 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.395757 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-console-config\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.395985 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.395781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-oauth-serving-cert\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.395985 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.395800 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42e6d741-1c79-4873-b919-bdd854703c6f-console-oauth-config\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 
22:27:54.395985 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.395863 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8526\" (UniqueName: \"kubernetes.io/projected/42e6d741-1c79-4873-b919-bdd854703c6f-kube-api-access-s8526\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.395985 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.395904 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-service-ca\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.395985 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.395936 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42e6d741-1c79-4873-b919-bdd854703c6f-console-serving-cert\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.496754 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.496715 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-console-config\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.496889 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.496767 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-oauth-serving-cert\") pod 
\"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.496889 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.496795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42e6d741-1c79-4873-b919-bdd854703c6f-console-oauth-config\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.496889 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.496822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8526\" (UniqueName: \"kubernetes.io/projected/42e6d741-1c79-4873-b919-bdd854703c6f-kube-api-access-s8526\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.496889 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.496851 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-service-ca\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.497076 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.496895 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42e6d741-1c79-4873-b919-bdd854703c6f-console-serving-cert\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.497076 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.496953 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-trusted-ca-bundle\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.497716 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.497601 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-service-ca\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.497716 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.497601 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-oauth-serving-cert\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.497716 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.497648 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-console-config\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.498015 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.497868 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-trusted-ca-bundle\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.499389 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.499366 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42e6d741-1c79-4873-b919-bdd854703c6f-console-oauth-config\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.499500 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.499482 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42e6d741-1c79-4873-b919-bdd854703c6f-console-serving-cert\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.504967 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.504946 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8526\" (UniqueName: \"kubernetes.io/projected/42e6d741-1c79-4873-b919-bdd854703c6f-kube-api-access-s8526\") pod \"console-6774bd9776-8g8db\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") " pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.599124 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.599096 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:27:54.727376 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:54.727261 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6774bd9776-8g8db"] Apr 20 22:27:54.729994 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:27:54.729967 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42e6d741_1c79_4873_b919_bdd854703c6f.slice/crio-3ade62012559d6a4b4e9d2287e33a71402f299d4b08daf1de1e7bd0c5ad24780 WatchSource:0}: Error finding container 3ade62012559d6a4b4e9d2287e33a71402f299d4b08daf1de1e7bd0c5ad24780: Status 404 returned error can't find the container with id 3ade62012559d6a4b4e9d2287e33a71402f299d4b08daf1de1e7bd0c5ad24780 Apr 20 22:27:55.728196 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:55.728154 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6774bd9776-8g8db" event={"ID":"42e6d741-1c79-4873-b919-bdd854703c6f","Type":"ContainerStarted","Data":"b15c25559fc2d38ce1616ef5c4643ed30f800ea682041ffac8f4e49e23362b2b"} Apr 20 22:27:55.728196 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:55.728197 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6774bd9776-8g8db" event={"ID":"42e6d741-1c79-4873-b919-bdd854703c6f","Type":"ContainerStarted","Data":"3ade62012559d6a4b4e9d2287e33a71402f299d4b08daf1de1e7bd0c5ad24780"} Apr 20 22:27:55.746952 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:27:55.746890 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6774bd9776-8g8db" podStartSLOduration=1.74687314 podStartE2EDuration="1.74687314s" podCreationTimestamp="2026-04-20 22:27:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:27:55.745779152 +0000 UTC m=+205.359935711" 
watchObservedRunningTime="2026-04-20 22:27:55.74687314 +0000 UTC m=+205.361029695" Apr 20 22:28:04.599321 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:04.599285 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:28:04.599321 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:04.599328 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:28:04.603880 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:04.603858 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:28:04.755126 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:04.755100 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6774bd9776-8g8db" Apr 20 22:28:04.798453 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:04.798419 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59fbc94577-dsk6f"] Apr 20 22:28:24.095553 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:24.095513 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl"] Apr 20 22:28:24.098743 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:24.098723 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl" Apr 20 22:28:24.101239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:24.101209 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 22:28:24.102288 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:24.102268 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-4psch\"" Apr 20 22:28:24.102344 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:24.102290 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 22:28:24.107313 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:24.107283 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl"] Apr 20 22:28:24.232916 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:24.232882 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb725401-c518-4635-a94b-4a25f83a989c-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl\" (UID: \"fb725401-c518-4635-a94b-4a25f83a989c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl" Apr 20 22:28:24.233138 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:24.232953 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm9lh\" (UniqueName: \"kubernetes.io/projected/fb725401-c518-4635-a94b-4a25f83a989c-kube-api-access-gm9lh\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl\" (UID: \"fb725401-c518-4635-a94b-4a25f83a989c\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl" Apr 20 22:28:24.233138 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:24.232990 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb725401-c518-4635-a94b-4a25f83a989c-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl\" (UID: \"fb725401-c518-4635-a94b-4a25f83a989c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl" Apr 20 22:28:24.333947 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:24.333897 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb725401-c518-4635-a94b-4a25f83a989c-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl\" (UID: \"fb725401-c518-4635-a94b-4a25f83a989c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl" Apr 20 22:28:24.334156 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:24.333981 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb725401-c518-4635-a94b-4a25f83a989c-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl\" (UID: \"fb725401-c518-4635-a94b-4a25f83a989c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl" Apr 20 22:28:24.334156 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:24.334020 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gm9lh\" (UniqueName: \"kubernetes.io/projected/fb725401-c518-4635-a94b-4a25f83a989c-kube-api-access-gm9lh\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl\" (UID: \"fb725401-c518-4635-a94b-4a25f83a989c\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl" Apr 20 22:28:24.334315 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:24.334295 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb725401-c518-4635-a94b-4a25f83a989c-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl\" (UID: \"fb725401-c518-4635-a94b-4a25f83a989c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl" Apr 20 22:28:24.334384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:24.334346 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb725401-c518-4635-a94b-4a25f83a989c-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl\" (UID: \"fb725401-c518-4635-a94b-4a25f83a989c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl" Apr 20 22:28:24.342524 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:24.342482 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm9lh\" (UniqueName: \"kubernetes.io/projected/fb725401-c518-4635-a94b-4a25f83a989c-kube-api-access-gm9lh\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl\" (UID: \"fb725401-c518-4635-a94b-4a25f83a989c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl" Apr 20 22:28:24.408515 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:24.408471 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl" Apr 20 22:28:24.535358 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:24.535314 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl"] Apr 20 22:28:24.541313 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:28:24.541276 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb725401_c518_4635_a94b_4a25f83a989c.slice/crio-0de90107a075fcb105e275a55e35edb8217da03a429a13aa75b3a9fabf8df35a WatchSource:0}: Error finding container 0de90107a075fcb105e275a55e35edb8217da03a429a13aa75b3a9fabf8df35a: Status 404 returned error can't find the container with id 0de90107a075fcb105e275a55e35edb8217da03a429a13aa75b3a9fabf8df35a Apr 20 22:28:24.805320 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:24.805226 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl" event={"ID":"fb725401-c518-4635-a94b-4a25f83a989c","Type":"ContainerStarted","Data":"0de90107a075fcb105e275a55e35edb8217da03a429a13aa75b3a9fabf8df35a"} Apr 20 22:28:29.821469 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:29.821429 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-59fbc94577-dsk6f" podUID="1aca88d7-e560-47cf-9109-629a93da89b7" containerName="console" containerID="cri-o://93cdd66196dbe508cedab79de3d211fbb46f1aa5e25bdc9dc13456cbfa2348d1" gracePeriod=15 Apr 20 22:28:31.189933 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.189908 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59fbc94577-dsk6f_1aca88d7-e560-47cf-9109-629a93da89b7/console/0.log" Apr 20 22:28:31.190303 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.189971 2575 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59fbc94577-dsk6f" Apr 20 22:28:31.296874 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.296780 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1aca88d7-e560-47cf-9109-629a93da89b7-console-oauth-config\") pod \"1aca88d7-e560-47cf-9109-629a93da89b7\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " Apr 20 22:28:31.296874 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.296833 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-console-config\") pod \"1aca88d7-e560-47cf-9109-629a93da89b7\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " Apr 20 22:28:31.296874 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.296858 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqp8g\" (UniqueName: \"kubernetes.io/projected/1aca88d7-e560-47cf-9109-629a93da89b7-kube-api-access-qqp8g\") pod \"1aca88d7-e560-47cf-9109-629a93da89b7\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " Apr 20 22:28:31.297159 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.296913 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1aca88d7-e560-47cf-9109-629a93da89b7-console-serving-cert\") pod \"1aca88d7-e560-47cf-9109-629a93da89b7\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " Apr 20 22:28:31.297159 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.296929 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-service-ca\") pod \"1aca88d7-e560-47cf-9109-629a93da89b7\" (UID: 
\"1aca88d7-e560-47cf-9109-629a93da89b7\") " Apr 20 22:28:31.297159 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.296966 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-oauth-serving-cert\") pod \"1aca88d7-e560-47cf-9109-629a93da89b7\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " Apr 20 22:28:31.297159 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.297015 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-trusted-ca-bundle\") pod \"1aca88d7-e560-47cf-9109-629a93da89b7\" (UID: \"1aca88d7-e560-47cf-9109-629a93da89b7\") " Apr 20 22:28:31.297436 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.297403 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-console-config" (OuterVolumeSpecName: "console-config") pod "1aca88d7-e560-47cf-9109-629a93da89b7" (UID: "1aca88d7-e560-47cf-9109-629a93da89b7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:28:31.297585 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.297447 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-service-ca" (OuterVolumeSpecName: "service-ca") pod "1aca88d7-e560-47cf-9109-629a93da89b7" (UID: "1aca88d7-e560-47cf-9109-629a93da89b7"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:28:31.297585 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.297523 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1aca88d7-e560-47cf-9109-629a93da89b7" (UID: "1aca88d7-e560-47cf-9109-629a93da89b7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:28:31.297585 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.297562 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1aca88d7-e560-47cf-9109-629a93da89b7" (UID: "1aca88d7-e560-47cf-9109-629a93da89b7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:28:31.299125 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.299104 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aca88d7-e560-47cf-9109-629a93da89b7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1aca88d7-e560-47cf-9109-629a93da89b7" (UID: "1aca88d7-e560-47cf-9109-629a93da89b7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:28:31.299189 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.299158 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aca88d7-e560-47cf-9109-629a93da89b7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1aca88d7-e560-47cf-9109-629a93da89b7" (UID: "1aca88d7-e560-47cf-9109-629a93da89b7"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:28:31.299227 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.299182 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aca88d7-e560-47cf-9109-629a93da89b7-kube-api-access-qqp8g" (OuterVolumeSpecName: "kube-api-access-qqp8g") pod "1aca88d7-e560-47cf-9109-629a93da89b7" (UID: "1aca88d7-e560-47cf-9109-629a93da89b7"). InnerVolumeSpecName "kube-api-access-qqp8g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:28:31.397591 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.397553 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1aca88d7-e560-47cf-9109-629a93da89b7-console-serving-cert\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:28:31.397591 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.397585 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-service-ca\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:28:31.397591 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.397596 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-oauth-serving-cert\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:28:31.397814 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.397605 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-trusted-ca-bundle\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:28:31.397814 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.397614 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/1aca88d7-e560-47cf-9109-629a93da89b7-console-oauth-config\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:28:31.397814 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.397623 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1aca88d7-e560-47cf-9109-629a93da89b7-console-config\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:28:31.397814 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.397632 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qqp8g\" (UniqueName: \"kubernetes.io/projected/1aca88d7-e560-47cf-9109-629a93da89b7-kube-api-access-qqp8g\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:28:31.826590 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.826563 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59fbc94577-dsk6f_1aca88d7-e560-47cf-9109-629a93da89b7/console/0.log" Apr 20 22:28:31.826807 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.826604 2575 generic.go:358] "Generic (PLEG): container finished" podID="1aca88d7-e560-47cf-9109-629a93da89b7" containerID="93cdd66196dbe508cedab79de3d211fbb46f1aa5e25bdc9dc13456cbfa2348d1" exitCode=2 Apr 20 22:28:31.826807 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.826632 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59fbc94577-dsk6f" event={"ID":"1aca88d7-e560-47cf-9109-629a93da89b7","Type":"ContainerDied","Data":"93cdd66196dbe508cedab79de3d211fbb46f1aa5e25bdc9dc13456cbfa2348d1"} Apr 20 22:28:31.826807 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.826682 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59fbc94577-dsk6f" event={"ID":"1aca88d7-e560-47cf-9109-629a93da89b7","Type":"ContainerDied","Data":"7dc09ef5d727ec5df81dae32e6d11a1a668a476bd5b710e06d19a19aecac1189"} Apr 20 22:28:31.826807 
ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.826698 2575 scope.go:117] "RemoveContainer" containerID="93cdd66196dbe508cedab79de3d211fbb46f1aa5e25bdc9dc13456cbfa2348d1"
Apr 20 22:28:31.826807 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.826702 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59fbc94577-dsk6f"
Apr 20 22:28:31.828305 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.828278 2575 generic.go:358] "Generic (PLEG): container finished" podID="fb725401-c518-4635-a94b-4a25f83a989c" containerID="83fbb9bafb363792d343cbaccaaa07771280834584d659a244cd886731107b21" exitCode=0
Apr 20 22:28:31.828421 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.828339 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl" event={"ID":"fb725401-c518-4635-a94b-4a25f83a989c","Type":"ContainerDied","Data":"83fbb9bafb363792d343cbaccaaa07771280834584d659a244cd886731107b21"}
Apr 20 22:28:31.835173 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.835157 2575 scope.go:117] "RemoveContainer" containerID="93cdd66196dbe508cedab79de3d211fbb46f1aa5e25bdc9dc13456cbfa2348d1"
Apr 20 22:28:31.835410 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:28:31.835390 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93cdd66196dbe508cedab79de3d211fbb46f1aa5e25bdc9dc13456cbfa2348d1\": container with ID starting with 93cdd66196dbe508cedab79de3d211fbb46f1aa5e25bdc9dc13456cbfa2348d1 not found: ID does not exist" containerID="93cdd66196dbe508cedab79de3d211fbb46f1aa5e25bdc9dc13456cbfa2348d1"
Apr 20 22:28:31.835489 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.835419 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93cdd66196dbe508cedab79de3d211fbb46f1aa5e25bdc9dc13456cbfa2348d1"} err="failed to get container status \"93cdd66196dbe508cedab79de3d211fbb46f1aa5e25bdc9dc13456cbfa2348d1\": rpc error: code = NotFound desc = could not find container \"93cdd66196dbe508cedab79de3d211fbb46f1aa5e25bdc9dc13456cbfa2348d1\": container with ID starting with 93cdd66196dbe508cedab79de3d211fbb46f1aa5e25bdc9dc13456cbfa2348d1 not found: ID does not exist"
Apr 20 22:28:31.860251 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.860219 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59fbc94577-dsk6f"]
Apr 20 22:28:31.863158 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:31.863132 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-59fbc94577-dsk6f"]
Apr 20 22:28:33.013590 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:33.013556 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aca88d7-e560-47cf-9109-629a93da89b7" path="/var/lib/kubelet/pods/1aca88d7-e560-47cf-9109-629a93da89b7/volumes"
Apr 20 22:28:34.840805 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:34.840762 2575 generic.go:358] "Generic (PLEG): container finished" podID="fb725401-c518-4635-a94b-4a25f83a989c" containerID="640e70d4028b66d32efd771f9e1d12a4240c33c2c5ba19650eb346823d0e572b" exitCode=0
Apr 20 22:28:34.841248 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:34.840828 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl" event={"ID":"fb725401-c518-4635-a94b-4a25f83a989c","Type":"ContainerDied","Data":"640e70d4028b66d32efd771f9e1d12a4240c33c2c5ba19650eb346823d0e572b"}
Apr 20 22:28:44.869348 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:44.869307 2575 generic.go:358] "Generic (PLEG): container finished" podID="fb725401-c518-4635-a94b-4a25f83a989c" containerID="f93bc9043d184cd3da03590a86e9fc66b75edfe9d4701deb059d747503566542" exitCode=0
Apr 20 22:28:44.869767 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:44.869380 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl" event={"ID":"fb725401-c518-4635-a94b-4a25f83a989c","Type":"ContainerDied","Data":"f93bc9043d184cd3da03590a86e9fc66b75edfe9d4701deb059d747503566542"}
Apr 20 22:28:45.998890 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:45.998837 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl"
Apr 20 22:28:46.117601 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:46.117564 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm9lh\" (UniqueName: \"kubernetes.io/projected/fb725401-c518-4635-a94b-4a25f83a989c-kube-api-access-gm9lh\") pod \"fb725401-c518-4635-a94b-4a25f83a989c\" (UID: \"fb725401-c518-4635-a94b-4a25f83a989c\") "
Apr 20 22:28:46.117807 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:46.117702 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb725401-c518-4635-a94b-4a25f83a989c-util\") pod \"fb725401-c518-4635-a94b-4a25f83a989c\" (UID: \"fb725401-c518-4635-a94b-4a25f83a989c\") "
Apr 20 22:28:46.117807 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:46.117764 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb725401-c518-4635-a94b-4a25f83a989c-bundle\") pod \"fb725401-c518-4635-a94b-4a25f83a989c\" (UID: \"fb725401-c518-4635-a94b-4a25f83a989c\") "
Apr 20 22:28:46.118515 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:46.118484 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb725401-c518-4635-a94b-4a25f83a989c-bundle" (OuterVolumeSpecName: "bundle") pod "fb725401-c518-4635-a94b-4a25f83a989c" (UID: "fb725401-c518-4635-a94b-4a25f83a989c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 22:28:46.119984 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:46.119960 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb725401-c518-4635-a94b-4a25f83a989c-kube-api-access-gm9lh" (OuterVolumeSpecName: "kube-api-access-gm9lh") pod "fb725401-c518-4635-a94b-4a25f83a989c" (UID: "fb725401-c518-4635-a94b-4a25f83a989c"). InnerVolumeSpecName "kube-api-access-gm9lh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 22:28:46.122189 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:46.122145 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb725401-c518-4635-a94b-4a25f83a989c-util" (OuterVolumeSpecName: "util") pod "fb725401-c518-4635-a94b-4a25f83a989c" (UID: "fb725401-c518-4635-a94b-4a25f83a989c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 22:28:46.218262 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:46.218223 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gm9lh\" (UniqueName: \"kubernetes.io/projected/fb725401-c518-4635-a94b-4a25f83a989c-kube-api-access-gm9lh\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:28:46.218262 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:46.218254 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb725401-c518-4635-a94b-4a25f83a989c-util\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:28:46.218262 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:46.218265 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb725401-c518-4635-a94b-4a25f83a989c-bundle\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:28:46.875964 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:46.875936 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl"
Apr 20 22:28:46.876137 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:46.875926 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvp4nl" event={"ID":"fb725401-c518-4635-a94b-4a25f83a989c","Type":"ContainerDied","Data":"0de90107a075fcb105e275a55e35edb8217da03a429a13aa75b3a9fabf8df35a"}
Apr 20 22:28:46.876137 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:46.876046 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0de90107a075fcb105e275a55e35edb8217da03a429a13aa75b3a9fabf8df35a"
Apr 20 22:28:52.056063 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.056027 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-w8zpl"]
Apr 20 22:28:52.056535 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.056276 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb725401-c518-4635-a94b-4a25f83a989c" containerName="extract"
Apr 20 22:28:52.056535 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.056286 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb725401-c518-4635-a94b-4a25f83a989c" containerName="extract"
Apr 20 22:28:52.056535 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.056302 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1aca88d7-e560-47cf-9109-629a93da89b7" containerName="console"
Apr 20 22:28:52.056535 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.056308 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aca88d7-e560-47cf-9109-629a93da89b7" containerName="console"
Apr 20 22:28:52.056535 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.056317 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb725401-c518-4635-a94b-4a25f83a989c" containerName="util"
Apr 20 22:28:52.056535 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.056322 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb725401-c518-4635-a94b-4a25f83a989c" containerName="util"
Apr 20 22:28:52.056535 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.056329 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb725401-c518-4635-a94b-4a25f83a989c" containerName="pull"
Apr 20 22:28:52.056535 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.056334 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb725401-c518-4635-a94b-4a25f83a989c" containerName="pull"
Apr 20 22:28:52.056535 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.056369 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb725401-c518-4635-a94b-4a25f83a989c" containerName="extract"
Apr 20 22:28:52.056535 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.056377 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1aca88d7-e560-47cf-9109-629a93da89b7" containerName="console"
Apr 20 22:28:52.058050 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.058034 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-w8zpl"
Apr 20 22:28:52.060793 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.060765 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 20 22:28:52.060793 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.060779 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 20 22:28:52.060974 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.060816 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-2l8sq\""
Apr 20 22:28:52.071563 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.071537 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-w8zpl"]
Apr 20 22:28:52.164085 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.164042 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7c36f2a0-9bc1-45d6-a78b-d7075aa038c1-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-w8zpl\" (UID: \"7c36f2a0-9bc1-45d6-a78b-d7075aa038c1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-w8zpl"
Apr 20 22:28:52.164260 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.164150 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr26n\" (UniqueName: \"kubernetes.io/projected/7c36f2a0-9bc1-45d6-a78b-d7075aa038c1-kube-api-access-pr26n\") pod \"cert-manager-operator-controller-manager-54b9655956-w8zpl\" (UID: \"7c36f2a0-9bc1-45d6-a78b-d7075aa038c1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-w8zpl"
Apr 20 22:28:52.265191 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.265156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pr26n\" (UniqueName: \"kubernetes.io/projected/7c36f2a0-9bc1-45d6-a78b-d7075aa038c1-kube-api-access-pr26n\") pod \"cert-manager-operator-controller-manager-54b9655956-w8zpl\" (UID: \"7c36f2a0-9bc1-45d6-a78b-d7075aa038c1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-w8zpl"
Apr 20 22:28:52.265368 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.265208 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7c36f2a0-9bc1-45d6-a78b-d7075aa038c1-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-w8zpl\" (UID: \"7c36f2a0-9bc1-45d6-a78b-d7075aa038c1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-w8zpl"
Apr 20 22:28:52.265551 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.265536 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7c36f2a0-9bc1-45d6-a78b-d7075aa038c1-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-w8zpl\" (UID: \"7c36f2a0-9bc1-45d6-a78b-d7075aa038c1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-w8zpl"
Apr 20 22:28:52.275058 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.275029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr26n\" (UniqueName: \"kubernetes.io/projected/7c36f2a0-9bc1-45d6-a78b-d7075aa038c1-kube-api-access-pr26n\") pod \"cert-manager-operator-controller-manager-54b9655956-w8zpl\" (UID: \"7c36f2a0-9bc1-45d6-a78b-d7075aa038c1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-w8zpl"
Apr 20 22:28:52.366833 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.366802 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-w8zpl"
Apr 20 22:28:52.491415 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.491392 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-w8zpl"]
Apr 20 22:28:52.494263 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:28:52.494228 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c36f2a0_9bc1_45d6_a78b_d7075aa038c1.slice/crio-f93431f30c848acd8f1532bb7f8706e2f18c129efc5d7714ab8403daf9b64d10 WatchSource:0}: Error finding container f93431f30c848acd8f1532bb7f8706e2f18c129efc5d7714ab8403daf9b64d10: Status 404 returned error can't find the container with id f93431f30c848acd8f1532bb7f8706e2f18c129efc5d7714ab8403daf9b64d10
Apr 20 22:28:52.892742 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:52.892706 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-w8zpl" event={"ID":"7c36f2a0-9bc1-45d6-a78b-d7075aa038c1","Type":"ContainerStarted","Data":"f93431f30c848acd8f1532bb7f8706e2f18c129efc5d7714ab8403daf9b64d10"}
Apr 20 22:28:54.901360 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:54.901318 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-w8zpl" event={"ID":"7c36f2a0-9bc1-45d6-a78b-d7075aa038c1","Type":"ContainerStarted","Data":"49771ff4825cfe6fe8f5663ff567cebcca86f7cc059ffcfddb13a0c9476eef15"}
Apr 20 22:28:54.921659 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:54.921600 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-w8zpl" podStartSLOduration=1.465438817 podStartE2EDuration="2.921584709s" podCreationTimestamp="2026-04-20 22:28:52 +0000 UTC" firstStartedPulling="2026-04-20 22:28:52.496733666 +0000 UTC m=+262.110890203" lastFinishedPulling="2026-04-20 22:28:53.952879559 +0000 UTC m=+263.567036095" observedRunningTime="2026-04-20 22:28:54.919034201 +0000 UTC m=+264.533190760" watchObservedRunningTime="2026-04-20 22:28:54.921584709 +0000 UTC m=+264.535741268"
Apr 20 22:28:56.145757 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.145722 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49"]
Apr 20 22:28:56.148029 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.148011 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49"
Apr 20 22:28:56.151233 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.151210 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 22:28:56.152454 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.152433 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-4psch\""
Apr 20 22:28:56.152514 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.152485 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 22:28:56.157867 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.157833 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49"]
Apr 20 22:28:56.198789 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.198755 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd73bcf6-1d4a-4143-a103-46121f100a43-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49\" (UID: \"cd73bcf6-1d4a-4143-a103-46121f100a43\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49"
Apr 20 22:28:56.198976 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.198808 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd73bcf6-1d4a-4143-a103-46121f100a43-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49\" (UID: \"cd73bcf6-1d4a-4143-a103-46121f100a43\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49"
Apr 20 22:28:56.198976 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.198861 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fgbq\" (UniqueName: \"kubernetes.io/projected/cd73bcf6-1d4a-4143-a103-46121f100a43-kube-api-access-2fgbq\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49\" (UID: \"cd73bcf6-1d4a-4143-a103-46121f100a43\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49"
Apr 20 22:28:56.300007 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.299965 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd73bcf6-1d4a-4143-a103-46121f100a43-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49\" (UID: \"cd73bcf6-1d4a-4143-a103-46121f100a43\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49"
Apr 20 22:28:56.300007 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.300004 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fgbq\" (UniqueName: \"kubernetes.io/projected/cd73bcf6-1d4a-4143-a103-46121f100a43-kube-api-access-2fgbq\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49\" (UID: \"cd73bcf6-1d4a-4143-a103-46121f100a43\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49"
Apr 20 22:28:56.300219 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.300054 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd73bcf6-1d4a-4143-a103-46121f100a43-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49\" (UID: \"cd73bcf6-1d4a-4143-a103-46121f100a43\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49"
Apr 20 22:28:56.300386 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.300365 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd73bcf6-1d4a-4143-a103-46121f100a43-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49\" (UID: \"cd73bcf6-1d4a-4143-a103-46121f100a43\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49"
Apr 20 22:28:56.300424 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.300390 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd73bcf6-1d4a-4143-a103-46121f100a43-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49\" (UID: \"cd73bcf6-1d4a-4143-a103-46121f100a43\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49"
Apr 20 22:28:56.308227 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.308201 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fgbq\" (UniqueName: \"kubernetes.io/projected/cd73bcf6-1d4a-4143-a103-46121f100a43-kube-api-access-2fgbq\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49\" (UID: \"cd73bcf6-1d4a-4143-a103-46121f100a43\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49"
Apr 20 22:28:56.458394 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.458285 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49"
Apr 20 22:28:56.586298 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.586263 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49"]
Apr 20 22:28:56.589199 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:28:56.589167 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd73bcf6_1d4a_4143_a103_46121f100a43.slice/crio-ba72f61a9d6de37c0c58e971376f6391450da189767515df0e952ebc8642cf47 WatchSource:0}: Error finding container ba72f61a9d6de37c0c58e971376f6391450da189767515df0e952ebc8642cf47: Status 404 returned error can't find the container with id ba72f61a9d6de37c0c58e971376f6391450da189767515df0e952ebc8642cf47
Apr 20 22:28:56.908476 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.908441 2575 generic.go:358] "Generic (PLEG): container finished" podID="cd73bcf6-1d4a-4143-a103-46121f100a43" containerID="e15f9de635db3479e2683de59bc86a07682bb0c1327cc19181f64b2390d4fb67" exitCode=0
Apr 20 22:28:56.908641 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.908481 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49" event={"ID":"cd73bcf6-1d4a-4143-a103-46121f100a43","Type":"ContainerDied","Data":"e15f9de635db3479e2683de59bc86a07682bb0c1327cc19181f64b2390d4fb67"}
Apr 20 22:28:56.908641 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:56.908507 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49" event={"ID":"cd73bcf6-1d4a-4143-a103-46121f100a43","Type":"ContainerStarted","Data":"ba72f61a9d6de37c0c58e971376f6391450da189767515df0e952ebc8642cf47"}
Apr 20 22:28:57.449805 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:57.449752 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-nxnw5"]
Apr 20 22:28:57.452224 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:57.452190 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-nxnw5"
Apr 20 22:28:57.454847 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:57.454821 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 20 22:28:57.455009 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:57.454996 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 20 22:28:57.455081 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:57.455036 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-tr9fg\""
Apr 20 22:28:57.462423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:57.462390 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-nxnw5"]
Apr 20 22:28:57.511516 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:57.511471 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkhnj\" (UniqueName: \"kubernetes.io/projected/8e36e2c0-2caa-4642-97ee-099faea4ba6c-kube-api-access-fkhnj\") pod \"cert-manager-webhook-587ccfb98-nxnw5\" (UID: \"8e36e2c0-2caa-4642-97ee-099faea4ba6c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-nxnw5"
Apr 20 22:28:57.511721 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:57.511589 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e36e2c0-2caa-4642-97ee-099faea4ba6c-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-nxnw5\" (UID: \"8e36e2c0-2caa-4642-97ee-099faea4ba6c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-nxnw5"
Apr 20 22:28:57.612503 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:57.612468 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkhnj\" (UniqueName: \"kubernetes.io/projected/8e36e2c0-2caa-4642-97ee-099faea4ba6c-kube-api-access-fkhnj\") pod \"cert-manager-webhook-587ccfb98-nxnw5\" (UID: \"8e36e2c0-2caa-4642-97ee-099faea4ba6c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-nxnw5"
Apr 20 22:28:57.612709 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:57.612523 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e36e2c0-2caa-4642-97ee-099faea4ba6c-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-nxnw5\" (UID: \"8e36e2c0-2caa-4642-97ee-099faea4ba6c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-nxnw5"
Apr 20 22:28:57.621829 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:57.621797 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e36e2c0-2caa-4642-97ee-099faea4ba6c-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-nxnw5\" (UID: \"8e36e2c0-2caa-4642-97ee-099faea4ba6c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-nxnw5"
Apr 20 22:28:57.621993 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:57.621852 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkhnj\" (UniqueName: \"kubernetes.io/projected/8e36e2c0-2caa-4642-97ee-099faea4ba6c-kube-api-access-fkhnj\") pod \"cert-manager-webhook-587ccfb98-nxnw5\" (UID: \"8e36e2c0-2caa-4642-97ee-099faea4ba6c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-nxnw5"
Apr 20 22:28:57.771639 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:57.771597 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-nxnw5"
Apr 20 22:28:57.921103 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:57.921060 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-nxnw5"]
Apr 20 22:28:57.926823 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:28:57.926792 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e36e2c0_2caa_4642_97ee_099faea4ba6c.slice/crio-ae711ea7b910d876483cc72c9476c551decc0286af6595fd935208be84f131c2 WatchSource:0}: Error finding container ae711ea7b910d876483cc72c9476c551decc0286af6595fd935208be84f131c2: Status 404 returned error can't find the container with id ae711ea7b910d876483cc72c9476c551decc0286af6595fd935208be84f131c2
Apr 20 22:28:58.916979 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:58.916940 2575 generic.go:358] "Generic (PLEG): container finished" podID="cd73bcf6-1d4a-4143-a103-46121f100a43" containerID="cc7e1f72e12a4368d1603deb3f04329752213ee0494a1389819a1fd1e3f678d5" exitCode=0
Apr 20 22:28:58.917451 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:58.917026 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49" event={"ID":"cd73bcf6-1d4a-4143-a103-46121f100a43","Type":"ContainerDied","Data":"cc7e1f72e12a4368d1603deb3f04329752213ee0494a1389819a1fd1e3f678d5"}
Apr 20 22:28:58.918537 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:58.918510 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-nxnw5" event={"ID":"8e36e2c0-2caa-4642-97ee-099faea4ba6c","Type":"ContainerStarted","Data":"ae711ea7b910d876483cc72c9476c551decc0286af6595fd935208be84f131c2"}
Apr 20 22:28:59.925900 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:59.925854 2575 generic.go:358] "Generic (PLEG): container finished" podID="cd73bcf6-1d4a-4143-a103-46121f100a43" containerID="2b9a012e126530c0951a0a0aaa237cf3123590eb7fa59dc2887649b2e2294239" exitCode=0
Apr 20 22:28:59.925900 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:59.925907 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49" event={"ID":"cd73bcf6-1d4a-4143-a103-46121f100a43","Type":"ContainerDied","Data":"2b9a012e126530c0951a0a0aaa237cf3123590eb7fa59dc2887649b2e2294239"}
Apr 20 22:28:59.984951 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:59.984913 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-n4hvn"]
Apr 20 22:28:59.987249 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:59.987220 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-n4hvn"
Apr 20 22:28:59.990126 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:59.990099 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-4l44d\""
Apr 20 22:28:59.998386 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:28:59.998340 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-n4hvn"]
Apr 20 22:29:00.035911 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:00.035873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c97fcd5-27d9-4778-9291-dc98ca4e6215-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-n4hvn\" (UID: \"8c97fcd5-27d9-4778-9291-dc98ca4e6215\") " pod="cert-manager/cert-manager-cainjector-68b757865b-n4hvn"
Apr 20 22:29:00.036111 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:00.036012 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p92vq\" (UniqueName: \"kubernetes.io/projected/8c97fcd5-27d9-4778-9291-dc98ca4e6215-kube-api-access-p92vq\") pod \"cert-manager-cainjector-68b757865b-n4hvn\" (UID: \"8c97fcd5-27d9-4778-9291-dc98ca4e6215\") " pod="cert-manager/cert-manager-cainjector-68b757865b-n4hvn"
Apr 20 22:29:00.136809 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:00.136771 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p92vq\" (UniqueName: \"kubernetes.io/projected/8c97fcd5-27d9-4778-9291-dc98ca4e6215-kube-api-access-p92vq\") pod \"cert-manager-cainjector-68b757865b-n4hvn\" (UID: \"8c97fcd5-27d9-4778-9291-dc98ca4e6215\") " pod="cert-manager/cert-manager-cainjector-68b757865b-n4hvn"
Apr 20 22:29:00.136809 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:00.136827 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c97fcd5-27d9-4778-9291-dc98ca4e6215-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-n4hvn\" (UID: \"8c97fcd5-27d9-4778-9291-dc98ca4e6215\") " pod="cert-manager/cert-manager-cainjector-68b757865b-n4hvn"
Apr 20 22:29:00.145834 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:00.145800 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c97fcd5-27d9-4778-9291-dc98ca4e6215-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-n4hvn\" (UID: \"8c97fcd5-27d9-4778-9291-dc98ca4e6215\") " pod="cert-manager/cert-manager-cainjector-68b757865b-n4hvn"
Apr 20 22:29:00.145999 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:00.145924 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p92vq\" (UniqueName: \"kubernetes.io/projected/8c97fcd5-27d9-4778-9291-dc98ca4e6215-kube-api-access-p92vq\") pod \"cert-manager-cainjector-68b757865b-n4hvn\" (UID: \"8c97fcd5-27d9-4778-9291-dc98ca4e6215\") " pod="cert-manager/cert-manager-cainjector-68b757865b-n4hvn"
Apr 20 22:29:00.298866 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:00.298771 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-n4hvn"
Apr 20 22:29:00.670922 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:00.670888 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-n4hvn"]
Apr 20 22:29:00.673865 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:29:00.673833 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c97fcd5_27d9_4778_9291_dc98ca4e6215.slice/crio-f758d370d4f899f1bd76cc344a6b8559cc8b418edd54c3d1ba02b75c788d73cb WatchSource:0}: Error finding container f758d370d4f899f1bd76cc344a6b8559cc8b418edd54c3d1ba02b75c788d73cb: Status 404 returned error can't find the container with id f758d370d4f899f1bd76cc344a6b8559cc8b418edd54c3d1ba02b75c788d73cb
Apr 20 22:29:00.930478 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:00.930436 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-nxnw5" event={"ID":"8e36e2c0-2caa-4642-97ee-099faea4ba6c","Type":"ContainerStarted","Data":"b05f0daa4299be6bf12e9e5c87c5fd865d02042c98a3efcbb95b3d18fe821d0c"}
Apr 20 22:29:00.930968 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:00.930550 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-nxnw5"
Apr 20 22:29:00.932053 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:00.932025 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-n4hvn" event={"ID":"8c97fcd5-27d9-4778-9291-dc98ca4e6215","Type":"ContainerStarted","Data":"74fd7a52c90adbd80f63e618f43b8f7596e3ae543cd5faeaf8b35f3470fb6e53"}
Apr 20 22:29:00.932160 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:00.932061 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-n4hvn" event={"ID":"8c97fcd5-27d9-4778-9291-dc98ca4e6215","Type":"ContainerStarted","Data":"f758d370d4f899f1bd76cc344a6b8559cc8b418edd54c3d1ba02b75c788d73cb"}
Apr 20 22:29:00.949101 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:00.949041 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-nxnw5" podStartSLOduration=1.324811984 podStartE2EDuration="3.94902352s" podCreationTimestamp="2026-04-20 22:28:57 +0000 UTC" firstStartedPulling="2026-04-20 22:28:57.928931694 +0000 UTC m=+267.543088245" lastFinishedPulling="2026-04-20 22:29:00.553143245 +0000 UTC m=+270.167299781" observedRunningTime="2026-04-20 22:29:00.947956536 +0000 UTC m=+270.562113095" watchObservedRunningTime="2026-04-20 22:29:00.94902352 +0000 UTC m=+270.563180079"
Apr 20 22:29:00.963981 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:00.963930 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-n4hvn" podStartSLOduration=1.963915615 podStartE2EDuration="1.963915615s" podCreationTimestamp="2026-04-20 22:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:29:00.962421593 +0000 UTC m=+270.576578152" watchObservedRunningTime="2026-04-20 22:29:00.963915615 +0000 UTC m=+270.578072173"
Apr 20 22:29:01.057759 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:01.057729 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49"
Apr 20 22:29:01.147484 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:01.147438 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd73bcf6-1d4a-4143-a103-46121f100a43-util\") pod \"cd73bcf6-1d4a-4143-a103-46121f100a43\" (UID: \"cd73bcf6-1d4a-4143-a103-46121f100a43\") "
Apr 20 22:29:01.147649 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:01.147520 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fgbq\" (UniqueName: \"kubernetes.io/projected/cd73bcf6-1d4a-4143-a103-46121f100a43-kube-api-access-2fgbq\") pod \"cd73bcf6-1d4a-4143-a103-46121f100a43\" (UID: \"cd73bcf6-1d4a-4143-a103-46121f100a43\") "
Apr 20 22:29:01.147649 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:01.147625 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd73bcf6-1d4a-4143-a103-46121f100a43-bundle\") pod \"cd73bcf6-1d4a-4143-a103-46121f100a43\" (UID: \"cd73bcf6-1d4a-4143-a103-46121f100a43\") "
Apr 20 22:29:01.148040 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:01.148004 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd73bcf6-1d4a-4143-a103-46121f100a43-bundle" (OuterVolumeSpecName: "bundle") pod "cd73bcf6-1d4a-4143-a103-46121f100a43" (UID: "cd73bcf6-1d4a-4143-a103-46121f100a43"). InnerVolumeSpecName "bundle".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:29:01.149768 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:01.149738 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd73bcf6-1d4a-4143-a103-46121f100a43-kube-api-access-2fgbq" (OuterVolumeSpecName: "kube-api-access-2fgbq") pod "cd73bcf6-1d4a-4143-a103-46121f100a43" (UID: "cd73bcf6-1d4a-4143-a103-46121f100a43"). InnerVolumeSpecName "kube-api-access-2fgbq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:29:01.154500 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:01.154425 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd73bcf6-1d4a-4143-a103-46121f100a43-util" (OuterVolumeSpecName: "util") pod "cd73bcf6-1d4a-4143-a103-46121f100a43" (UID: "cd73bcf6-1d4a-4143-a103-46121f100a43"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:29:01.248544 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:01.248456 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd73bcf6-1d4a-4143-a103-46121f100a43-bundle\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:29:01.248544 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:01.248490 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd73bcf6-1d4a-4143-a103-46121f100a43-util\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:29:01.248544 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:01.248502 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2fgbq\" (UniqueName: \"kubernetes.io/projected/cd73bcf6-1d4a-4143-a103-46121f100a43-kube-api-access-2fgbq\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:29:01.936315 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:01.936274 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49" event={"ID":"cd73bcf6-1d4a-4143-a103-46121f100a43","Type":"ContainerDied","Data":"ba72f61a9d6de37c0c58e971376f6391450da189767515df0e952ebc8642cf47"} Apr 20 22:29:01.936315 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:01.936319 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba72f61a9d6de37c0c58e971376f6391450da189767515df0e952ebc8642cf47" Apr 20 22:29:01.936764 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:01.936330 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frhs49" Apr 20 22:29:06.939200 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:06.939170 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-nxnw5" Apr 20 22:29:19.107607 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.107552 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v"] Apr 20 22:29:19.108092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.107850 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd73bcf6-1d4a-4143-a103-46121f100a43" containerName="util" Apr 20 22:29:19.108092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.107863 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd73bcf6-1d4a-4143-a103-46121f100a43" containerName="util" Apr 20 22:29:19.108092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.107875 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd73bcf6-1d4a-4143-a103-46121f100a43" containerName="extract" Apr 20 22:29:19.108092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.107880 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cd73bcf6-1d4a-4143-a103-46121f100a43" containerName="extract" Apr 20 22:29:19.108092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.107890 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd73bcf6-1d4a-4143-a103-46121f100a43" containerName="pull" Apr 20 22:29:19.108092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.107896 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd73bcf6-1d4a-4143-a103-46121f100a43" containerName="pull" Apr 20 22:29:19.108092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.107938 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd73bcf6-1d4a-4143-a103-46121f100a43" containerName="extract" Apr 20 22:29:19.110851 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.110833 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v" Apr 20 22:29:19.113319 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.113297 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-4psch\"" Apr 20 22:29:19.113580 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.113564 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 22:29:19.114514 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.114498 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 22:29:19.121799 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.121771 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v"] Apr 20 22:29:19.183430 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.183393 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0e54224-6249-4939-b00c-eca6a84a08f3-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v\" (UID: \"c0e54224-6249-4939-b00c-eca6a84a08f3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v" Apr 20 22:29:19.183430 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.183443 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0e54224-6249-4939-b00c-eca6a84a08f3-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v\" (UID: \"c0e54224-6249-4939-b00c-eca6a84a08f3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v" Apr 20 22:29:19.183667 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.183487 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pl6p\" (UniqueName: \"kubernetes.io/projected/c0e54224-6249-4939-b00c-eca6a84a08f3-kube-api-access-6pl6p\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v\" (UID: \"c0e54224-6249-4939-b00c-eca6a84a08f3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v" Apr 20 22:29:19.284360 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.284312 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0e54224-6249-4939-b00c-eca6a84a08f3-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v\" (UID: \"c0e54224-6249-4939-b00c-eca6a84a08f3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v" Apr 20 22:29:19.284533 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.284400 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6pl6p\" (UniqueName: \"kubernetes.io/projected/c0e54224-6249-4939-b00c-eca6a84a08f3-kube-api-access-6pl6p\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v\" (UID: \"c0e54224-6249-4939-b00c-eca6a84a08f3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v" Apr 20 22:29:19.284533 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.284458 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0e54224-6249-4939-b00c-eca6a84a08f3-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v\" (UID: \"c0e54224-6249-4939-b00c-eca6a84a08f3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v" Apr 20 22:29:19.284830 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.284808 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0e54224-6249-4939-b00c-eca6a84a08f3-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v\" (UID: \"c0e54224-6249-4939-b00c-eca6a84a08f3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v" Apr 20 22:29:19.284878 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.284816 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0e54224-6249-4939-b00c-eca6a84a08f3-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v\" (UID: \"c0e54224-6249-4939-b00c-eca6a84a08f3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v" Apr 20 22:29:19.292821 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.292794 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pl6p\" (UniqueName: 
\"kubernetes.io/projected/c0e54224-6249-4939-b00c-eca6a84a08f3-kube-api-access-6pl6p\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v\" (UID: \"c0e54224-6249-4939-b00c-eca6a84a08f3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v" Apr 20 22:29:19.421921 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.421824 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v" Apr 20 22:29:19.542577 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.542553 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v"] Apr 20 22:29:19.544600 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:29:19.544569 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0e54224_6249_4939_b00c_eca6a84a08f3.slice/crio-250377f1f31c387434e15e42152ded4b18caff4eb610378155bcc1b38444f781 WatchSource:0}: Error finding container 250377f1f31c387434e15e42152ded4b18caff4eb610378155bcc1b38444f781: Status 404 returned error can't find the container with id 250377f1f31c387434e15e42152ded4b18caff4eb610378155bcc1b38444f781 Apr 20 22:29:19.993440 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.993398 2575 generic.go:358] "Generic (PLEG): container finished" podID="c0e54224-6249-4939-b00c-eca6a84a08f3" containerID="22f784fa5f170fb4372ba343be5126dd2eb8ff8802b4c38211b9127a019bd690" exitCode=0 Apr 20 22:29:19.993618 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.993480 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v" event={"ID":"c0e54224-6249-4939-b00c-eca6a84a08f3","Type":"ContainerDied","Data":"22f784fa5f170fb4372ba343be5126dd2eb8ff8802b4c38211b9127a019bd690"} Apr 20 
22:29:19.993618 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:19.993525 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v" event={"ID":"c0e54224-6249-4939-b00c-eca6a84a08f3","Type":"ContainerStarted","Data":"250377f1f31c387434e15e42152ded4b18caff4eb610378155bcc1b38444f781"} Apr 20 22:29:20.997730 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:20.997627 2575 generic.go:358] "Generic (PLEG): container finished" podID="c0e54224-6249-4939-b00c-eca6a84a08f3" containerID="09f7bc15c3a1c4a1a5d77604e11024336776a16b148ec4294352ea92f357fc8d" exitCode=0 Apr 20 22:29:20.997730 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:20.997704 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v" event={"ID":"c0e54224-6249-4939-b00c-eca6a84a08f3","Type":"ContainerDied","Data":"09f7bc15c3a1c4a1a5d77604e11024336776a16b148ec4294352ea92f357fc8d"} Apr 20 22:29:22.002603 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:22.002565 2575 generic.go:358] "Generic (PLEG): container finished" podID="c0e54224-6249-4939-b00c-eca6a84a08f3" containerID="2b7f92b5abc55e69a7b33939bfd2dc961154cccaabed22156620c79e004e5b55" exitCode=0 Apr 20 22:29:22.003014 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:22.002647 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v" event={"ID":"c0e54224-6249-4939-b00c-eca6a84a08f3","Type":"ContainerDied","Data":"2b7f92b5abc55e69a7b33939bfd2dc961154cccaabed22156620c79e004e5b55"} Apr 20 22:29:23.133709 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:23.133666 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v" Apr 20 22:29:23.217169 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:23.217139 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pl6p\" (UniqueName: \"kubernetes.io/projected/c0e54224-6249-4939-b00c-eca6a84a08f3-kube-api-access-6pl6p\") pod \"c0e54224-6249-4939-b00c-eca6a84a08f3\" (UID: \"c0e54224-6249-4939-b00c-eca6a84a08f3\") " Apr 20 22:29:23.217356 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:23.217193 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0e54224-6249-4939-b00c-eca6a84a08f3-bundle\") pod \"c0e54224-6249-4939-b00c-eca6a84a08f3\" (UID: \"c0e54224-6249-4939-b00c-eca6a84a08f3\") " Apr 20 22:29:23.217356 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:23.217239 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0e54224-6249-4939-b00c-eca6a84a08f3-util\") pod \"c0e54224-6249-4939-b00c-eca6a84a08f3\" (UID: \"c0e54224-6249-4939-b00c-eca6a84a08f3\") " Apr 20 22:29:23.217896 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:23.217873 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e54224-6249-4939-b00c-eca6a84a08f3-bundle" (OuterVolumeSpecName: "bundle") pod "c0e54224-6249-4939-b00c-eca6a84a08f3" (UID: "c0e54224-6249-4939-b00c-eca6a84a08f3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:29:23.219366 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:23.219338 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0e54224-6249-4939-b00c-eca6a84a08f3-kube-api-access-6pl6p" (OuterVolumeSpecName: "kube-api-access-6pl6p") pod "c0e54224-6249-4939-b00c-eca6a84a08f3" (UID: "c0e54224-6249-4939-b00c-eca6a84a08f3"). InnerVolumeSpecName "kube-api-access-6pl6p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:29:23.223239 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:23.223215 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e54224-6249-4939-b00c-eca6a84a08f3-util" (OuterVolumeSpecName: "util") pod "c0e54224-6249-4939-b00c-eca6a84a08f3" (UID: "c0e54224-6249-4939-b00c-eca6a84a08f3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:29:23.318065 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:23.317960 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0e54224-6249-4939-b00c-eca6a84a08f3-util\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:29:23.318065 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:23.318009 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6pl6p\" (UniqueName: \"kubernetes.io/projected/c0e54224-6249-4939-b00c-eca6a84a08f3-kube-api-access-6pl6p\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:29:23.318065 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:23.318024 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0e54224-6249-4939-b00c-eca6a84a08f3-bundle\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:29:24.015233 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:24.015195 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v" event={"ID":"c0e54224-6249-4939-b00c-eca6a84a08f3","Type":"ContainerDied","Data":"250377f1f31c387434e15e42152ded4b18caff4eb610378155bcc1b38444f781"} Apr 20 22:29:24.015233 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:24.015229 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="250377f1f31c387434e15e42152ded4b18caff4eb610378155bcc1b38444f781" Apr 20 22:29:24.015439 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:24.015253 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5j7t7v" Apr 20 22:29:28.347686 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.347644 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd"] Apr 20 22:29:28.348319 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.348062 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0e54224-6249-4939-b00c-eca6a84a08f3" containerName="util" Apr 20 22:29:28.348319 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.348080 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e54224-6249-4939-b00c-eca6a84a08f3" containerName="util" Apr 20 22:29:28.348319 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.348103 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0e54224-6249-4939-b00c-eca6a84a08f3" containerName="extract" Apr 20 22:29:28.348319 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.348111 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e54224-6249-4939-b00c-eca6a84a08f3" containerName="extract" Apr 20 22:29:28.348319 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.348121 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="c0e54224-6249-4939-b00c-eca6a84a08f3" containerName="pull" Apr 20 22:29:28.348319 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.348129 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e54224-6249-4939-b00c-eca6a84a08f3" containerName="pull" Apr 20 22:29:28.348319 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.348204 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0e54224-6249-4939-b00c-eca6a84a08f3" containerName="extract" Apr 20 22:29:28.354259 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.354230 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd" Apr 20 22:29:28.356904 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.356873 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 22:29:28.357051 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.356904 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-4psch\"" Apr 20 22:29:28.357051 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.356875 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 22:29:28.360122 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.360080 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd"] Apr 20 22:29:28.460462 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.460428 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f6ed4f0-8a6a-480d-bc09-3f176553ce09-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd\" (UID: 
\"3f6ed4f0-8a6a-480d-bc09-3f176553ce09\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd" Apr 20 22:29:28.460663 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.460477 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5qdh\" (UniqueName: \"kubernetes.io/projected/3f6ed4f0-8a6a-480d-bc09-3f176553ce09-kube-api-access-d5qdh\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd\" (UID: \"3f6ed4f0-8a6a-480d-bc09-3f176553ce09\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd" Apr 20 22:29:28.460663 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.460579 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f6ed4f0-8a6a-480d-bc09-3f176553ce09-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd\" (UID: \"3f6ed4f0-8a6a-480d-bc09-3f176553ce09\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd" Apr 20 22:29:28.561776 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.561730 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5qdh\" (UniqueName: \"kubernetes.io/projected/3f6ed4f0-8a6a-480d-bc09-3f176553ce09-kube-api-access-d5qdh\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd\" (UID: \"3f6ed4f0-8a6a-480d-bc09-3f176553ce09\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd" Apr 20 22:29:28.561981 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.561804 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f6ed4f0-8a6a-480d-bc09-3f176553ce09-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd\" (UID: 
\"3f6ed4f0-8a6a-480d-bc09-3f176553ce09\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd" Apr 20 22:29:28.561981 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.561866 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f6ed4f0-8a6a-480d-bc09-3f176553ce09-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd\" (UID: \"3f6ed4f0-8a6a-480d-bc09-3f176553ce09\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd" Apr 20 22:29:28.562193 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.562170 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f6ed4f0-8a6a-480d-bc09-3f176553ce09-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd\" (UID: \"3f6ed4f0-8a6a-480d-bc09-3f176553ce09\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd" Apr 20 22:29:28.562271 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.562246 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f6ed4f0-8a6a-480d-bc09-3f176553ce09-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd\" (UID: \"3f6ed4f0-8a6a-480d-bc09-3f176553ce09\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd" Apr 20 22:29:28.575140 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.575106 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5qdh\" (UniqueName: \"kubernetes.io/projected/3f6ed4f0-8a6a-480d-bc09-3f176553ce09-kube-api-access-d5qdh\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd\" (UID: \"3f6ed4f0-8a6a-480d-bc09-3f176553ce09\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd" Apr 20 22:29:28.664955 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.664914 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd" Apr 20 22:29:28.799847 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:28.799806 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd"] Apr 20 22:29:28.802759 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:29:28.802733 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f6ed4f0_8a6a_480d_bc09_3f176553ce09.slice/crio-dd0a1d5ec4b07517857efa5d5b530b81a84d4b6ce3cbe0e45470d4cc1e9b9e87 WatchSource:0}: Error finding container dd0a1d5ec4b07517857efa5d5b530b81a84d4b6ce3cbe0e45470d4cc1e9b9e87: Status 404 returned error can't find the container with id dd0a1d5ec4b07517857efa5d5b530b81a84d4b6ce3cbe0e45470d4cc1e9b9e87 Apr 20 22:29:29.031027 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:29.030933 2575 generic.go:358] "Generic (PLEG): container finished" podID="3f6ed4f0-8a6a-480d-bc09-3f176553ce09" containerID="c3db747f309926e15f9b160b5d747d9ed84a5e0be73ce2c060be5fe372c0a039" exitCode=0 Apr 20 22:29:29.031027 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:29.030984 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd" event={"ID":"3f6ed4f0-8a6a-480d-bc09-3f176553ce09","Type":"ContainerDied","Data":"c3db747f309926e15f9b160b5d747d9ed84a5e0be73ce2c060be5fe372c0a039"} Apr 20 22:29:29.031027 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:29.031006 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd" event={"ID":"3f6ed4f0-8a6a-480d-bc09-3f176553ce09","Type":"ContainerStarted","Data":"dd0a1d5ec4b07517857efa5d5b530b81a84d4b6ce3cbe0e45470d4cc1e9b9e87"} Apr 20 22:29:30.040982 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.040946 2575 generic.go:358] "Generic (PLEG): container finished" podID="3f6ed4f0-8a6a-480d-bc09-3f176553ce09" containerID="9befb007de27b6dfe6aa2602c1954c783098db65ff3a58dcfecdc27b18ab0d62" exitCode=0 Apr 20 22:29:30.041364 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.041029 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd" event={"ID":"3f6ed4f0-8a6a-480d-bc09-3f176553ce09","Type":"ContainerDied","Data":"9befb007de27b6dfe6aa2602c1954c783098db65ff3a58dcfecdc27b18ab0d62"} Apr 20 22:29:30.192183 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.192148 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf"] Apr 20 22:29:30.195513 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.195489 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf" Apr 20 22:29:30.198374 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.198343 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 22:29:30.198374 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.198363 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 22:29:30.198695 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.198655 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 22:29:30.198913 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.198897 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-q26g9\"" Apr 20 22:29:30.198965 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.198910 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 22:29:30.207558 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.207537 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf"] Apr 20 22:29:30.378392 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.378360 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdxkk\" (UniqueName: \"kubernetes.io/projected/8fe52c31-d1e3-4af9-92c0-4bae8f481ac3-kube-api-access-vdxkk\") pod \"opendatahub-operator-controller-manager-5d8d569d47-4c2vf\" (UID: \"8fe52c31-d1e3-4af9-92c0-4bae8f481ac3\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf" Apr 20 22:29:30.378558 ip-10-0-132-177 kubenswrapper[2575]: I0420 
22:29:30.378404 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8fe52c31-d1e3-4af9-92c0-4bae8f481ac3-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d8d569d47-4c2vf\" (UID: \"8fe52c31-d1e3-4af9-92c0-4bae8f481ac3\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf" Apr 20 22:29:30.378558 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.378427 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8fe52c31-d1e3-4af9-92c0-4bae8f481ac3-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d8d569d47-4c2vf\" (UID: \"8fe52c31-d1e3-4af9-92c0-4bae8f481ac3\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf" Apr 20 22:29:30.479815 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.479776 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdxkk\" (UniqueName: \"kubernetes.io/projected/8fe52c31-d1e3-4af9-92c0-4bae8f481ac3-kube-api-access-vdxkk\") pod \"opendatahub-operator-controller-manager-5d8d569d47-4c2vf\" (UID: \"8fe52c31-d1e3-4af9-92c0-4bae8f481ac3\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf" Apr 20 22:29:30.479992 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.479822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8fe52c31-d1e3-4af9-92c0-4bae8f481ac3-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d8d569d47-4c2vf\" (UID: \"8fe52c31-d1e3-4af9-92c0-4bae8f481ac3\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf" Apr 20 22:29:30.479992 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.479845 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8fe52c31-d1e3-4af9-92c0-4bae8f481ac3-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d8d569d47-4c2vf\" (UID: \"8fe52c31-d1e3-4af9-92c0-4bae8f481ac3\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf" Apr 20 22:29:30.482426 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.482394 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8fe52c31-d1e3-4af9-92c0-4bae8f481ac3-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d8d569d47-4c2vf\" (UID: \"8fe52c31-d1e3-4af9-92c0-4bae8f481ac3\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf" Apr 20 22:29:30.482539 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.482434 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8fe52c31-d1e3-4af9-92c0-4bae8f481ac3-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d8d569d47-4c2vf\" (UID: \"8fe52c31-d1e3-4af9-92c0-4bae8f481ac3\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf" Apr 20 22:29:30.487878 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.487851 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdxkk\" (UniqueName: \"kubernetes.io/projected/8fe52c31-d1e3-4af9-92c0-4bae8f481ac3-kube-api-access-vdxkk\") pod \"opendatahub-operator-controller-manager-5d8d569d47-4c2vf\" (UID: \"8fe52c31-d1e3-4af9-92c0-4bae8f481ac3\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf" Apr 20 22:29:30.506817 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.506789 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf" Apr 20 22:29:30.653407 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.653379 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf"] Apr 20 22:29:30.655849 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:29:30.655807 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fe52c31_d1e3_4af9_92c0_4bae8f481ac3.slice/crio-6fec4fcc540c68cfbc0bdd98abf66486ef1cae3cf4a4e353750b62d288a3c916 WatchSource:0}: Error finding container 6fec4fcc540c68cfbc0bdd98abf66486ef1cae3cf4a4e353750b62d288a3c916: Status 404 returned error can't find the container with id 6fec4fcc540c68cfbc0bdd98abf66486ef1cae3cf4a4e353750b62d288a3c916 Apr 20 22:29:30.837116 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.837069 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5"] Apr 20 22:29:30.841703 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.841665 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" Apr 20 22:29:30.844125 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.844101 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 22:29:30.844526 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.844511 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 22:29:30.844622 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.844514 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 22:29:30.844622 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.844561 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 22:29:30.844622 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.844514 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 22:29:30.844622 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.844557 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-vdml4\"" Apr 20 22:29:30.848482 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.848459 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5"] Apr 20 22:29:30.889404 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.889378 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rp7bw_2c237e12-2748-4be2-8f88-258e6064ea33/ovn-acl-logging/0.log" Apr 20 22:29:30.889825 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.889804 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rp7bw_2c237e12-2748-4be2-8f88-258e6064ea33/ovn-acl-logging/0.log" Apr 20 22:29:30.893246 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.893225 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 22:29:30.983174 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.982964 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ffb2102-620d-4c58-87e1-76e2bc0cb75b-cert\") pod \"lws-controller-manager-845776cd66-9p9v5\" (UID: \"4ffb2102-620d-4c58-87e1-76e2bc0cb75b\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" Apr 20 22:29:30.983174 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.983043 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kc4w\" (UniqueName: \"kubernetes.io/projected/4ffb2102-620d-4c58-87e1-76e2bc0cb75b-kube-api-access-7kc4w\") pod \"lws-controller-manager-845776cd66-9p9v5\" (UID: \"4ffb2102-620d-4c58-87e1-76e2bc0cb75b\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" Apr 20 22:29:30.983174 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.983090 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/4ffb2102-620d-4c58-87e1-76e2bc0cb75b-metrics-cert\") pod \"lws-controller-manager-845776cd66-9p9v5\" (UID: \"4ffb2102-620d-4c58-87e1-76e2bc0cb75b\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" Apr 20 22:29:30.983174 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:30.983129 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4ffb2102-620d-4c58-87e1-76e2bc0cb75b-manager-config\") pod \"lws-controller-manager-845776cd66-9p9v5\" (UID: 
\"4ffb2102-620d-4c58-87e1-76e2bc0cb75b\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" Apr 20 22:29:31.045666 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:31.045633 2575 generic.go:358] "Generic (PLEG): container finished" podID="3f6ed4f0-8a6a-480d-bc09-3f176553ce09" containerID="803e9c85742283434bc4667af9308f9a6db0032859c007d9f65df28c5d3f3b8d" exitCode=0 Apr 20 22:29:31.046092 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:31.045712 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd" event={"ID":"3f6ed4f0-8a6a-480d-bc09-3f176553ce09","Type":"ContainerDied","Data":"803e9c85742283434bc4667af9308f9a6db0032859c007d9f65df28c5d3f3b8d"} Apr 20 22:29:31.049847 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:31.049803 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf" event={"ID":"8fe52c31-d1e3-4af9-92c0-4bae8f481ac3","Type":"ContainerStarted","Data":"6fec4fcc540c68cfbc0bdd98abf66486ef1cae3cf4a4e353750b62d288a3c916"} Apr 20 22:29:31.084117 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:31.084084 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ffb2102-620d-4c58-87e1-76e2bc0cb75b-cert\") pod \"lws-controller-manager-845776cd66-9p9v5\" (UID: \"4ffb2102-620d-4c58-87e1-76e2bc0cb75b\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" Apr 20 22:29:31.084117 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:31.084124 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kc4w\" (UniqueName: \"kubernetes.io/projected/4ffb2102-620d-4c58-87e1-76e2bc0cb75b-kube-api-access-7kc4w\") pod \"lws-controller-manager-845776cd66-9p9v5\" (UID: \"4ffb2102-620d-4c58-87e1-76e2bc0cb75b\") " 
pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" Apr 20 22:29:31.084334 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:31.084161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/4ffb2102-620d-4c58-87e1-76e2bc0cb75b-metrics-cert\") pod \"lws-controller-manager-845776cd66-9p9v5\" (UID: \"4ffb2102-620d-4c58-87e1-76e2bc0cb75b\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" Apr 20 22:29:31.084334 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:31.084191 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4ffb2102-620d-4c58-87e1-76e2bc0cb75b-manager-config\") pod \"lws-controller-manager-845776cd66-9p9v5\" (UID: \"4ffb2102-620d-4c58-87e1-76e2bc0cb75b\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" Apr 20 22:29:31.085118 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:31.085092 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4ffb2102-620d-4c58-87e1-76e2bc0cb75b-manager-config\") pod \"lws-controller-manager-845776cd66-9p9v5\" (UID: \"4ffb2102-620d-4c58-87e1-76e2bc0cb75b\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" Apr 20 22:29:31.086824 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:31.086799 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ffb2102-620d-4c58-87e1-76e2bc0cb75b-cert\") pod \"lws-controller-manager-845776cd66-9p9v5\" (UID: \"4ffb2102-620d-4c58-87e1-76e2bc0cb75b\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" Apr 20 22:29:31.086945 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:31.086892 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/4ffb2102-620d-4c58-87e1-76e2bc0cb75b-metrics-cert\") pod \"lws-controller-manager-845776cd66-9p9v5\" (UID: \"4ffb2102-620d-4c58-87e1-76e2bc0cb75b\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" Apr 20 22:29:31.093433 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:31.093397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kc4w\" (UniqueName: \"kubernetes.io/projected/4ffb2102-620d-4c58-87e1-76e2bc0cb75b-kube-api-access-7kc4w\") pod \"lws-controller-manager-845776cd66-9p9v5\" (UID: \"4ffb2102-620d-4c58-87e1-76e2bc0cb75b\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" Apr 20 22:29:31.151821 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:31.151777 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" Apr 20 22:29:31.290731 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:31.290694 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5"] Apr 20 22:29:31.294311 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:29:31.294280 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ffb2102_620d_4c58_87e1_76e2bc0cb75b.slice/crio-3e54c588b694e1c6b844703397d83c32ee10ab67a43e417a3713507d512685ad WatchSource:0}: Error finding container 3e54c588b694e1c6b844703397d83c32ee10ab67a43e417a3713507d512685ad: Status 404 returned error can't find the container with id 3e54c588b694e1c6b844703397d83c32ee10ab67a43e417a3713507d512685ad Apr 20 22:29:32.055754 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:32.055711 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" 
event={"ID":"4ffb2102-620d-4c58-87e1-76e2bc0cb75b","Type":"ContainerStarted","Data":"3e54c588b694e1c6b844703397d83c32ee10ab67a43e417a3713507d512685ad"} Apr 20 22:29:32.721012 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:32.720986 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd" Apr 20 22:29:32.901992 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:32.901956 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f6ed4f0-8a6a-480d-bc09-3f176553ce09-util\") pod \"3f6ed4f0-8a6a-480d-bc09-3f176553ce09\" (UID: \"3f6ed4f0-8a6a-480d-bc09-3f176553ce09\") " Apr 20 22:29:32.902173 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:32.902013 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f6ed4f0-8a6a-480d-bc09-3f176553ce09-bundle\") pod \"3f6ed4f0-8a6a-480d-bc09-3f176553ce09\" (UID: \"3f6ed4f0-8a6a-480d-bc09-3f176553ce09\") " Apr 20 22:29:32.902173 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:32.902054 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5qdh\" (UniqueName: \"kubernetes.io/projected/3f6ed4f0-8a6a-480d-bc09-3f176553ce09-kube-api-access-d5qdh\") pod \"3f6ed4f0-8a6a-480d-bc09-3f176553ce09\" (UID: \"3f6ed4f0-8a6a-480d-bc09-3f176553ce09\") " Apr 20 22:29:32.903464 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:32.903427 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f6ed4f0-8a6a-480d-bc09-3f176553ce09-bundle" (OuterVolumeSpecName: "bundle") pod "3f6ed4f0-8a6a-480d-bc09-3f176553ce09" (UID: "3f6ed4f0-8a6a-480d-bc09-3f176553ce09"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:29:32.905033 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:32.905000 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f6ed4f0-8a6a-480d-bc09-3f176553ce09-kube-api-access-d5qdh" (OuterVolumeSpecName: "kube-api-access-d5qdh") pod "3f6ed4f0-8a6a-480d-bc09-3f176553ce09" (UID: "3f6ed4f0-8a6a-480d-bc09-3f176553ce09"). InnerVolumeSpecName "kube-api-access-d5qdh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:29:32.909985 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:32.909941 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f6ed4f0-8a6a-480d-bc09-3f176553ce09-util" (OuterVolumeSpecName: "util") pod "3f6ed4f0-8a6a-480d-bc09-3f176553ce09" (UID: "3f6ed4f0-8a6a-480d-bc09-3f176553ce09"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:29:33.003143 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:33.003061 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f6ed4f0-8a6a-480d-bc09-3f176553ce09-util\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:29:33.003143 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:33.003102 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f6ed4f0-8a6a-480d-bc09-3f176553ce09-bundle\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:29:33.003143 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:33.003119 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d5qdh\" (UniqueName: \"kubernetes.io/projected/3f6ed4f0-8a6a-480d-bc09-3f176553ce09-kube-api-access-d5qdh\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:29:33.063069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:33.062464 2575 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd" Apr 20 22:29:33.063069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:33.062472 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xhjcd" event={"ID":"3f6ed4f0-8a6a-480d-bc09-3f176553ce09","Type":"ContainerDied","Data":"dd0a1d5ec4b07517857efa5d5b530b81a84d4b6ce3cbe0e45470d4cc1e9b9e87"} Apr 20 22:29:33.063069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:33.062507 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd0a1d5ec4b07517857efa5d5b530b81a84d4b6ce3cbe0e45470d4cc1e9b9e87" Apr 20 22:29:34.066807 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:34.066712 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf" event={"ID":"8fe52c31-d1e3-4af9-92c0-4bae8f481ac3","Type":"ContainerStarted","Data":"718d6605f8122455b41ec03e9ef333210ffb775760782ea7db0bf513261a457b"} Apr 20 22:29:34.067270 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:34.066836 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf" Apr 20 22:29:34.067993 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:34.067971 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" event={"ID":"4ffb2102-620d-4c58-87e1-76e2bc0cb75b","Type":"ContainerStarted","Data":"e2233663bf2e4bf93c3e6ce8a2aaf5aa0822dfd9661feb44f954a38279377018"} Apr 20 22:29:34.068128 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:34.068115 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" Apr 20 22:29:34.094008 ip-10-0-132-177 kubenswrapper[2575]: 
I0420 22:29:34.093954 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf" podStartSLOduration=1.3366533440000001 podStartE2EDuration="4.093938333s" podCreationTimestamp="2026-04-20 22:29:30 +0000 UTC" firstStartedPulling="2026-04-20 22:29:30.658018137 +0000 UTC m=+300.272174673" lastFinishedPulling="2026-04-20 22:29:33.415303123 +0000 UTC m=+303.029459662" observedRunningTime="2026-04-20 22:29:34.092476168 +0000 UTC m=+303.706632751" watchObservedRunningTime="2026-04-20 22:29:34.093938333 +0000 UTC m=+303.708094892" Apr 20 22:29:34.111998 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:34.111942 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" podStartSLOduration=1.619227261 podStartE2EDuration="4.111924974s" podCreationTimestamp="2026-04-20 22:29:30 +0000 UTC" firstStartedPulling="2026-04-20 22:29:31.297079647 +0000 UTC m=+300.911236186" lastFinishedPulling="2026-04-20 22:29:33.789777361 +0000 UTC m=+303.403933899" observedRunningTime="2026-04-20 22:29:34.109888385 +0000 UTC m=+303.724044946" watchObservedRunningTime="2026-04-20 22:29:34.111924974 +0000 UTC m=+303.726081534" Apr 20 22:29:45.073800 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:45.073769 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-845776cd66-9p9v5" Apr 20 22:29:45.074174 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:45.073923 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-4c2vf" Apr 20 22:29:47.728101 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.728061 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll"] Apr 20 22:29:47.728566 
ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.728469 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f6ed4f0-8a6a-480d-bc09-3f176553ce09" containerName="pull" Apr 20 22:29:47.728566 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.728486 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6ed4f0-8a6a-480d-bc09-3f176553ce09" containerName="pull" Apr 20 22:29:47.728566 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.728505 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f6ed4f0-8a6a-480d-bc09-3f176553ce09" containerName="extract" Apr 20 22:29:47.728566 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.728513 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6ed4f0-8a6a-480d-bc09-3f176553ce09" containerName="extract" Apr 20 22:29:47.728566 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.728530 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f6ed4f0-8a6a-480d-bc09-3f176553ce09" containerName="util" Apr 20 22:29:47.728566 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.728539 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6ed4f0-8a6a-480d-bc09-3f176553ce09" containerName="util" Apr 20 22:29:47.728901 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.728599 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f6ed4f0-8a6a-480d-bc09-3f176553ce09" containerName="extract" Apr 20 22:29:47.730746 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.730725 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll" Apr 20 22:29:47.733808 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.733786 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 22:29:47.734915 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.734895 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 22:29:47.735005 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.734895 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-4psch\"" Apr 20 22:29:47.740291 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.740259 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll"] Apr 20 22:29:47.821062 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.821015 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdbht\" (UniqueName: \"kubernetes.io/projected/865ed3ce-9225-440c-97de-831e897d9f30-kube-api-access-jdbht\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll\" (UID: \"865ed3ce-9225-440c-97de-831e897d9f30\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll" Apr 20 22:29:47.821062 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.821065 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/865ed3ce-9225-440c-97de-831e897d9f30-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll\" (UID: \"865ed3ce-9225-440c-97de-831e897d9f30\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll" Apr 20 22:29:47.821277 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.821131 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/865ed3ce-9225-440c-97de-831e897d9f30-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll\" (UID: \"865ed3ce-9225-440c-97de-831e897d9f30\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll" Apr 20 22:29:47.922582 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.922537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdbht\" (UniqueName: \"kubernetes.io/projected/865ed3ce-9225-440c-97de-831e897d9f30-kube-api-access-jdbht\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll\" (UID: \"865ed3ce-9225-440c-97de-831e897d9f30\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll" Apr 20 22:29:47.922582 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.922586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/865ed3ce-9225-440c-97de-831e897d9f30-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll\" (UID: \"865ed3ce-9225-440c-97de-831e897d9f30\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll" Apr 20 22:29:47.922835 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.922724 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/865ed3ce-9225-440c-97de-831e897d9f30-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll\" (UID: \"865ed3ce-9225-440c-97de-831e897d9f30\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll" Apr 20 22:29:47.922983 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.922965 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/865ed3ce-9225-440c-97de-831e897d9f30-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll\" (UID: \"865ed3ce-9225-440c-97de-831e897d9f30\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll" Apr 20 22:29:47.923074 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.923056 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/865ed3ce-9225-440c-97de-831e897d9f30-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll\" (UID: \"865ed3ce-9225-440c-97de-831e897d9f30\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll" Apr 20 22:29:47.931335 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:47.931311 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdbht\" (UniqueName: \"kubernetes.io/projected/865ed3ce-9225-440c-97de-831e897d9f30-kube-api-access-jdbht\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll\" (UID: \"865ed3ce-9225-440c-97de-831e897d9f30\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll" Apr 20 22:29:48.040388 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:48.040294 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll"
Apr 20 22:29:48.176031 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:48.175872 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll"]
Apr 20 22:29:48.178262 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:29:48.178231 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod865ed3ce_9225_440c_97de_831e897d9f30.slice/crio-6ea5fd179eb1defacd5f576a5d252f570061bf7065dfc9f5125a569db6ea7208 WatchSource:0}: Error finding container 6ea5fd179eb1defacd5f576a5d252f570061bf7065dfc9f5125a569db6ea7208: Status 404 returned error can't find the container with id 6ea5fd179eb1defacd5f576a5d252f570061bf7065dfc9f5125a569db6ea7208
Apr 20 22:29:48.180034 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:48.180019 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 22:29:49.116865 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:49.116831 2575 generic.go:358] "Generic (PLEG): container finished" podID="865ed3ce-9225-440c-97de-831e897d9f30" containerID="62fe383ffa82d1d169530bc689d7daa9b1f94e1cded69886deea09e679224ba0" exitCode=0
Apr 20 22:29:49.117255 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:49.116916 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll" event={"ID":"865ed3ce-9225-440c-97de-831e897d9f30","Type":"ContainerDied","Data":"62fe383ffa82d1d169530bc689d7daa9b1f94e1cded69886deea09e679224ba0"}
Apr 20 22:29:49.117255 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:49.116945 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll" event={"ID":"865ed3ce-9225-440c-97de-831e897d9f30","Type":"ContainerStarted","Data":"6ea5fd179eb1defacd5f576a5d252f570061bf7065dfc9f5125a569db6ea7208"}
Apr 20 22:29:50.121349 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:50.121311 2575 generic.go:358] "Generic (PLEG): container finished" podID="865ed3ce-9225-440c-97de-831e897d9f30" containerID="c24870ded9ed702facd5fa06479df4d33bc5a791fdb60d46a4c38147359336ab" exitCode=0
Apr 20 22:29:50.121853 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:50.121398 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll" event={"ID":"865ed3ce-9225-440c-97de-831e897d9f30","Type":"ContainerDied","Data":"c24870ded9ed702facd5fa06479df4d33bc5a791fdb60d46a4c38147359336ab"}
Apr 20 22:29:50.894653 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:50.894619 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-cbn82"]
Apr 20 22:29:50.896847 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:50.896830 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-cbn82"
Apr 20 22:29:50.899182 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:50.899158 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 20 22:29:50.899305 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:50.899282 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-v6mds\""
Apr 20 22:29:50.906873 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:50.906842 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-cbn82"]
Apr 20 22:29:50.945977 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:50.945944 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1739d41-72c6-4c98-9382-37a93d872743-cert\") pod \"odh-model-controller-858dbf95b8-cbn82\" (UID: \"b1739d41-72c6-4c98-9382-37a93d872743\") " pod="opendatahub/odh-model-controller-858dbf95b8-cbn82"
Apr 20 22:29:50.945977 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:50.945979 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rmk2\" (UniqueName: \"kubernetes.io/projected/b1739d41-72c6-4c98-9382-37a93d872743-kube-api-access-5rmk2\") pod \"odh-model-controller-858dbf95b8-cbn82\" (UID: \"b1739d41-72c6-4c98-9382-37a93d872743\") " pod="opendatahub/odh-model-controller-858dbf95b8-cbn82"
Apr 20 22:29:51.046832 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:51.046786 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rmk2\" (UniqueName: \"kubernetes.io/projected/b1739d41-72c6-4c98-9382-37a93d872743-kube-api-access-5rmk2\") pod \"odh-model-controller-858dbf95b8-cbn82\" (UID: \"b1739d41-72c6-4c98-9382-37a93d872743\") " pod="opendatahub/odh-model-controller-858dbf95b8-cbn82"
Apr 20 22:29:51.047022 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:51.046912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1739d41-72c6-4c98-9382-37a93d872743-cert\") pod \"odh-model-controller-858dbf95b8-cbn82\" (UID: \"b1739d41-72c6-4c98-9382-37a93d872743\") " pod="opendatahub/odh-model-controller-858dbf95b8-cbn82"
Apr 20 22:29:51.047022 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:29:51.047009 2575 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 20 22:29:51.047091 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:29:51.047068 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1739d41-72c6-4c98-9382-37a93d872743-cert podName:b1739d41-72c6-4c98-9382-37a93d872743 nodeName:}" failed. No retries permitted until 2026-04-20 22:29:51.54705024 +0000 UTC m=+321.161206778 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b1739d41-72c6-4c98-9382-37a93d872743-cert") pod "odh-model-controller-858dbf95b8-cbn82" (UID: "b1739d41-72c6-4c98-9382-37a93d872743") : secret "odh-model-controller-webhook-cert" not found
Apr 20 22:29:51.061397 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:51.061363 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rmk2\" (UniqueName: \"kubernetes.io/projected/b1739d41-72c6-4c98-9382-37a93d872743-kube-api-access-5rmk2\") pod \"odh-model-controller-858dbf95b8-cbn82\" (UID: \"b1739d41-72c6-4c98-9382-37a93d872743\") " pod="opendatahub/odh-model-controller-858dbf95b8-cbn82"
Apr 20 22:29:51.126694 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:51.126643 2575 generic.go:358] "Generic (PLEG): container finished" podID="865ed3ce-9225-440c-97de-831e897d9f30" containerID="92e9569f80912de480cee2add27e3ad23f9c5edeb4f7f64c97aecc1307f5e0a0" exitCode=0
Apr 20 22:29:51.127069 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:51.126705 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll" event={"ID":"865ed3ce-9225-440c-97de-831e897d9f30","Type":"ContainerDied","Data":"92e9569f80912de480cee2add27e3ad23f9c5edeb4f7f64c97aecc1307f5e0a0"}
Apr 20 22:29:51.550940 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:51.550899 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1739d41-72c6-4c98-9382-37a93d872743-cert\") pod \"odh-model-controller-858dbf95b8-cbn82\" (UID: \"b1739d41-72c6-4c98-9382-37a93d872743\") " pod="opendatahub/odh-model-controller-858dbf95b8-cbn82"
Apr 20 22:29:51.551134 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:29:51.551022 2575 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 20 22:29:51.551134 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:29:51.551076 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1739d41-72c6-4c98-9382-37a93d872743-cert podName:b1739d41-72c6-4c98-9382-37a93d872743 nodeName:}" failed. No retries permitted until 2026-04-20 22:29:52.551061443 +0000 UTC m=+322.165217981 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b1739d41-72c6-4c98-9382-37a93d872743-cert") pod "odh-model-controller-858dbf95b8-cbn82" (UID: "b1739d41-72c6-4c98-9382-37a93d872743") : secret "odh-model-controller-webhook-cert" not found
Apr 20 22:29:52.251972 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:52.251947 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll"
Apr 20 22:29:52.358091 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:52.358047 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdbht\" (UniqueName: \"kubernetes.io/projected/865ed3ce-9225-440c-97de-831e897d9f30-kube-api-access-jdbht\") pod \"865ed3ce-9225-440c-97de-831e897d9f30\" (UID: \"865ed3ce-9225-440c-97de-831e897d9f30\") "
Apr 20 22:29:52.358091 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:52.358099 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/865ed3ce-9225-440c-97de-831e897d9f30-util\") pod \"865ed3ce-9225-440c-97de-831e897d9f30\" (UID: \"865ed3ce-9225-440c-97de-831e897d9f30\") "
Apr 20 22:29:52.358323 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:52.358148 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/865ed3ce-9225-440c-97de-831e897d9f30-bundle\") pod \"865ed3ce-9225-440c-97de-831e897d9f30\" (UID: \"865ed3ce-9225-440c-97de-831e897d9f30\") "
Apr 20 22:29:52.359150 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:52.359119 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/865ed3ce-9225-440c-97de-831e897d9f30-bundle" (OuterVolumeSpecName: "bundle") pod "865ed3ce-9225-440c-97de-831e897d9f30" (UID: "865ed3ce-9225-440c-97de-831e897d9f30"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 22:29:52.360329 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:52.360303 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/865ed3ce-9225-440c-97de-831e897d9f30-kube-api-access-jdbht" (OuterVolumeSpecName: "kube-api-access-jdbht") pod "865ed3ce-9225-440c-97de-831e897d9f30" (UID: "865ed3ce-9225-440c-97de-831e897d9f30"). InnerVolumeSpecName "kube-api-access-jdbht". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 22:29:52.366274 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:52.366244 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/865ed3ce-9225-440c-97de-831e897d9f30-util" (OuterVolumeSpecName: "util") pod "865ed3ce-9225-440c-97de-831e897d9f30" (UID: "865ed3ce-9225-440c-97de-831e897d9f30"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 22:29:52.459528 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:52.459436 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/865ed3ce-9225-440c-97de-831e897d9f30-bundle\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:29:52.459528 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:52.459465 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jdbht\" (UniqueName: \"kubernetes.io/projected/865ed3ce-9225-440c-97de-831e897d9f30-kube-api-access-jdbht\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:29:52.459528 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:52.459475 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/865ed3ce-9225-440c-97de-831e897d9f30-util\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:29:52.560216 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:52.560174 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1739d41-72c6-4c98-9382-37a93d872743-cert\") pod \"odh-model-controller-858dbf95b8-cbn82\" (UID: \"b1739d41-72c6-4c98-9382-37a93d872743\") " pod="opendatahub/odh-model-controller-858dbf95b8-cbn82"
Apr 20 22:29:52.562788 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:52.562758 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1739d41-72c6-4c98-9382-37a93d872743-cert\") pod \"odh-model-controller-858dbf95b8-cbn82\" (UID: \"b1739d41-72c6-4c98-9382-37a93d872743\") " pod="opendatahub/odh-model-controller-858dbf95b8-cbn82"
Apr 20 22:29:52.708111 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:52.708071 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-cbn82"
Apr 20 22:29:52.830572 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:52.830549 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-cbn82"]
Apr 20 22:29:52.832658 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:29:52.832633 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1739d41_72c6_4c98_9382_37a93d872743.slice/crio-c75105ba74dada4ff882931404c9f5f919a5102b5cb031e9212ff474dc49610d WatchSource:0}: Error finding container c75105ba74dada4ff882931404c9f5f919a5102b5cb031e9212ff474dc49610d: Status 404 returned error can't find the container with id c75105ba74dada4ff882931404c9f5f919a5102b5cb031e9212ff474dc49610d
Apr 20 22:29:53.137164 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:53.137132 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll" event={"ID":"865ed3ce-9225-440c-97de-831e897d9f30","Type":"ContainerDied","Data":"6ea5fd179eb1defacd5f576a5d252f570061bf7065dfc9f5125a569db6ea7208"}
Apr 20 22:29:53.137164 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:53.137165 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ea5fd179eb1defacd5f576a5d252f570061bf7065dfc9f5125a569db6ea7208"
Apr 20 22:29:53.137417 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:53.137168 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nhlll"
Apr 20 22:29:53.138304 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:53.138279 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-cbn82" event={"ID":"b1739d41-72c6-4c98-9382-37a93d872743","Type":"ContainerStarted","Data":"c75105ba74dada4ff882931404c9f5f919a5102b5cb031e9212ff474dc49610d"}
Apr 20 22:29:56.153244 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:56.153209 2575 generic.go:358] "Generic (PLEG): container finished" podID="b1739d41-72c6-4c98-9382-37a93d872743" containerID="228ffb8b701b3585620e198c9a911e780106bcf8555fbedbaa62ff33bdb52da0" exitCode=1
Apr 20 22:29:56.153745 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:56.153259 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-cbn82" event={"ID":"b1739d41-72c6-4c98-9382-37a93d872743","Type":"ContainerDied","Data":"228ffb8b701b3585620e198c9a911e780106bcf8555fbedbaa62ff33bdb52da0"}
Apr 20 22:29:56.153745 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:56.153569 2575 scope.go:117] "RemoveContainer" containerID="228ffb8b701b3585620e198c9a911e780106bcf8555fbedbaa62ff33bdb52da0"
Apr 20 22:29:57.158835 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:57.158798 2575 generic.go:358] "Generic (PLEG): container finished" podID="b1739d41-72c6-4c98-9382-37a93d872743" containerID="3d5556d1c82241240f46ba790f926f959e2ed4f9919434d9b5d759189ea117b7" exitCode=1
Apr 20 22:29:57.159505 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:57.158901 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-cbn82" event={"ID":"b1739d41-72c6-4c98-9382-37a93d872743","Type":"ContainerDied","Data":"3d5556d1c82241240f46ba790f926f959e2ed4f9919434d9b5d759189ea117b7"}
Apr 20 22:29:57.159505 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:57.158984 2575 scope.go:117] "RemoveContainer" containerID="228ffb8b701b3585620e198c9a911e780106bcf8555fbedbaa62ff33bdb52da0"
Apr 20 22:29:57.159505 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:57.159185 2575 scope.go:117] "RemoveContainer" containerID="3d5556d1c82241240f46ba790f926f959e2ed4f9919434d9b5d759189ea117b7"
Apr 20 22:29:57.159505 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:29:57.159408 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-cbn82_opendatahub(b1739d41-72c6-4c98-9382-37a93d872743)\"" pod="opendatahub/odh-model-controller-858dbf95b8-cbn82" podUID="b1739d41-72c6-4c98-9382-37a93d872743"
Apr 20 22:29:58.163744 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:29:58.163717 2575 scope.go:117] "RemoveContainer" containerID="3d5556d1c82241240f46ba790f926f959e2ed4f9919434d9b5d759189ea117b7"
Apr 20 22:29:58.164082 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:29:58.163906 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-cbn82_opendatahub(b1739d41-72c6-4c98-9382-37a93d872743)\"" pod="opendatahub/odh-model-controller-858dbf95b8-cbn82" podUID="b1739d41-72c6-4c98-9382-37a93d872743"
Apr 20 22:30:02.029418 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.029382 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf"]
Apr 20 22:30:02.029927 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.029706 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="865ed3ce-9225-440c-97de-831e897d9f30" containerName="extract"
Apr 20 22:30:02.029927 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.029723 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="865ed3ce-9225-440c-97de-831e897d9f30" containerName="extract"
Apr 20 22:30:02.029927 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.029739 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="865ed3ce-9225-440c-97de-831e897d9f30" containerName="pull"
Apr 20 22:30:02.029927 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.029747 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="865ed3ce-9225-440c-97de-831e897d9f30" containerName="pull"
Apr 20 22:30:02.029927 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.029758 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="865ed3ce-9225-440c-97de-831e897d9f30" containerName="util"
Apr 20 22:30:02.029927 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.029764 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="865ed3ce-9225-440c-97de-831e897d9f30" containerName="util"
Apr 20 22:30:02.029927 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.029822 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="865ed3ce-9225-440c-97de-831e897d9f30" containerName="extract"
Apr 20 22:30:02.032407 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.032391 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf"
Apr 20 22:30:02.039152 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.039127 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 22:30:02.039284 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.039178 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 22:30:02.040288 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.040270 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-4psch\""
Apr 20 22:30:02.071343 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.071311 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf"]
Apr 20 22:30:02.145407 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.145369 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39559733-e85d-4572-ad6c-94d44f5cb2bd-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf\" (UID: \"39559733-e85d-4572-ad6c-94d44f5cb2bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf"
Apr 20 22:30:02.145589 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.145432 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39559733-e85d-4572-ad6c-94d44f5cb2bd-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf\" (UID: \"39559733-e85d-4572-ad6c-94d44f5cb2bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf"
Apr 20 22:30:02.145589 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.145499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbnzb\" (UniqueName: \"kubernetes.io/projected/39559733-e85d-4572-ad6c-94d44f5cb2bd-kube-api-access-xbnzb\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf\" (UID: \"39559733-e85d-4572-ad6c-94d44f5cb2bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf"
Apr 20 22:30:02.246007 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.245969 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39559733-e85d-4572-ad6c-94d44f5cb2bd-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf\" (UID: \"39559733-e85d-4572-ad6c-94d44f5cb2bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf"
Apr 20 22:30:02.246177 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.246048 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39559733-e85d-4572-ad6c-94d44f5cb2bd-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf\" (UID: \"39559733-e85d-4572-ad6c-94d44f5cb2bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf"
Apr 20 22:30:02.246177 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.246073 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbnzb\" (UniqueName: \"kubernetes.io/projected/39559733-e85d-4572-ad6c-94d44f5cb2bd-kube-api-access-xbnzb\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf\" (UID: \"39559733-e85d-4572-ad6c-94d44f5cb2bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf"
Apr 20 22:30:02.246403 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.246385 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39559733-e85d-4572-ad6c-94d44f5cb2bd-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf\" (UID: \"39559733-e85d-4572-ad6c-94d44f5cb2bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf"
Apr 20 22:30:02.246448 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.246411 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39559733-e85d-4572-ad6c-94d44f5cb2bd-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf\" (UID: \"39559733-e85d-4572-ad6c-94d44f5cb2bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf"
Apr 20 22:30:02.262959 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.262921 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbnzb\" (UniqueName: \"kubernetes.io/projected/39559733-e85d-4572-ad6c-94d44f5cb2bd-kube-api-access-xbnzb\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf\" (UID: \"39559733-e85d-4572-ad6c-94d44f5cb2bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf"
Apr 20 22:30:02.341117 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.341023 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf"
Apr 20 22:30:02.502292 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:30:02.502253 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39559733_e85d_4572_ad6c_94d44f5cb2bd.slice/crio-6e968aecb6168c4797a14311105a4897f5b8c3b4609d92b9fc9e68373141145d WatchSource:0}: Error finding container 6e968aecb6168c4797a14311105a4897f5b8c3b4609d92b9fc9e68373141145d: Status 404 returned error can't find the container with id 6e968aecb6168c4797a14311105a4897f5b8c3b4609d92b9fc9e68373141145d
Apr 20 22:30:02.504984 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.504960 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf"]
Apr 20 22:30:02.708885 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.708842 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-cbn82"
Apr 20 22:30:02.709283 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:02.709270 2575 scope.go:117] "RemoveContainer" containerID="3d5556d1c82241240f46ba790f926f959e2ed4f9919434d9b5d759189ea117b7"
Apr 20 22:30:02.709469 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:30:02.709450 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-cbn82_opendatahub(b1739d41-72c6-4c98-9382-37a93d872743)\"" pod="opendatahub/odh-model-controller-858dbf95b8-cbn82" podUID="b1739d41-72c6-4c98-9382-37a93d872743"
Apr 20 22:30:03.181287 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:03.181256 2575 generic.go:358] "Generic (PLEG): container finished" podID="39559733-e85d-4572-ad6c-94d44f5cb2bd" containerID="0e33cd2a226f5009545cb046dba74539a03b5b20eb66e60c8a0f433c020d8a27" exitCode=0
Apr 20 22:30:03.181754 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:03.181353 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf" event={"ID":"39559733-e85d-4572-ad6c-94d44f5cb2bd","Type":"ContainerDied","Data":"0e33cd2a226f5009545cb046dba74539a03b5b20eb66e60c8a0f433c020d8a27"}
Apr 20 22:30:03.181754 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:03.181389 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf" event={"ID":"39559733-e85d-4572-ad6c-94d44f5cb2bd","Type":"ContainerStarted","Data":"6e968aecb6168c4797a14311105a4897f5b8c3b4609d92b9fc9e68373141145d"}
Apr 20 22:30:08.199371 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:08.199337 2575 generic.go:358] "Generic (PLEG): container finished" podID="39559733-e85d-4572-ad6c-94d44f5cb2bd" containerID="fdc0eadc1f67a9c41705ee37c55d0d6c4e949d71147a5e4fddaae154f138ae42" exitCode=0
Apr 20 22:30:08.199774 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:08.199385 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf" event={"ID":"39559733-e85d-4572-ad6c-94d44f5cb2bd","Type":"ContainerDied","Data":"fdc0eadc1f67a9c41705ee37c55d0d6c4e949d71147a5e4fddaae154f138ae42"}
Apr 20 22:30:09.203775 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:09.203740 2575 generic.go:358] "Generic (PLEG): container finished" podID="39559733-e85d-4572-ad6c-94d44f5cb2bd" containerID="5e862f486f1775c10bdc6f4426f3d42dcbfd57ba8d6c6f19bcd45899c323f6a0" exitCode=0
Apr 20 22:30:09.204136 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:09.203831 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf" event={"ID":"39559733-e85d-4572-ad6c-94d44f5cb2bd","Type":"ContainerDied","Data":"5e862f486f1775c10bdc6f4426f3d42dcbfd57ba8d6c6f19bcd45899c323f6a0"}
Apr 20 22:30:10.332988 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:10.332964 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf"
Apr 20 22:30:10.414006 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:10.413959 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39559733-e85d-4572-ad6c-94d44f5cb2bd-bundle\") pod \"39559733-e85d-4572-ad6c-94d44f5cb2bd\" (UID: \"39559733-e85d-4572-ad6c-94d44f5cb2bd\") "
Apr 20 22:30:10.414201 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:10.414021 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39559733-e85d-4572-ad6c-94d44f5cb2bd-util\") pod \"39559733-e85d-4572-ad6c-94d44f5cb2bd\" (UID: \"39559733-e85d-4572-ad6c-94d44f5cb2bd\") "
Apr 20 22:30:10.414201 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:10.414090 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbnzb\" (UniqueName: \"kubernetes.io/projected/39559733-e85d-4572-ad6c-94d44f5cb2bd-kube-api-access-xbnzb\") pod \"39559733-e85d-4572-ad6c-94d44f5cb2bd\" (UID: \"39559733-e85d-4572-ad6c-94d44f5cb2bd\") "
Apr 20 22:30:10.414981 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:10.414955 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39559733-e85d-4572-ad6c-94d44f5cb2bd-bundle" (OuterVolumeSpecName: "bundle") pod "39559733-e85d-4572-ad6c-94d44f5cb2bd" (UID: "39559733-e85d-4572-ad6c-94d44f5cb2bd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 22:30:10.416344 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:10.416308 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39559733-e85d-4572-ad6c-94d44f5cb2bd-kube-api-access-xbnzb" (OuterVolumeSpecName: "kube-api-access-xbnzb") pod "39559733-e85d-4572-ad6c-94d44f5cb2bd" (UID: "39559733-e85d-4572-ad6c-94d44f5cb2bd"). InnerVolumeSpecName "kube-api-access-xbnzb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 22:30:10.418930 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:10.418910 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39559733-e85d-4572-ad6c-94d44f5cb2bd-util" (OuterVolumeSpecName: "util") pod "39559733-e85d-4572-ad6c-94d44f5cb2bd" (UID: "39559733-e85d-4572-ad6c-94d44f5cb2bd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 22:30:10.515234 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:10.515136 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39559733-e85d-4572-ad6c-94d44f5cb2bd-bundle\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:30:10.515234 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:10.515170 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39559733-e85d-4572-ad6c-94d44f5cb2bd-util\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:30:10.515234 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:10.515181 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xbnzb\" (UniqueName: \"kubernetes.io/projected/39559733-e85d-4572-ad6c-94d44f5cb2bd-kube-api-access-xbnzb\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:30:11.212374 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:11.212331 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf" event={"ID":"39559733-e85d-4572-ad6c-94d44f5cb2bd","Type":"ContainerDied","Data":"6e968aecb6168c4797a14311105a4897f5b8c3b4609d92b9fc9e68373141145d"}
Apr 20 22:30:11.212374 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:11.212366 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e968aecb6168c4797a14311105a4897f5b8c3b4609d92b9fc9e68373141145d"
Apr 20 22:30:11.212374 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:11.212376 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29wmgf"
Apr 20 22:30:12.708798 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:12.708758 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-cbn82"
Apr 20 22:30:12.709192 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:12.709177 2575 scope.go:117] "RemoveContainer" containerID="3d5556d1c82241240f46ba790f926f959e2ed4f9919434d9b5d759189ea117b7"
Apr 20 22:30:13.219950 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:13.219860 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-cbn82" event={"ID":"b1739d41-72c6-4c98-9382-37a93d872743","Type":"ContainerStarted","Data":"81f4cca3416f2fcfc821abbd79a1ca16b4e8660251b94e630394a850c03212b9"}
Apr 20 22:30:13.220112 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:13.220067 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-cbn82"
Apr 20 22:30:13.239885 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:13.239832 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-cbn82" podStartSLOduration=3.109164909 podStartE2EDuration="23.239817925s" podCreationTimestamp="2026-04-20 22:29:50 +0000 UTC" firstStartedPulling="2026-04-20 22:29:52.833981002 +0000 UTC m=+322.448137539" lastFinishedPulling="2026-04-20 22:30:12.964634004 +0000 UTC m=+342.578790555" observedRunningTime="2026-04-20 22:30:13.237724107 +0000 UTC m=+342.851880666" watchObservedRunningTime="2026-04-20 22:30:13.239817925 +0000 UTC m=+342.853974484"
Apr 20 22:30:24.226356 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:24.226323 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-cbn82"
Apr 20 22:30:34.308730 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.308684 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk"]
Apr 20 22:30:34.309223 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.309029 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39559733-e85d-4572-ad6c-94d44f5cb2bd" containerName="util"
Apr 20 22:30:34.309223 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.309043 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="39559733-e85d-4572-ad6c-94d44f5cb2bd" containerName="util"
Apr 20 22:30:34.309223 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.309055 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39559733-e85d-4572-ad6c-94d44f5cb2bd" containerName="extract"
Apr 20 22:30:34.309223 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.309062 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="39559733-e85d-4572-ad6c-94d44f5cb2bd" containerName="extract"
Apr 20 22:30:34.309223 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.309080 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39559733-e85d-4572-ad6c-94d44f5cb2bd" containerName="pull"
Apr 20 22:30:34.309223 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.309085 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="39559733-e85d-4572-ad6c-94d44f5cb2bd" containerName="pull"
Apr 20 22:30:34.309223 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.309135 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="39559733-e85d-4572-ad6c-94d44f5cb2bd" containerName="extract"
Apr 20 22:30:34.313763 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.313740 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk"
Apr 20 22:30:34.318058 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.318034 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 20 22:30:34.318058 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.318043 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 20 22:30:34.318242 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.318043 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 20 22:30:34.318407 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.318392 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 20 22:30:34.319085 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.319068 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 20 22:30:34.319170 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.319147 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-jgtbx\""
Apr 20 22:30:34.319296 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.319280 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap"
reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 22:30:34.338667 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.338635 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk"] Apr 20 22:30:34.415074 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.415036 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/39218be9-64cc-4a56-b915-5c794ceaace0-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.415074 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.415084 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t556\" (UniqueName: \"kubernetes.io/projected/39218be9-64cc-4a56-b915-5c794ceaace0-kube-api-access-9t556\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.415310 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.415107 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/39218be9-64cc-4a56-b915-5c794ceaace0-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.415310 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.415141 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/39218be9-64cc-4a56-b915-5c794ceaace0-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.415310 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.415167 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/39218be9-64cc-4a56-b915-5c794ceaace0-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.415310 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.415193 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/39218be9-64cc-4a56-b915-5c794ceaace0-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.415310 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.415276 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/39218be9-64cc-4a56-b915-5c794ceaace0-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.516110 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.516065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/39218be9-64cc-4a56-b915-5c794ceaace0-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.516110 
ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.516110 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9t556\" (UniqueName: \"kubernetes.io/projected/39218be9-64cc-4a56-b915-5c794ceaace0-kube-api-access-9t556\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.516325 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.516132 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/39218be9-64cc-4a56-b915-5c794ceaace0-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.516325 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.516156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/39218be9-64cc-4a56-b915-5c794ceaace0-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.516325 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.516178 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/39218be9-64cc-4a56-b915-5c794ceaace0-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.516325 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.516215 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/39218be9-64cc-4a56-b915-5c794ceaace0-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.516325 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.516270 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/39218be9-64cc-4a56-b915-5c794ceaace0-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.517126 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.516931 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/39218be9-64cc-4a56-b915-5c794ceaace0-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.518850 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.518820 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/39218be9-64cc-4a56-b915-5c794ceaace0-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.518966 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.518869 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/39218be9-64cc-4a56-b915-5c794ceaace0-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 
22:30:34.518966 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.518909 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/39218be9-64cc-4a56-b915-5c794ceaace0-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.518966 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.518950 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/39218be9-64cc-4a56-b915-5c794ceaace0-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.530976 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.530939 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/39218be9-64cc-4a56-b915-5c794ceaace0-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.531166 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.531145 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t556\" (UniqueName: \"kubernetes.io/projected/39218be9-64cc-4a56-b915-5c794ceaace0-kube-api-access-9t556\") pod \"istiod-openshift-gateway-55ff986f96-d5hdk\" (UID: \"39218be9-64cc-4a56-b915-5c794ceaace0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.623335 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.623292 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:34.773907 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:34.773868 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk"] Apr 20 22:30:34.777978 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:30:34.777943 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39218be9_64cc_4a56_b915_5c794ceaace0.slice/crio-4876279c50ed4976caf907c09d600fa14a4634ac5482567a5bf67ce4283b12fa WatchSource:0}: Error finding container 4876279c50ed4976caf907c09d600fa14a4634ac5482567a5bf67ce4283b12fa: Status 404 returned error can't find the container with id 4876279c50ed4976caf907c09d600fa14a4634ac5482567a5bf67ce4283b12fa Apr 20 22:30:35.297077 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:35.297040 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" event={"ID":"39218be9-64cc-4a56-b915-5c794ceaace0","Type":"ContainerStarted","Data":"4876279c50ed4976caf907c09d600fa14a4634ac5482567a5bf67ce4283b12fa"} Apr 20 22:30:37.156450 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:37.156401 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 20 22:30:37.156759 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:37.156493 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 20 22:30:37.306136 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:37.306094 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" 
event={"ID":"39218be9-64cc-4a56-b915-5c794ceaace0","Type":"ContainerStarted","Data":"54696e807044ef7f24faf3d5f7405f7ce5d7ceab9dc98c309a8d93f0265df8a7"} Apr 20 22:30:37.306321 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:37.306298 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:30:37.335808 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:37.335744 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" podStartSLOduration=0.96196294 podStartE2EDuration="3.335724038s" podCreationTimestamp="2026-04-20 22:30:34 +0000 UTC" firstStartedPulling="2026-04-20 22:30:34.782346486 +0000 UTC m=+364.396503023" lastFinishedPulling="2026-04-20 22:30:37.15610758 +0000 UTC m=+366.770264121" observedRunningTime="2026-04-20 22:30:37.333438473 +0000 UTC m=+366.947595031" watchObservedRunningTime="2026-04-20 22:30:37.335724038 +0000 UTC m=+366.949880597" Apr 20 22:30:38.311589 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:30:38.311552 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-d5hdk" Apr 20 22:31:04.570124 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.570083 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj"] Apr 20 22:31:04.572771 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.572752 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj" Apr 20 22:31:04.575979 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.575954 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 22:31:04.575979 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.575971 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-9r985\"" Apr 20 22:31:04.577253 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.577227 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 22:31:04.583183 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.583158 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj"] Apr 20 22:31:04.678913 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.678883 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8401295-6525-4913-9416-3434209df386-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj\" (UID: \"f8401295-6525-4913-9416-3434209df386\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj" Apr 20 22:31:04.679071 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.678919 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7w5g\" (UniqueName: \"kubernetes.io/projected/f8401295-6525-4913-9416-3434209df386-kube-api-access-m7w5g\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj\" (UID: \"f8401295-6525-4913-9416-3434209df386\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj" Apr 20 22:31:04.679071 
ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.678970 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8401295-6525-4913-9416-3434209df386-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj\" (UID: \"f8401295-6525-4913-9416-3434209df386\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj" Apr 20 22:31:04.779826 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.779783 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8401295-6525-4913-9416-3434209df386-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj\" (UID: \"f8401295-6525-4913-9416-3434209df386\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj" Apr 20 22:31:04.780020 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.779831 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7w5g\" (UniqueName: \"kubernetes.io/projected/f8401295-6525-4913-9416-3434209df386-kube-api-access-m7w5g\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj\" (UID: \"f8401295-6525-4913-9416-3434209df386\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj" Apr 20 22:31:04.780020 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.779879 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8401295-6525-4913-9416-3434209df386-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj\" (UID: \"f8401295-6525-4913-9416-3434209df386\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj" Apr 20 22:31:04.780202 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.780154 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8401295-6525-4913-9416-3434209df386-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj\" (UID: \"f8401295-6525-4913-9416-3434209df386\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj" Apr 20 22:31:04.780265 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.780245 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8401295-6525-4913-9416-3434209df386-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj\" (UID: \"f8401295-6525-4913-9416-3434209df386\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj" Apr 20 22:31:04.790692 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.790645 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7w5g\" (UniqueName: \"kubernetes.io/projected/f8401295-6525-4913-9416-3434209df386-kube-api-access-m7w5g\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj\" (UID: \"f8401295-6525-4913-9416-3434209df386\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj" Apr 20 22:31:04.882500 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.882461 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj" Apr 20 22:31:04.986420 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.986388 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs"] Apr 20 22:31:04.989346 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.989327 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs" Apr 20 22:31:04.999911 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:04.999886 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs"] Apr 20 22:31:05.020230 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.019915 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj"] Apr 20 22:31:05.022847 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:31:05.022814 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8401295_6525_4913_9416_3434209df386.slice/crio-ca16fa916295784c73a20cf5e24a41894527673a7df2d0225a910272a7976cca WatchSource:0}: Error finding container ca16fa916295784c73a20cf5e24a41894527673a7df2d0225a910272a7976cca: Status 404 returned error can't find the container with id ca16fa916295784c73a20cf5e24a41894527673a7df2d0225a910272a7976cca Apr 20 22:31:05.082110 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.082082 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b3f5f16-16a3-4497-83c3-4e0bc9f5907e-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs\" (UID: \"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs" Apr 20 22:31:05.082215 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.082124 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b3f5f16-16a3-4497-83c3-4e0bc9f5907e-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs\" (UID: \"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e\") 
" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs" Apr 20 22:31:05.082215 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.082188 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvh69\" (UniqueName: \"kubernetes.io/projected/9b3f5f16-16a3-4497-83c3-4e0bc9f5907e-kube-api-access-bvh69\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs\" (UID: \"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs" Apr 20 22:31:05.183009 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.182907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvh69\" (UniqueName: \"kubernetes.io/projected/9b3f5f16-16a3-4497-83c3-4e0bc9f5907e-kube-api-access-bvh69\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs\" (UID: \"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs" Apr 20 22:31:05.183165 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.183033 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b3f5f16-16a3-4497-83c3-4e0bc9f5907e-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs\" (UID: \"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs" Apr 20 22:31:05.183165 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.183056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b3f5f16-16a3-4497-83c3-4e0bc9f5907e-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs\" (UID: \"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs" Apr 20 22:31:05.183474 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.183452 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b3f5f16-16a3-4497-83c3-4e0bc9f5907e-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs\" (UID: \"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs" Apr 20 22:31:05.183474 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.183463 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b3f5f16-16a3-4497-83c3-4e0bc9f5907e-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs\" (UID: \"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs" Apr 20 22:31:05.192476 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.192448 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvh69\" (UniqueName: \"kubernetes.io/projected/9b3f5f16-16a3-4497-83c3-4e0bc9f5907e-kube-api-access-bvh69\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs\" (UID: \"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs" Apr 20 22:31:05.301893 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.301857 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs" Apr 20 22:31:05.399795 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.399650 2575 generic.go:358] "Generic (PLEG): container finished" podID="f8401295-6525-4913-9416-3434209df386" containerID="c03688f95880f08a408f8d86c3c9bcfc0ea771d6e9649db2195606ba6a309859" exitCode=0 Apr 20 22:31:05.399795 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.399734 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj" event={"ID":"f8401295-6525-4913-9416-3434209df386","Type":"ContainerDied","Data":"c03688f95880f08a408f8d86c3c9bcfc0ea771d6e9649db2195606ba6a309859"} Apr 20 22:31:05.399795 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.399766 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj" event={"ID":"f8401295-6525-4913-9416-3434209df386","Type":"ContainerStarted","Data":"ca16fa916295784c73a20cf5e24a41894527673a7df2d0225a910272a7976cca"} Apr 20 22:31:05.405306 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.405278 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv"] Apr 20 22:31:05.409213 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.409192 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv" Apr 20 22:31:05.421434 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.421406 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv"] Apr 20 22:31:05.444375 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.444354 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs"] Apr 20 22:31:05.446305 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:31:05.446277 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b3f5f16_16a3_4497_83c3_4e0bc9f5907e.slice/crio-6e91c08cec327e2c4ad49b8459bed2b92a5a28ccb3625d875eada2b4bbbc97fe WatchSource:0}: Error finding container 6e91c08cec327e2c4ad49b8459bed2b92a5a28ccb3625d875eada2b4bbbc97fe: Status 404 returned error can't find the container with id 6e91c08cec327e2c4ad49b8459bed2b92a5a28ccb3625d875eada2b4bbbc97fe Apr 20 22:31:05.485450 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.485263 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbx7b\" (UniqueName: \"kubernetes.io/projected/366a9476-8414-4b17-b0c7-5f695a36551e-kube-api-access-dbx7b\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv\" (UID: \"366a9476-8414-4b17-b0c7-5f695a36551e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv" Apr 20 22:31:05.485450 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.485334 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/366a9476-8414-4b17-b0c7-5f695a36551e-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv\" (UID: 
\"366a9476-8414-4b17-b0c7-5f695a36551e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv" Apr 20 22:31:05.485450 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.485380 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/366a9476-8414-4b17-b0c7-5f695a36551e-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv\" (UID: \"366a9476-8414-4b17-b0c7-5f695a36551e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv" Apr 20 22:31:05.586734 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.586664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbx7b\" (UniqueName: \"kubernetes.io/projected/366a9476-8414-4b17-b0c7-5f695a36551e-kube-api-access-dbx7b\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv\" (UID: \"366a9476-8414-4b17-b0c7-5f695a36551e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv" Apr 20 22:31:05.587130 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.586745 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/366a9476-8414-4b17-b0c7-5f695a36551e-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv\" (UID: \"366a9476-8414-4b17-b0c7-5f695a36551e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv" Apr 20 22:31:05.587130 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.586775 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/366a9476-8414-4b17-b0c7-5f695a36551e-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv\" (UID: \"366a9476-8414-4b17-b0c7-5f695a36551e\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv" Apr 20 22:31:05.587222 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.587160 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/366a9476-8414-4b17-b0c7-5f695a36551e-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv\" (UID: \"366a9476-8414-4b17-b0c7-5f695a36551e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv" Apr 20 22:31:05.587222 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.587184 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/366a9476-8414-4b17-b0c7-5f695a36551e-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv\" (UID: \"366a9476-8414-4b17-b0c7-5f695a36551e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv" Apr 20 22:31:05.599050 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.599025 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbx7b\" (UniqueName: \"kubernetes.io/projected/366a9476-8414-4b17-b0c7-5f695a36551e-kube-api-access-dbx7b\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv\" (UID: \"366a9476-8414-4b17-b0c7-5f695a36551e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv" Apr 20 22:31:05.720412 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.720314 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv" Apr 20 22:31:05.849728 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.849103 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv"] Apr 20 22:31:05.997334 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.997259 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd"] Apr 20 22:31:05.999739 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:05.999724 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd" Apr 20 22:31:06.020789 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.020760 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd"] Apr 20 22:31:06.091259 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.091226 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f3603e9-51e8-497a-8783-772b2cce919c-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd\" (UID: \"1f3603e9-51e8-497a-8783-772b2cce919c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd" Apr 20 22:31:06.091420 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.091278 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f3603e9-51e8-497a-8783-772b2cce919c-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd\" (UID: \"1f3603e9-51e8-497a-8783-772b2cce919c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd" Apr 
20 22:31:06.091420 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.091399 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25gnc\" (UniqueName: \"kubernetes.io/projected/1f3603e9-51e8-497a-8783-772b2cce919c-kube-api-access-25gnc\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd\" (UID: \"1f3603e9-51e8-497a-8783-772b2cce919c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd" Apr 20 22:31:06.192532 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.192505 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25gnc\" (UniqueName: \"kubernetes.io/projected/1f3603e9-51e8-497a-8783-772b2cce919c-kube-api-access-25gnc\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd\" (UID: \"1f3603e9-51e8-497a-8783-772b2cce919c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd" Apr 20 22:31:06.192650 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.192547 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f3603e9-51e8-497a-8783-772b2cce919c-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd\" (UID: \"1f3603e9-51e8-497a-8783-772b2cce919c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd" Apr 20 22:31:06.192650 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.192577 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f3603e9-51e8-497a-8783-772b2cce919c-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd\" (UID: \"1f3603e9-51e8-497a-8783-772b2cce919c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd" Apr 20 22:31:06.192933 ip-10-0-132-177 
kubenswrapper[2575]: I0420 22:31:06.192916 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f3603e9-51e8-497a-8783-772b2cce919c-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd\" (UID: \"1f3603e9-51e8-497a-8783-772b2cce919c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd" Apr 20 22:31:06.192973 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.192957 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f3603e9-51e8-497a-8783-772b2cce919c-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd\" (UID: \"1f3603e9-51e8-497a-8783-772b2cce919c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd" Apr 20 22:31:06.206778 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.206750 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25gnc\" (UniqueName: \"kubernetes.io/projected/1f3603e9-51e8-497a-8783-772b2cce919c-kube-api-access-25gnc\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd\" (UID: \"1f3603e9-51e8-497a-8783-772b2cce919c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd" Apr 20 22:31:06.314962 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.314928 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd" Apr 20 22:31:06.406331 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.406289 2575 generic.go:358] "Generic (PLEG): container finished" podID="f8401295-6525-4913-9416-3434209df386" containerID="1d2d556fa020357caec280b750113cc6f29f31ae7a1291c16c4d5356cc8c1c02" exitCode=0 Apr 20 22:31:06.406479 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.406390 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj" event={"ID":"f8401295-6525-4913-9416-3434209df386","Type":"ContainerDied","Data":"1d2d556fa020357caec280b750113cc6f29f31ae7a1291c16c4d5356cc8c1c02"} Apr 20 22:31:06.407942 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.407907 2575 generic.go:358] "Generic (PLEG): container finished" podID="9b3f5f16-16a3-4497-83c3-4e0bc9f5907e" containerID="5e11e87d0affecd616e37c3f1157f17be2d3e3df1c0b33d47002b53d1a0a1868" exitCode=0 Apr 20 22:31:06.408065 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.407999 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs" event={"ID":"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e","Type":"ContainerDied","Data":"5e11e87d0affecd616e37c3f1157f17be2d3e3df1c0b33d47002b53d1a0a1868"} Apr 20 22:31:06.408065 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.408028 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs" event={"ID":"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e","Type":"ContainerStarted","Data":"6e91c08cec327e2c4ad49b8459bed2b92a5a28ccb3625d875eada2b4bbbc97fe"} Apr 20 22:31:06.409918 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.409895 2575 generic.go:358] "Generic (PLEG): container finished" podID="366a9476-8414-4b17-b0c7-5f695a36551e" 
containerID="5731804cc81267138bcfadbcc23814231cb96ea3d929f8f019e48529ff9b5532" exitCode=0 Apr 20 22:31:06.410251 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.409970 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv" event={"ID":"366a9476-8414-4b17-b0c7-5f695a36551e","Type":"ContainerDied","Data":"5731804cc81267138bcfadbcc23814231cb96ea3d929f8f019e48529ff9b5532"} Apr 20 22:31:06.410251 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.409992 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv" event={"ID":"366a9476-8414-4b17-b0c7-5f695a36551e","Type":"ContainerStarted","Data":"fa6596d1b78aa3ef050d0496b613f2f48d7998f3118f49e14c3a186dc11d8c1a"} Apr 20 22:31:06.449995 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:06.449969 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd"] Apr 20 22:31:06.451841 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:31:06.451812 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f3603e9_51e8_497a_8783_772b2cce919c.slice/crio-fc615eae3fa373c936f9df9a2db25448620d4aef870e35c006a173885feede09 WatchSource:0}: Error finding container fc615eae3fa373c936f9df9a2db25448620d4aef870e35c006a173885feede09: Status 404 returned error can't find the container with id fc615eae3fa373c936f9df9a2db25448620d4aef870e35c006a173885feede09 Apr 20 22:31:07.416052 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:07.416015 2575 generic.go:358] "Generic (PLEG): container finished" podID="f8401295-6525-4913-9416-3434209df386" containerID="75477d314d45c1ee11cbeeb98255e2bf4233f809a420ac7336325fca567dd200" exitCode=0 Apr 20 22:31:07.416516 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:07.416084 2575 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj" event={"ID":"f8401295-6525-4913-9416-3434209df386","Type":"ContainerDied","Data":"75477d314d45c1ee11cbeeb98255e2bf4233f809a420ac7336325fca567dd200"} Apr 20 22:31:07.417502 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:07.417478 2575 generic.go:358] "Generic (PLEG): container finished" podID="9b3f5f16-16a3-4497-83c3-4e0bc9f5907e" containerID="ac77c7bf6bda676aa8f1d68ebd528981b0acb6d3fd7512702f9822681a3227a4" exitCode=0 Apr 20 22:31:07.417627 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:07.417509 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs" event={"ID":"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e","Type":"ContainerDied","Data":"ac77c7bf6bda676aa8f1d68ebd528981b0acb6d3fd7512702f9822681a3227a4"} Apr 20 22:31:07.419044 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:07.419024 2575 generic.go:358] "Generic (PLEG): container finished" podID="366a9476-8414-4b17-b0c7-5f695a36551e" containerID="3c1dea0fd4abee12b369baae4e83fe02c0053ffa9688387e47dfd12e16bde0ae" exitCode=0 Apr 20 22:31:07.419134 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:07.419096 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv" event={"ID":"366a9476-8414-4b17-b0c7-5f695a36551e","Type":"ContainerDied","Data":"3c1dea0fd4abee12b369baae4e83fe02c0053ffa9688387e47dfd12e16bde0ae"} Apr 20 22:31:07.420440 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:07.420419 2575 generic.go:358] "Generic (PLEG): container finished" podID="1f3603e9-51e8-497a-8783-772b2cce919c" containerID="663940207dde270a0033dc89cf5bea39c45bd2b65efb5c2121b858d15a77705a" exitCode=0 Apr 20 22:31:07.420545 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:07.420486 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd" event={"ID":"1f3603e9-51e8-497a-8783-772b2cce919c","Type":"ContainerDied","Data":"663940207dde270a0033dc89cf5bea39c45bd2b65efb5c2121b858d15a77705a"} Apr 20 22:31:07.420545 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:07.420502 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd" event={"ID":"1f3603e9-51e8-497a-8783-772b2cce919c","Type":"ContainerStarted","Data":"fc615eae3fa373c936f9df9a2db25448620d4aef870e35c006a173885feede09"} Apr 20 22:31:08.425472 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:08.425438 2575 generic.go:358] "Generic (PLEG): container finished" podID="9b3f5f16-16a3-4497-83c3-4e0bc9f5907e" containerID="c9fb0ab34f6f9ca8693f386ae5b1ef93c209f0a11d7f20a4c986eeacab5b6f07" exitCode=0 Apr 20 22:31:08.425945 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:08.425505 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs" event={"ID":"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e","Type":"ContainerDied","Data":"c9fb0ab34f6f9ca8693f386ae5b1ef93c209f0a11d7f20a4c986eeacab5b6f07"} Apr 20 22:31:08.427274 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:08.427245 2575 generic.go:358] "Generic (PLEG): container finished" podID="366a9476-8414-4b17-b0c7-5f695a36551e" containerID="e19625b240f0b18804cc568b64753833efb07a3917a0989649f958f3a5d48896" exitCode=0 Apr 20 22:31:08.427397 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:08.427328 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv" event={"ID":"366a9476-8414-4b17-b0c7-5f695a36551e","Type":"ContainerDied","Data":"e19625b240f0b18804cc568b64753833efb07a3917a0989649f958f3a5d48896"} Apr 20 22:31:08.428833 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:08.428806 2575 
generic.go:358] "Generic (PLEG): container finished" podID="1f3603e9-51e8-497a-8783-772b2cce919c" containerID="8d0ce37d8424d018bd3443cc524b6abb013c0eb60c740284d16184aab359badd" exitCode=0 Apr 20 22:31:08.428998 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:08.428846 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd" event={"ID":"1f3603e9-51e8-497a-8783-772b2cce919c","Type":"ContainerDied","Data":"8d0ce37d8424d018bd3443cc524b6abb013c0eb60c740284d16184aab359badd"} Apr 20 22:31:08.560992 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:08.560963 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj" Apr 20 22:31:08.717585 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:08.717495 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8401295-6525-4913-9416-3434209df386-util\") pod \"f8401295-6525-4913-9416-3434209df386\" (UID: \"f8401295-6525-4913-9416-3434209df386\") " Apr 20 22:31:08.717585 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:08.717551 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7w5g\" (UniqueName: \"kubernetes.io/projected/f8401295-6525-4913-9416-3434209df386-kube-api-access-m7w5g\") pod \"f8401295-6525-4913-9416-3434209df386\" (UID: \"f8401295-6525-4913-9416-3434209df386\") " Apr 20 22:31:08.717799 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:08.717604 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8401295-6525-4913-9416-3434209df386-bundle\") pod \"f8401295-6525-4913-9416-3434209df386\" (UID: \"f8401295-6525-4913-9416-3434209df386\") " Apr 20 22:31:08.718175 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:08.718148 2575 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8401295-6525-4913-9416-3434209df386-bundle" (OuterVolumeSpecName: "bundle") pod "f8401295-6525-4913-9416-3434209df386" (UID: "f8401295-6525-4913-9416-3434209df386"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:31:08.720024 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:08.719994 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8401295-6525-4913-9416-3434209df386-kube-api-access-m7w5g" (OuterVolumeSpecName: "kube-api-access-m7w5g") pod "f8401295-6525-4913-9416-3434209df386" (UID: "f8401295-6525-4913-9416-3434209df386"). InnerVolumeSpecName "kube-api-access-m7w5g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:31:08.722361 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:08.722339 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8401295-6525-4913-9416-3434209df386-util" (OuterVolumeSpecName: "util") pod "f8401295-6525-4913-9416-3434209df386" (UID: "f8401295-6525-4913-9416-3434209df386"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:31:08.818727 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:08.818657 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8401295-6525-4913-9416-3434209df386-util\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:31:08.818727 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:08.818720 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m7w5g\" (UniqueName: \"kubernetes.io/projected/f8401295-6525-4913-9416-3434209df386-kube-api-access-m7w5g\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:31:08.818727 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:08.818731 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8401295-6525-4913-9416-3434209df386-bundle\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:31:09.434764 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.434716 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj" event={"ID":"f8401295-6525-4913-9416-3434209df386","Type":"ContainerDied","Data":"ca16fa916295784c73a20cf5e24a41894527673a7df2d0225a910272a7976cca"} Apr 20 22:31:09.434764 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.434765 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca16fa916295784c73a20cf5e24a41894527673a7df2d0225a910272a7976cca" Apr 20 22:31:09.435252 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.434734 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj" Apr 20 22:31:09.436803 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.436769 2575 generic.go:358] "Generic (PLEG): container finished" podID="1f3603e9-51e8-497a-8783-772b2cce919c" containerID="a2e9b6a7e259552cb3ebf0caf0fda299c891a393f24f48f3aae9c654dede1f62" exitCode=0 Apr 20 22:31:09.437038 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.437000 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd" event={"ID":"1f3603e9-51e8-497a-8783-772b2cce919c","Type":"ContainerDied","Data":"a2e9b6a7e259552cb3ebf0caf0fda299c891a393f24f48f3aae9c654dede1f62"} Apr 20 22:31:09.587004 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.586979 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs" Apr 20 22:31:09.590068 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.590044 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv" Apr 20 22:31:09.726448 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.726358 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b3f5f16-16a3-4497-83c3-4e0bc9f5907e-util\") pod \"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e\" (UID: \"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e\") " Apr 20 22:31:09.726448 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.726406 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/366a9476-8414-4b17-b0c7-5f695a36551e-util\") pod \"366a9476-8414-4b17-b0c7-5f695a36551e\" (UID: \"366a9476-8414-4b17-b0c7-5f695a36551e\") " Apr 20 22:31:09.726448 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.726439 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b3f5f16-16a3-4497-83c3-4e0bc9f5907e-bundle\") pod \"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e\" (UID: \"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e\") " Apr 20 22:31:09.726732 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.726487 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbx7b\" (UniqueName: \"kubernetes.io/projected/366a9476-8414-4b17-b0c7-5f695a36551e-kube-api-access-dbx7b\") pod \"366a9476-8414-4b17-b0c7-5f695a36551e\" (UID: \"366a9476-8414-4b17-b0c7-5f695a36551e\") " Apr 20 22:31:09.726732 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.726520 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvh69\" (UniqueName: \"kubernetes.io/projected/9b3f5f16-16a3-4497-83c3-4e0bc9f5907e-kube-api-access-bvh69\") pod \"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e\" (UID: \"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e\") " Apr 20 22:31:09.726732 ip-10-0-132-177 
kubenswrapper[2575]: I0420 22:31:09.726535 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/366a9476-8414-4b17-b0c7-5f695a36551e-bundle\") pod \"366a9476-8414-4b17-b0c7-5f695a36551e\" (UID: \"366a9476-8414-4b17-b0c7-5f695a36551e\") " Apr 20 22:31:09.727192 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.727156 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/366a9476-8414-4b17-b0c7-5f695a36551e-bundle" (OuterVolumeSpecName: "bundle") pod "366a9476-8414-4b17-b0c7-5f695a36551e" (UID: "366a9476-8414-4b17-b0c7-5f695a36551e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:31:09.727453 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.727207 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b3f5f16-16a3-4497-83c3-4e0bc9f5907e-bundle" (OuterVolumeSpecName: "bundle") pod "9b3f5f16-16a3-4497-83c3-4e0bc9f5907e" (UID: "9b3f5f16-16a3-4497-83c3-4e0bc9f5907e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:31:09.728905 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.728876 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/366a9476-8414-4b17-b0c7-5f695a36551e-kube-api-access-dbx7b" (OuterVolumeSpecName: "kube-api-access-dbx7b") pod "366a9476-8414-4b17-b0c7-5f695a36551e" (UID: "366a9476-8414-4b17-b0c7-5f695a36551e"). InnerVolumeSpecName "kube-api-access-dbx7b". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:31:09.729018 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.728970 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b3f5f16-16a3-4497-83c3-4e0bc9f5907e-kube-api-access-bvh69" (OuterVolumeSpecName: "kube-api-access-bvh69") pod "9b3f5f16-16a3-4497-83c3-4e0bc9f5907e" (UID: "9b3f5f16-16a3-4497-83c3-4e0bc9f5907e"). InnerVolumeSpecName "kube-api-access-bvh69". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:31:09.734771 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.734743 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b3f5f16-16a3-4497-83c3-4e0bc9f5907e-util" (OuterVolumeSpecName: "util") pod "9b3f5f16-16a3-4497-83c3-4e0bc9f5907e" (UID: "9b3f5f16-16a3-4497-83c3-4e0bc9f5907e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:31:09.735300 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.735283 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/366a9476-8414-4b17-b0c7-5f695a36551e-util" (OuterVolumeSpecName: "util") pod "366a9476-8414-4b17-b0c7-5f695a36551e" (UID: "366a9476-8414-4b17-b0c7-5f695a36551e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:31:09.827775 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.827733 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bvh69\" (UniqueName: \"kubernetes.io/projected/9b3f5f16-16a3-4497-83c3-4e0bc9f5907e-kube-api-access-bvh69\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:31:09.827775 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.827767 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/366a9476-8414-4b17-b0c7-5f695a36551e-bundle\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:31:09.827775 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.827778 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b3f5f16-16a3-4497-83c3-4e0bc9f5907e-util\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:31:09.827775 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.827785 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/366a9476-8414-4b17-b0c7-5f695a36551e-util\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:31:09.828038 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.827793 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b3f5f16-16a3-4497-83c3-4e0bc9f5907e-bundle\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:31:09.828038 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:09.827802 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dbx7b\" (UniqueName: \"kubernetes.io/projected/366a9476-8414-4b17-b0c7-5f695a36551e-kube-api-access-dbx7b\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\"" Apr 20 22:31:10.442745 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:10.442710 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs" event={"ID":"9b3f5f16-16a3-4497-83c3-4e0bc9f5907e","Type":"ContainerDied","Data":"6e91c08cec327e2c4ad49b8459bed2b92a5a28ccb3625d875eada2b4bbbc97fe"}
Apr 20 22:31:10.442745 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:10.442749 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e91c08cec327e2c4ad49b8459bed2b92a5a28ccb3625d875eada2b4bbbc97fe"
Apr 20 22:31:10.443257 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:10.442762 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs"
Apr 20 22:31:10.444444 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:10.444414 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv"
Apr 20 22:31:10.444444 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:10.444419 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv" event={"ID":"366a9476-8414-4b17-b0c7-5f695a36551e","Type":"ContainerDied","Data":"fa6596d1b78aa3ef050d0496b613f2f48d7998f3118f49e14c3a186dc11d8c1a"}
Apr 20 22:31:10.444592 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:10.444463 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa6596d1b78aa3ef050d0496b613f2f48d7998f3118f49e14c3a186dc11d8c1a"
Apr 20 22:31:10.569218 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:10.569196 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd"
Apr 20 22:31:10.734705 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:10.734591 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f3603e9-51e8-497a-8783-772b2cce919c-bundle\") pod \"1f3603e9-51e8-497a-8783-772b2cce919c\" (UID: \"1f3603e9-51e8-497a-8783-772b2cce919c\") "
Apr 20 22:31:10.734865 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:10.734731 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f3603e9-51e8-497a-8783-772b2cce919c-util\") pod \"1f3603e9-51e8-497a-8783-772b2cce919c\" (UID: \"1f3603e9-51e8-497a-8783-772b2cce919c\") "
Apr 20 22:31:10.734865 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:10.734770 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25gnc\" (UniqueName: \"kubernetes.io/projected/1f3603e9-51e8-497a-8783-772b2cce919c-kube-api-access-25gnc\") pod \"1f3603e9-51e8-497a-8783-772b2cce919c\" (UID: \"1f3603e9-51e8-497a-8783-772b2cce919c\") "
Apr 20 22:31:10.735167 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:10.735140 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f3603e9-51e8-497a-8783-772b2cce919c-bundle" (OuterVolumeSpecName: "bundle") pod "1f3603e9-51e8-497a-8783-772b2cce919c" (UID: "1f3603e9-51e8-497a-8783-772b2cce919c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 22:31:10.737022 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:10.736995 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f3603e9-51e8-497a-8783-772b2cce919c-kube-api-access-25gnc" (OuterVolumeSpecName: "kube-api-access-25gnc") pod "1f3603e9-51e8-497a-8783-772b2cce919c" (UID: "1f3603e9-51e8-497a-8783-772b2cce919c"). InnerVolumeSpecName "kube-api-access-25gnc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 22:31:10.740180 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:10.740161 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f3603e9-51e8-497a-8783-772b2cce919c-util" (OuterVolumeSpecName: "util") pod "1f3603e9-51e8-497a-8783-772b2cce919c" (UID: "1f3603e9-51e8-497a-8783-772b2cce919c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 22:31:10.835423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:10.835368 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f3603e9-51e8-497a-8783-772b2cce919c-bundle\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:31:10.835423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:10.835415 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f3603e9-51e8-497a-8783-772b2cce919c-util\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:31:10.835423 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:10.835426 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-25gnc\" (UniqueName: \"kubernetes.io/projected/1f3603e9-51e8-497a-8783-772b2cce919c-kube-api-access-25gnc\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:31:11.449320 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:11.449290 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd" event={"ID":"1f3603e9-51e8-497a-8783-772b2cce919c","Type":"ContainerDied","Data":"fc615eae3fa373c936f9df9a2db25448620d4aef870e35c006a173885feede09"}
Apr 20 22:31:11.449320 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:11.449317 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd"
Apr 20 22:31:11.449320 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:11.449327 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc615eae3fa373c936f9df9a2db25448620d4aef870e35c006a173885feede09"
Apr 20 22:31:21.861145 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:21.861113 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6774bd9776-8g8db"]
Apr 20 22:31:46.879924 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:46.879880 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6774bd9776-8g8db" podUID="42e6d741-1c79-4873-b919-bdd854703c6f" containerName="console" containerID="cri-o://b15c25559fc2d38ce1616ef5c4643ed30f800ea682041ffac8f4e49e23362b2b" gracePeriod=15
Apr 20 22:31:47.122116 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.122092 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6774bd9776-8g8db_42e6d741-1c79-4873-b919-bdd854703c6f/console/0.log"
Apr 20 22:31:47.122225 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.122158 2575 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-6774bd9776-8g8db"
Apr 20 22:31:47.222485 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.222404 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-oauth-serving-cert\") pod \"42e6d741-1c79-4873-b919-bdd854703c6f\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") "
Apr 20 22:31:47.222485 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.222442 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42e6d741-1c79-4873-b919-bdd854703c6f-console-oauth-config\") pod \"42e6d741-1c79-4873-b919-bdd854703c6f\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") "
Apr 20 22:31:47.222485 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.222466 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-trusted-ca-bundle\") pod \"42e6d741-1c79-4873-b919-bdd854703c6f\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") "
Apr 20 22:31:47.222485 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.222482 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-service-ca\") pod \"42e6d741-1c79-4873-b919-bdd854703c6f\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") "
Apr 20 22:31:47.222850 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.222529 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8526\" (UniqueName: \"kubernetes.io/projected/42e6d741-1c79-4873-b919-bdd854703c6f-kube-api-access-s8526\") pod \"42e6d741-1c79-4873-b919-bdd854703c6f\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") "
Apr 20 22:31:47.222850 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.222545 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-console-config\") pod \"42e6d741-1c79-4873-b919-bdd854703c6f\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") "
Apr 20 22:31:47.222850 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.222577 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42e6d741-1c79-4873-b919-bdd854703c6f-console-serving-cert\") pod \"42e6d741-1c79-4873-b919-bdd854703c6f\" (UID: \"42e6d741-1c79-4873-b919-bdd854703c6f\") "
Apr 20 22:31:47.223006 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.222964 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "42e6d741-1c79-4873-b919-bdd854703c6f" (UID: "42e6d741-1c79-4873-b919-bdd854703c6f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 22:31:47.223006 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.222866 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "42e6d741-1c79-4873-b919-bdd854703c6f" (UID: "42e6d741-1c79-4873-b919-bdd854703c6f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 22:31:47.223107 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.223049 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-console-config" (OuterVolumeSpecName: "console-config") pod "42e6d741-1c79-4873-b919-bdd854703c6f" (UID: "42e6d741-1c79-4873-b919-bdd854703c6f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 22:31:47.223107 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.223058 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-service-ca" (OuterVolumeSpecName: "service-ca") pod "42e6d741-1c79-4873-b919-bdd854703c6f" (UID: "42e6d741-1c79-4873-b919-bdd854703c6f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 22:31:47.224810 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.224789 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e6d741-1c79-4873-b919-bdd854703c6f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "42e6d741-1c79-4873-b919-bdd854703c6f" (UID: "42e6d741-1c79-4873-b919-bdd854703c6f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 22:31:47.224889 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.224829 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e6d741-1c79-4873-b919-bdd854703c6f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "42e6d741-1c79-4873-b919-bdd854703c6f" (UID: "42e6d741-1c79-4873-b919-bdd854703c6f"). InnerVolumeSpecName "console-serving-cert".
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 22:31:47.224957 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.224939 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e6d741-1c79-4873-b919-bdd854703c6f-kube-api-access-s8526" (OuterVolumeSpecName: "kube-api-access-s8526") pod "42e6d741-1c79-4873-b919-bdd854703c6f" (UID: "42e6d741-1c79-4873-b919-bdd854703c6f"). InnerVolumeSpecName "kube-api-access-s8526". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 22:31:47.323443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.323405 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s8526\" (UniqueName: \"kubernetes.io/projected/42e6d741-1c79-4873-b919-bdd854703c6f-kube-api-access-s8526\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:31:47.323443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.323437 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-console-config\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:31:47.323443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.323446 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42e6d741-1c79-4873-b919-bdd854703c6f-console-serving-cert\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:31:47.323443 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.323455 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-oauth-serving-cert\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:31:47.323744 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.323464 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42e6d741-1c79-4873-b919-bdd854703c6f-console-oauth-config\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:31:47.323744 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.323473 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-trusted-ca-bundle\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:31:47.323744 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.323483 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42e6d741-1c79-4873-b919-bdd854703c6f-service-ca\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:31:47.582380 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.582301 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6774bd9776-8g8db_42e6d741-1c79-4873-b919-bdd854703c6f/console/0.log"
Apr 20 22:31:47.582380 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.582345 2575 generic.go:358] "Generic (PLEG): container finished" podID="42e6d741-1c79-4873-b919-bdd854703c6f" containerID="b15c25559fc2d38ce1616ef5c4643ed30f800ea682041ffac8f4e49e23362b2b" exitCode=2
Apr 20 22:31:47.582558 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.582418 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6774bd9776-8g8db"
Apr 20 22:31:47.582558 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.582432 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6774bd9776-8g8db" event={"ID":"42e6d741-1c79-4873-b919-bdd854703c6f","Type":"ContainerDied","Data":"b15c25559fc2d38ce1616ef5c4643ed30f800ea682041ffac8f4e49e23362b2b"}
Apr 20 22:31:47.582558 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.582472 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6774bd9776-8g8db" event={"ID":"42e6d741-1c79-4873-b919-bdd854703c6f","Type":"ContainerDied","Data":"3ade62012559d6a4b4e9d2287e33a71402f299d4b08daf1de1e7bd0c5ad24780"}
Apr 20 22:31:47.582558 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.582487 2575 scope.go:117] "RemoveContainer" containerID="b15c25559fc2d38ce1616ef5c4643ed30f800ea682041ffac8f4e49e23362b2b"
Apr 20 22:31:47.591595 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.591577 2575 scope.go:117] "RemoveContainer" containerID="b15c25559fc2d38ce1616ef5c4643ed30f800ea682041ffac8f4e49e23362b2b"
Apr 20 22:31:47.591889 ip-10-0-132-177 kubenswrapper[2575]: E0420 22:31:47.591867 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b15c25559fc2d38ce1616ef5c4643ed30f800ea682041ffac8f4e49e23362b2b\": container with ID starting with b15c25559fc2d38ce1616ef5c4643ed30f800ea682041ffac8f4e49e23362b2b not found: ID does not exist" containerID="b15c25559fc2d38ce1616ef5c4643ed30f800ea682041ffac8f4e49e23362b2b"
Apr 20 22:31:47.591965 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.591897 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b15c25559fc2d38ce1616ef5c4643ed30f800ea682041ffac8f4e49e23362b2b"} err="failed to get container status \"b15c25559fc2d38ce1616ef5c4643ed30f800ea682041ffac8f4e49e23362b2b\": rpc error: code =
NotFound desc = could not find container \"b15c25559fc2d38ce1616ef5c4643ed30f800ea682041ffac8f4e49e23362b2b\": container with ID starting with b15c25559fc2d38ce1616ef5c4643ed30f800ea682041ffac8f4e49e23362b2b not found: ID does not exist"
Apr 20 22:31:47.608446 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.608413 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6774bd9776-8g8db"]
Apr 20 22:31:47.615646 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:47.615620 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6774bd9776-8g8db"]
Apr 20 22:31:49.013345 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:31:49.013310 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e6d741-1c79-4873-b919-bdd854703c6f" path="/var/lib/kubelet/pods/42e6d741-1c79-4873-b919-bdd854703c6f/volumes"
Apr 20 22:32:20.029834 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.029798 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"]
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030143 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b3f5f16-16a3-4497-83c3-4e0bc9f5907e" containerName="util"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030160 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3f5f16-16a3-4497-83c3-4e0bc9f5907e" containerName="util"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030181 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42e6d741-1c79-4873-b919-bdd854703c6f" containerName="console"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030187 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e6d741-1c79-4873-b919-bdd854703c6f" containerName="console"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030196 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8401295-6525-4913-9416-3434209df386" containerName="extract"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030201 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8401295-6525-4913-9416-3434209df386" containerName="extract"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030208 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b3f5f16-16a3-4497-83c3-4e0bc9f5907e" containerName="extract"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030213 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3f5f16-16a3-4497-83c3-4e0bc9f5907e" containerName="extract"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030221 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8401295-6525-4913-9416-3434209df386" containerName="util"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030226 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8401295-6525-4913-9416-3434209df386" containerName="util"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030231 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f3603e9-51e8-497a-8783-772b2cce919c" containerName="util"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030237 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f3603e9-51e8-497a-8783-772b2cce919c" containerName="util"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030244 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f3603e9-51e8-497a-8783-772b2cce919c" containerName="pull"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030249 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f3603e9-51e8-497a-8783-772b2cce919c" containerName="pull"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030255 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8401295-6525-4913-9416-3434209df386" containerName="pull"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030259 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8401295-6525-4913-9416-3434209df386" containerName="pull"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030264 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="366a9476-8414-4b17-b0c7-5f695a36551e" containerName="pull"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030268 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="366a9476-8414-4b17-b0c7-5f695a36551e" containerName="pull"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030276 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b3f5f16-16a3-4497-83c3-4e0bc9f5907e" containerName="pull"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030281 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3f5f16-16a3-4497-83c3-4e0bc9f5907e" containerName="pull"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030287 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f3603e9-51e8-497a-8783-772b2cce919c" containerName="extract"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030292 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f3603e9-51e8-497a-8783-772b2cce919c" containerName="extract"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030300 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="366a9476-8414-4b17-b0c7-5f695a36551e" containerName="util"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030304 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="366a9476-8414-4b17-b0c7-5f695a36551e" containerName="util"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030312 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="366a9476-8414-4b17-b0c7-5f695a36551e" containerName="extract"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030316 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="366a9476-8414-4b17-b0c7-5f695a36551e" containerName="extract"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030361 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="42e6d741-1c79-4873-b919-bdd854703c6f" containerName="console"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030369 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="366a9476-8414-4b17-b0c7-5f695a36551e" containerName="extract"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030376 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8401295-6525-4913-9416-3434209df386" containerName="extract"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030383 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f3603e9-51e8-497a-8783-772b2cce919c" containerName="extract"
Apr 20 22:32:20.030384 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.030389 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b3f5f16-16a3-4497-83c3-4e0bc9f5907e" containerName="extract"
Apr 20 22:32:20.033514 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.033470 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.036102 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.036079 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-bs6c7\""
Apr 20 22:32:20.046302 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.046273 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"]
Apr 20 22:32:20.211169 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.211125 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/8622ab1c-5bef-48b8-9d17-d3844fafac5c-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.211340 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.211183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/8622ab1c-5bef-48b8-9d17-d3844fafac5c-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.211340 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.211236 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/8622ab1c-5bef-48b8-9d17-d3844fafac5c-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.211340 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.211263 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8622ab1c-5bef-48b8-9d17-d3844fafac5c-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.211340 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.211290 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8622ab1c-5bef-48b8-9d17-d3844fafac5c-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.211340 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.211332 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/8622ab1c-5bef-48b8-9d17-d3844fafac5c-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.211509 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.211358 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm67n\" (UniqueName: \"kubernetes.io/projected/8622ab1c-5bef-48b8-9d17-d3844fafac5c-kube-api-access-jm67n\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.211509 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.211379 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/8622ab1c-5bef-48b8-9d17-d3844fafac5c-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.211509 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.211409 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/8622ab1c-5bef-48b8-9d17-d3844fafac5c-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.312434 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.312346 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/8622ab1c-5bef-48b8-9d17-d3844fafac5c-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.312434 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.312383 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jm67n\" (UniqueName: \"kubernetes.io/projected/8622ab1c-5bef-48b8-9d17-d3844fafac5c-kube-api-access-jm67n\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.312434 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.312403 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/8622ab1c-5bef-48b8-9d17-d3844fafac5c-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.312434 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.312438 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/8622ab1c-5bef-48b8-9d17-d3844fafac5c-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.312767 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.312466 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/8622ab1c-5bef-48b8-9d17-d3844fafac5c-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.312767 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.312500 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/8622ab1c-5bef-48b8-9d17-d3844fafac5c-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.312767 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.312534 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/8622ab1c-5bef-48b8-9d17-d3844fafac5c-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.312767 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.312561 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8622ab1c-5bef-48b8-9d17-d3844fafac5c-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.312767 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.312590 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8622ab1c-5bef-48b8-9d17-d3844fafac5c-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.313017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.312981 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/8622ab1c-5bef-48b8-9d17-d3844fafac5c-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.313061 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.313013 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/8622ab1c-5bef-48b8-9d17-d3844fafac5c-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.313061 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.313029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/8622ab1c-5bef-48b8-9d17-d3844fafac5c-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.313126 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.313084 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/8622ab1c-5bef-48b8-9d17-d3844fafac5c-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.313126 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.313118 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/8622ab1c-5bef-48b8-9d17-d3844fafac5c-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.315076 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.315049 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/8622ab1c-5bef-48b8-9d17-d3844fafac5c-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.315346 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.315330 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8622ab1c-5bef-48b8-9d17-d3844fafac5c-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.321663 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.321640 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8622ab1c-5bef-48b8-9d17-d3844fafac5c-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.321842 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.321822 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm67n\" (UniqueName: \"kubernetes.io/projected/8622ab1c-5bef-48b8-9d17-d3844fafac5c-kube-api-access-jm67n\") pod \"maas-default-gateway-openshift-default-58b6f876-77bs2\" (UID: \"8622ab1c-5bef-48b8-9d17-d3844fafac5c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"
Apr 20 22:32:20.346908 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.346876 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2" Apr 20 22:32:20.477156 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.477129 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2"] Apr 20 22:32:20.479252 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:32:20.479216 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8622ab1c_5bef_48b8_9d17_d3844fafac5c.slice/crio-7bb08e19f0376f1fc04bb83c3efd68db7213b6aca47f656933760494b61dbcf9 WatchSource:0}: Error finding container 7bb08e19f0376f1fc04bb83c3efd68db7213b6aca47f656933760494b61dbcf9: Status 404 returned error can't find the container with id 7bb08e19f0376f1fc04bb83c3efd68db7213b6aca47f656933760494b61dbcf9 Apr 20 22:32:20.705141 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:20.705103 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2" event={"ID":"8622ab1c-5bef-48b8-9d17-d3844fafac5c","Type":"ContainerStarted","Data":"7bb08e19f0376f1fc04bb83c3efd68db7213b6aca47f656933760494b61dbcf9"} Apr 20 22:32:23.913117 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:23.913074 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 20 22:32:23.913445 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:23.913164 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 20 22:32:23.913445 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:23.913196 2575 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 20 22:32:24.721157 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:24.721122 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2" event={"ID":"8622ab1c-5bef-48b8-9d17-d3844fafac5c","Type":"ContainerStarted","Data":"8adf42511794f2e92da66551000917f1d021d7c702c12ddac1f3f07c56d80459"} Apr 20 22:32:24.742717 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:24.742648 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2" podStartSLOduration=1.311038521 podStartE2EDuration="4.74263258s" podCreationTimestamp="2026-04-20 22:32:20 +0000 UTC" firstStartedPulling="2026-04-20 22:32:20.481174629 +0000 UTC m=+470.095331170" lastFinishedPulling="2026-04-20 22:32:23.912768692 +0000 UTC m=+473.526925229" observedRunningTime="2026-04-20 22:32:24.740429217 +0000 UTC m=+474.354585776" watchObservedRunningTime="2026-04-20 22:32:24.74263258 +0000 UTC m=+474.356789263" Apr 20 22:32:25.347616 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.347573 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2" Apr 20 22:32:25.352652 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.352626 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2" Apr 20 22:32:25.537854 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.537817 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-xzpxs"] Apr 20 22:32:25.541189 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.541168 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-xzpxs" Apr 20 22:32:25.543758 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.543733 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 22:32:25.543758 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.543748 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 22:32:25.543928 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.543749 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-9r985\"" Apr 20 22:32:25.544017 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.544002 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 22:32:25.549183 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.549151 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-xzpxs"] Apr 20 22:32:25.579397 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.579359 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-xzpxs"] Apr 20 22:32:25.660085 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.660048 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c2a34109-db32-4387-a867-18047121c592-config-file\") pod \"limitador-limitador-78c99df468-xzpxs\" (UID: \"c2a34109-db32-4387-a867-18047121c592\") " pod="kuadrant-system/limitador-limitador-78c99df468-xzpxs" Apr 20 22:32:25.660265 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.660095 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m9bm\" (UniqueName: 
\"kubernetes.io/projected/c2a34109-db32-4387-a867-18047121c592-kube-api-access-6m9bm\") pod \"limitador-limitador-78c99df468-xzpxs\" (UID: \"c2a34109-db32-4387-a867-18047121c592\") " pod="kuadrant-system/limitador-limitador-78c99df468-xzpxs" Apr 20 22:32:25.724580 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.724553 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2" Apr 20 22:32:25.725651 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.725628 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-77bs2" Apr 20 22:32:25.761026 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.760994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6m9bm\" (UniqueName: \"kubernetes.io/projected/c2a34109-db32-4387-a867-18047121c592-kube-api-access-6m9bm\") pod \"limitador-limitador-78c99df468-xzpxs\" (UID: \"c2a34109-db32-4387-a867-18047121c592\") " pod="kuadrant-system/limitador-limitador-78c99df468-xzpxs" Apr 20 22:32:25.761196 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.761103 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c2a34109-db32-4387-a867-18047121c592-config-file\") pod \"limitador-limitador-78c99df468-xzpxs\" (UID: \"c2a34109-db32-4387-a867-18047121c592\") " pod="kuadrant-system/limitador-limitador-78c99df468-xzpxs" Apr 20 22:32:25.761715 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.761694 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c2a34109-db32-4387-a867-18047121c592-config-file\") pod \"limitador-limitador-78c99df468-xzpxs\" (UID: \"c2a34109-db32-4387-a867-18047121c592\") " pod="kuadrant-system/limitador-limitador-78c99df468-xzpxs" 
Apr 20 22:32:25.772426 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.772386 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m9bm\" (UniqueName: \"kubernetes.io/projected/c2a34109-db32-4387-a867-18047121c592-kube-api-access-6m9bm\") pod \"limitador-limitador-78c99df468-xzpxs\" (UID: \"c2a34109-db32-4387-a867-18047121c592\") " pod="kuadrant-system/limitador-limitador-78c99df468-xzpxs"
Apr 20 22:32:25.852471 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.852430 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-xzpxs"
Apr 20 22:32:25.979982 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:25.979953 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-xzpxs"]
Apr 20 22:32:25.982013 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:32:25.981973 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2a34109_db32_4387_a867_18047121c592.slice/crio-f802c1a2b5e0ebb81e03bdc2025f4bb91ba8c43a9b673a3a2a0b95d582922ae5 WatchSource:0}: Error finding container f802c1a2b5e0ebb81e03bdc2025f4bb91ba8c43a9b673a3a2a0b95d582922ae5: Status 404 returned error can't find the container with id f802c1a2b5e0ebb81e03bdc2025f4bb91ba8c43a9b673a3a2a0b95d582922ae5
Apr 20 22:32:26.729071 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:26.729033 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-xzpxs" event={"ID":"c2a34109-db32-4387-a867-18047121c592","Type":"ContainerStarted","Data":"f802c1a2b5e0ebb81e03bdc2025f4bb91ba8c43a9b673a3a2a0b95d582922ae5"}
Apr 20 22:32:28.738881 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:28.738777 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-xzpxs" event={"ID":"c2a34109-db32-4387-a867-18047121c592","Type":"ContainerStarted","Data":"399932d12ad014f3a568dffc1b113f2395e5dfb81c33b110f9d7e4874b80cb3d"}
Apr 20 22:32:28.738881 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:28.738854 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-xzpxs"
Apr 20 22:32:28.757556 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:28.757498 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-xzpxs" podStartSLOduration=1.263529932 podStartE2EDuration="3.757477219s" podCreationTimestamp="2026-04-20 22:32:25 +0000 UTC" firstStartedPulling="2026-04-20 22:32:25.983945345 +0000 UTC m=+475.598101896" lastFinishedPulling="2026-04-20 22:32:28.477892627 +0000 UTC m=+478.092049183" observedRunningTime="2026-04-20 22:32:28.75547978 +0000 UTC m=+478.369636340" watchObservedRunningTime="2026-04-20 22:32:28.757477219 +0000 UTC m=+478.371633778"
Apr 20 22:32:39.744482 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:39.744453 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-xzpxs"
Apr 20 22:32:54.321520 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:54.321473 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6"]
Apr 20 22:32:54.325070 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:54.325044 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6"
Apr 20 22:32:54.327891 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:54.327868 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 22:32:54.328025 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:54.327980 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 22:32:54.329030 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:54.329004 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-4psch\""
Apr 20 22:32:54.333449 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:54.333421 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6"]
Apr 20 22:32:54.371000 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:54.370956 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6\" (UID: \"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6"
Apr 20 22:32:54.371000 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:54.371004 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cndb\" (UniqueName: \"kubernetes.io/projected/b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2-kube-api-access-4cndb\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6\" (UID: \"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6"
Apr 20 22:32:54.371236 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:54.371043 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6\" (UID: \"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6"
Apr 20 22:32:54.472524 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:54.472476 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6\" (UID: \"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6"
Apr 20 22:32:54.472524 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:54.472528 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cndb\" (UniqueName: \"kubernetes.io/projected/b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2-kube-api-access-4cndb\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6\" (UID: \"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6"
Apr 20 22:32:54.472764 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:54.472566 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6\" (UID: \"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6"
Apr 20 22:32:54.472903 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:54.472883 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6\" (UID: \"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6"
Apr 20 22:32:54.472942 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:54.472910 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6\" (UID: \"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6"
Apr 20 22:32:54.481843 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:54.481814 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cndb\" (UniqueName: \"kubernetes.io/projected/b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2-kube-api-access-4cndb\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6\" (UID: \"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6"
Apr 20 22:32:54.635831 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:54.635796 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6"
Apr 20 22:32:54.771967 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:54.771940 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6"]
Apr 20 22:32:54.774089 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:32:54.774049 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb696b58d_2cb1_4dcb_aaa2_bb278be1ebb2.slice/crio-6fbb7fa556adb8e67a685616d272d829dbcb9de6dfbd6e54f3d60a66d5033a51 WatchSource:0}: Error finding container 6fbb7fa556adb8e67a685616d272d829dbcb9de6dfbd6e54f3d60a66d5033a51: Status 404 returned error can't find the container with id 6fbb7fa556adb8e67a685616d272d829dbcb9de6dfbd6e54f3d60a66d5033a51
Apr 20 22:32:54.833477 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:54.833447 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6" event={"ID":"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2","Type":"ContainerStarted","Data":"6fbb7fa556adb8e67a685616d272d829dbcb9de6dfbd6e54f3d60a66d5033a51"}
Apr 20 22:32:55.838127 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:55.838044 2575 generic.go:358] "Generic (PLEG): container finished" podID="b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2" containerID="8d81dff93b46d3d95446282d72f884f67bd2e74254ca77a880d9c8e28ee9ca42" exitCode=0
Apr 20 22:32:55.838477 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:55.838135 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6" event={"ID":"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2","Type":"ContainerDied","Data":"8d81dff93b46d3d95446282d72f884f67bd2e74254ca77a880d9c8e28ee9ca42"}
Apr 20 22:32:56.843917 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:56.843824 2575 generic.go:358] "Generic (PLEG): container finished" podID="b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2" containerID="ac39619d412a07d311c13ebe83c5c6476aea10d2d071e8679bdc62e9c89beab9" exitCode=0
Apr 20 22:32:56.843917 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:56.843897 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6" event={"ID":"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2","Type":"ContainerDied","Data":"ac39619d412a07d311c13ebe83c5c6476aea10d2d071e8679bdc62e9c89beab9"}
Apr 20 22:32:57.850083 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:57.850049 2575 generic.go:358] "Generic (PLEG): container finished" podID="b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2" containerID="ff66f5dcf70b92638fed85293b6252a08dd7f3c465e80938b0c83819dafc3695" exitCode=0
Apr 20 22:32:57.850503 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:57.850141 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6" event={"ID":"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2","Type":"ContainerDied","Data":"ff66f5dcf70b92638fed85293b6252a08dd7f3c465e80938b0c83819dafc3695"}
Apr 20 22:32:58.980262 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:58.980232 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6"
Apr 20 22:32:59.013960 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:59.013933 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2-bundle\") pod \"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2\" (UID: \"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2\") "
Apr 20 22:32:59.014148 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:59.013986 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2-util\") pod \"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2\" (UID: \"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2\") "
Apr 20 22:32:59.014148 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:59.014048 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cndb\" (UniqueName: \"kubernetes.io/projected/b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2-kube-api-access-4cndb\") pod \"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2\" (UID: \"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2\") "
Apr 20 22:32:59.014536 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:59.014489 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2-bundle" (OuterVolumeSpecName: "bundle") pod "b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2" (UID: "b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 22:32:59.016406 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:59.016373 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2-kube-api-access-4cndb" (OuterVolumeSpecName: "kube-api-access-4cndb") pod "b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2" (UID: "b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2"). InnerVolumeSpecName "kube-api-access-4cndb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 22:32:59.019743 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:59.019700 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2-util" (OuterVolumeSpecName: "util") pod "b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2" (UID: "b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 22:32:59.115026 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:59.114922 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2-bundle\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:32:59.115026 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:59.114968 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2-util\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:32:59.115026 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:59.114978 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4cndb\" (UniqueName: \"kubernetes.io/projected/b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2-kube-api-access-4cndb\") on node \"ip-10-0-132-177.ec2.internal\" DevicePath \"\""
Apr 20 22:32:59.859748 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:59.859715 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6"
Apr 20 22:32:59.859748 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:59.859715 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505nlf6" event={"ID":"b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2","Type":"ContainerDied","Data":"6fbb7fa556adb8e67a685616d272d829dbcb9de6dfbd6e54f3d60a66d5033a51"}
Apr 20 22:32:59.859954 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:32:59.859764 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fbb7fa556adb8e67a685616d272d829dbcb9de6dfbd6e54f3d60a66d5033a51"
Apr 20 22:33:57.097841 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:33:57.097805 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-xzpxs"]
Apr 20 22:34:23.536586 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:23.536552 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-cbn82_b1739d41-72c6-4c98-9382-37a93d872743/manager/2.log"
Apr 20 22:34:23.654758 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:23.654721 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5d8d569d47-4c2vf_8fe52c31-d1e3-4af9-92c0-4bae8f481ac3/manager/0.log"
Apr 20 22:34:24.778301 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:24.778269 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd_1f3603e9-51e8-497a-8783-772b2cce919c/util/0.log"
Apr 20 22:34:24.784312 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:24.784282 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd_1f3603e9-51e8-497a-8783-772b2cce919c/pull/0.log"
Apr 20 22:34:24.789808 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:24.789775 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd_1f3603e9-51e8-497a-8783-772b2cce919c/extract/0.log"
Apr 20 22:34:24.899952 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:24.899914 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv_366a9476-8414-4b17-b0c7-5f695a36551e/util/0.log"
Apr 20 22:34:24.906816 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:24.906789 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv_366a9476-8414-4b17-b0c7-5f695a36551e/pull/0.log"
Apr 20 22:34:24.913496 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:24.913479 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv_366a9476-8414-4b17-b0c7-5f695a36551e/extract/0.log"
Apr 20 22:34:25.033863 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:25.033774 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs_9b3f5f16-16a3-4497-83c3-4e0bc9f5907e/util/0.log"
Apr 20 22:34:25.040688 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:25.040652 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs_9b3f5f16-16a3-4497-83c3-4e0bc9f5907e/pull/0.log"
Apr 20 22:34:25.047322 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:25.047295 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs_9b3f5f16-16a3-4497-83c3-4e0bc9f5907e/extract/0.log"
Apr 20 22:34:25.154409 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:25.154372 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj_f8401295-6525-4913-9416-3434209df386/util/0.log"
Apr 20 22:34:25.164578 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:25.164550 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj_f8401295-6525-4913-9416-3434209df386/pull/0.log"
Apr 20 22:34:25.171023 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:25.170995 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj_f8401295-6525-4913-9416-3434209df386/extract/0.log"
Apr 20 22:34:25.966645 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:25.966615 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-xzpxs_c2a34109-db32-4387-a867-18047121c592/limitador/0.log"
Apr 20 22:34:26.572490 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:26.572450 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-d5hdk_39218be9-64cc-4a56-b915-5c794ceaace0/discovery/0.log"
Apr 20 22:34:26.904979 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:26.904952 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-77bs2_8622ab1c-5bef-48b8-9d17-d3844fafac5c/istio-proxy/0.log"
Apr 20 22:34:30.917730 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:30.917702 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rp7bw_2c237e12-2748-4be2-8f88-258e6064ea33/ovn-acl-logging/0.log"
Apr 20 22:34:30.919457 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:30.919435 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rp7bw_2c237e12-2748-4be2-8f88-258e6064ea33/ovn-acl-logging/0.log"
Apr 20 22:34:31.182374 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.182298 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5wqp7/must-gather-mv6nx"]
Apr 20 22:34:31.184425 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.182691 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2" containerName="pull"
Apr 20 22:34:31.184425 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.182708 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2" containerName="pull"
Apr 20 22:34:31.184425 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.182716 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2" containerName="extract"
Apr 20 22:34:31.184425 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.182722 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2" containerName="extract"
Apr 20 22:34:31.184425 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.182732 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2" containerName="util"
Apr 20 22:34:31.184425 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.182741 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2" containerName="util"
Apr 20 22:34:31.184425 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.182791 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b696b58d-2cb1-4dcb-aaa2-bb278be1ebb2" containerName="extract"
Apr 20 22:34:31.185487 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.185468 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5wqp7/must-gather-mv6nx"
Apr 20 22:34:31.188345 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.188319 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5wqp7\"/\"kube-root-ca.crt\""
Apr 20 22:34:31.188461 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.188357 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5wqp7\"/\"default-dockercfg-4qwxg\""
Apr 20 22:34:31.188461 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.188377 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5wqp7\"/\"openshift-service-ca.crt\""
Apr 20 22:34:31.195810 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.195788 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5wqp7/must-gather-mv6nx"]
Apr 20 22:34:31.203737 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.203711 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a0154526-27bf-4128-9fb8-725bc51356f1-must-gather-output\") pod \"must-gather-mv6nx\" (UID: \"a0154526-27bf-4128-9fb8-725bc51356f1\") " pod="openshift-must-gather-5wqp7/must-gather-mv6nx"
Apr 20 22:34:31.203859 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.203837 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6l8q\" (UniqueName: \"kubernetes.io/projected/a0154526-27bf-4128-9fb8-725bc51356f1-kube-api-access-b6l8q\") pod \"must-gather-mv6nx\" (UID: \"a0154526-27bf-4128-9fb8-725bc51356f1\") "
pod="openshift-must-gather-5wqp7/must-gather-mv6nx" Apr 20 22:34:31.304245 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.304212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6l8q\" (UniqueName: \"kubernetes.io/projected/a0154526-27bf-4128-9fb8-725bc51356f1-kube-api-access-b6l8q\") pod \"must-gather-mv6nx\" (UID: \"a0154526-27bf-4128-9fb8-725bc51356f1\") " pod="openshift-must-gather-5wqp7/must-gather-mv6nx" Apr 20 22:34:31.304414 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.304259 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a0154526-27bf-4128-9fb8-725bc51356f1-must-gather-output\") pod \"must-gather-mv6nx\" (UID: \"a0154526-27bf-4128-9fb8-725bc51356f1\") " pod="openshift-must-gather-5wqp7/must-gather-mv6nx" Apr 20 22:34:31.304536 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.304522 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a0154526-27bf-4128-9fb8-725bc51356f1-must-gather-output\") pod \"must-gather-mv6nx\" (UID: \"a0154526-27bf-4128-9fb8-725bc51356f1\") " pod="openshift-must-gather-5wqp7/must-gather-mv6nx" Apr 20 22:34:31.313757 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.313733 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6l8q\" (UniqueName: \"kubernetes.io/projected/a0154526-27bf-4128-9fb8-725bc51356f1-kube-api-access-b6l8q\") pod \"must-gather-mv6nx\" (UID: \"a0154526-27bf-4128-9fb8-725bc51356f1\") " pod="openshift-must-gather-5wqp7/must-gather-mv6nx" Apr 20 22:34:31.495861 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.495759 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5wqp7/must-gather-mv6nx" Apr 20 22:34:31.647202 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:31.647177 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5wqp7/must-gather-mv6nx"] Apr 20 22:34:31.652058 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:34:31.652027 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0154526_27bf_4128_9fb8_725bc51356f1.slice/crio-7af1555b43d309cc10b74cc20b3aece45da771458f6d0d8fdc09430488a046db WatchSource:0}: Error finding container 7af1555b43d309cc10b74cc20b3aece45da771458f6d0d8fdc09430488a046db: Status 404 returned error can't find the container with id 7af1555b43d309cc10b74cc20b3aece45da771458f6d0d8fdc09430488a046db Apr 20 22:34:32.192365 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:32.192332 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wqp7/must-gather-mv6nx" event={"ID":"a0154526-27bf-4128-9fb8-725bc51356f1","Type":"ContainerStarted","Data":"7af1555b43d309cc10b74cc20b3aece45da771458f6d0d8fdc09430488a046db"} Apr 20 22:34:33.199049 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:33.198217 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wqp7/must-gather-mv6nx" event={"ID":"a0154526-27bf-4128-9fb8-725bc51356f1","Type":"ContainerStarted","Data":"f17201396cc3c849a8d3151873217b5be139e82ed973a123fbc65016896bea1b"} Apr 20 22:34:33.199049 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:33.198263 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wqp7/must-gather-mv6nx" event={"ID":"a0154526-27bf-4128-9fb8-725bc51356f1","Type":"ContainerStarted","Data":"1a4a539956504371cc487cc0c0aee2a1022c62edec9a73b0a2ef6295782706ac"} Apr 20 22:34:33.217130 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:33.217067 2575 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-5wqp7/must-gather-mv6nx" podStartSLOduration=1.3578423960000001 podStartE2EDuration="2.217048727s" podCreationTimestamp="2026-04-20 22:34:31 +0000 UTC" firstStartedPulling="2026-04-20 22:34:31.653545417 +0000 UTC m=+601.267701961" lastFinishedPulling="2026-04-20 22:34:32.512751748 +0000 UTC m=+602.126908292" observedRunningTime="2026-04-20 22:34:33.215022948 +0000 UTC m=+602.829179512" watchObservedRunningTime="2026-04-20 22:34:33.217048727 +0000 UTC m=+602.831205288" Apr 20 22:34:34.164013 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:34.163977 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-nh9q7_72dc52a4-8f63-4b8f-be9f-1e2b2cf7ab33/global-pull-secret-syncer/0.log" Apr 20 22:34:34.268626 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:34.268584 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-wglz9_ba8a927d-42db-4d3f-b6d1-938655219360/konnectivity-agent/0.log" Apr 20 22:34:34.307620 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:34.307588 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-177.ec2.internal_27bb6de254cf19a2989bb62d9580d525/haproxy/0.log" Apr 20 22:34:38.276857 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:38.276822 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd_1f3603e9-51e8-497a-8783-772b2cce919c/extract/0.log" Apr 20 22:34:38.301440 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:38.301405 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd_1f3603e9-51e8-497a-8783-772b2cce919c/util/0.log" Apr 20 22:34:38.327013 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:38.326977 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759x6hfd_1f3603e9-51e8-497a-8783-772b2cce919c/pull/0.log" Apr 20 22:34:38.354527 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:38.354486 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv_366a9476-8414-4b17-b0c7-5f695a36551e/extract/0.log" Apr 20 22:34:38.378636 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:38.378605 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv_366a9476-8414-4b17-b0c7-5f695a36551e/util/0.log" Apr 20 22:34:38.401323 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:38.401294 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08ckqv_366a9476-8414-4b17-b0c7-5f695a36551e/pull/0.log" Apr 20 22:34:38.429216 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:38.429183 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs_9b3f5f16-16a3-4497-83c3-4e0bc9f5907e/extract/0.log" Apr 20 22:34:38.453323 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:38.453290 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs_9b3f5f16-16a3-4497-83c3-4e0bc9f5907e/util/0.log" Apr 20 22:34:38.476515 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:38.476483 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736mvhs_9b3f5f16-16a3-4497-83c3-4e0bc9f5907e/pull/0.log" Apr 20 22:34:38.510688 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:38.510637 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj_f8401295-6525-4913-9416-3434209df386/extract/0.log" Apr 20 22:34:38.535343 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:38.535249 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj_f8401295-6525-4913-9416-3434209df386/util/0.log" Apr 20 22:34:38.557366 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:38.557328 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19l4gj_f8401295-6525-4913-9416-3434209df386/pull/0.log" Apr 20 22:34:38.746293 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:38.746262 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-xzpxs_c2a34109-db32-4387-a867-18047121c592/limitador/0.log" Apr 20 22:34:40.148663 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:40.148618 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_54ee3a71-33df-46b8-9cf2-3ba929fdd80b/alertmanager/0.log" Apr 20 22:34:40.173357 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:40.173324 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_54ee3a71-33df-46b8-9cf2-3ba929fdd80b/config-reloader/0.log" Apr 20 22:34:40.203371 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:40.203344 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_54ee3a71-33df-46b8-9cf2-3ba929fdd80b/kube-rbac-proxy-web/0.log" Apr 20 22:34:40.230057 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:40.230029 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_54ee3a71-33df-46b8-9cf2-3ba929fdd80b/kube-rbac-proxy/0.log" Apr 20 22:34:40.257797 ip-10-0-132-177 
kubenswrapper[2575]: I0420 22:34:40.257747 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_54ee3a71-33df-46b8-9cf2-3ba929fdd80b/kube-rbac-proxy-metric/0.log" Apr 20 22:34:40.291517 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:40.291471 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_54ee3a71-33df-46b8-9cf2-3ba929fdd80b/prom-label-proxy/0.log" Apr 20 22:34:40.324809 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:40.324727 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_54ee3a71-33df-46b8-9cf2-3ba929fdd80b/init-config-reloader/0.log" Apr 20 22:34:40.697594 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:40.697560 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xhqjt_eaddf140-0247-4d4a-8283-7ad9403b4507/node-exporter/0.log" Apr 20 22:34:40.722396 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:40.722318 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xhqjt_eaddf140-0247-4d4a-8283-7ad9403b4507/kube-rbac-proxy/0.log" Apr 20 22:34:40.748797 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:40.748764 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xhqjt_eaddf140-0247-4d4a-8283-7ad9403b4507/init-textfile/0.log" Apr 20 22:34:41.088555 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:41.088525 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-85b7f58c6c-p5f2f_8b85a832-b4e7-438a-bc62-1d0c115d6467/telemeter-client/0.log" Apr 20 22:34:41.111440 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:41.111408 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-85b7f58c6c-p5f2f_8b85a832-b4e7-438a-bc62-1d0c115d6467/reload/0.log" Apr 20 22:34:41.136967 
ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:41.136934 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-85b7f58c6c-p5f2f_8b85a832-b4e7-438a-bc62-1d0c115d6467/kube-rbac-proxy/0.log" Apr 20 22:34:42.765797 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:42.765755 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7"] Apr 20 22:34:42.771550 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:42.771517 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 22:34:42.776893 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:42.776864 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7"] Apr 20 22:34:42.810376 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:42.810334 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5e8d04d-5ead-415c-8742-1656aa2eb6db-sys\") pod \"perf-node-gather-daemonset-t8gh7\" (UID: \"b5e8d04d-5ead-415c-8742-1656aa2eb6db\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 22:34:42.810711 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:42.810663 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b5e8d04d-5ead-415c-8742-1656aa2eb6db-proc\") pod \"perf-node-gather-daemonset-t8gh7\" (UID: \"b5e8d04d-5ead-415c-8742-1656aa2eb6db\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 22:34:42.814713 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:42.811613 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/b5e8d04d-5ead-415c-8742-1656aa2eb6db-lib-modules\") pod \"perf-node-gather-daemonset-t8gh7\" (UID: \"b5e8d04d-5ead-415c-8742-1656aa2eb6db\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 22:34:42.814713 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:42.811660 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b5e8d04d-5ead-415c-8742-1656aa2eb6db-podres\") pod \"perf-node-gather-daemonset-t8gh7\" (UID: \"b5e8d04d-5ead-415c-8742-1656aa2eb6db\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 22:34:42.814713 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:42.811728 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gzzt\" (UniqueName: \"kubernetes.io/projected/b5e8d04d-5ead-415c-8742-1656aa2eb6db-kube-api-access-7gzzt\") pod \"perf-node-gather-daemonset-t8gh7\" (UID: \"b5e8d04d-5ead-415c-8742-1656aa2eb6db\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 22:34:42.912576 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:42.912535 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b5e8d04d-5ead-415c-8742-1656aa2eb6db-lib-modules\") pod \"perf-node-gather-daemonset-t8gh7\" (UID: \"b5e8d04d-5ead-415c-8742-1656aa2eb6db\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 22:34:42.912864 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:42.912844 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b5e8d04d-5ead-415c-8742-1656aa2eb6db-podres\") pod \"perf-node-gather-daemonset-t8gh7\" (UID: \"b5e8d04d-5ead-415c-8742-1656aa2eb6db\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 
22:34:42.913013 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:42.912996 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gzzt\" (UniqueName: \"kubernetes.io/projected/b5e8d04d-5ead-415c-8742-1656aa2eb6db-kube-api-access-7gzzt\") pod \"perf-node-gather-daemonset-t8gh7\" (UID: \"b5e8d04d-5ead-415c-8742-1656aa2eb6db\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 22:34:42.913161 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:42.913144 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5e8d04d-5ead-415c-8742-1656aa2eb6db-sys\") pod \"perf-node-gather-daemonset-t8gh7\" (UID: \"b5e8d04d-5ead-415c-8742-1656aa2eb6db\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 22:34:42.913315 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:42.913299 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b5e8d04d-5ead-415c-8742-1656aa2eb6db-proc\") pod \"perf-node-gather-daemonset-t8gh7\" (UID: \"b5e8d04d-5ead-415c-8742-1656aa2eb6db\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 22:34:42.913532 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:42.913518 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b5e8d04d-5ead-415c-8742-1656aa2eb6db-proc\") pod \"perf-node-gather-daemonset-t8gh7\" (UID: \"b5e8d04d-5ead-415c-8742-1656aa2eb6db\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 22:34:42.913765 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:42.913749 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b5e8d04d-5ead-415c-8742-1656aa2eb6db-lib-modules\") pod \"perf-node-gather-daemonset-t8gh7\" (UID: 
\"b5e8d04d-5ead-415c-8742-1656aa2eb6db\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 22:34:42.913933 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:42.913920 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b5e8d04d-5ead-415c-8742-1656aa2eb6db-podres\") pod \"perf-node-gather-daemonset-t8gh7\" (UID: \"b5e8d04d-5ead-415c-8742-1656aa2eb6db\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 22:34:42.914328 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:42.914302 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5e8d04d-5ead-415c-8742-1656aa2eb6db-sys\") pod \"perf-node-gather-daemonset-t8gh7\" (UID: \"b5e8d04d-5ead-415c-8742-1656aa2eb6db\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 22:34:42.923576 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:42.923542 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gzzt\" (UniqueName: \"kubernetes.io/projected/b5e8d04d-5ead-415c-8742-1656aa2eb6db-kube-api-access-7gzzt\") pod \"perf-node-gather-daemonset-t8gh7\" (UID: \"b5e8d04d-5ead-415c-8742-1656aa2eb6db\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 22:34:43.085732 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:43.085620 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 22:34:43.241307 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:43.241270 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7"] Apr 20 22:34:43.244539 ip-10-0-132-177 kubenswrapper[2575]: W0420 22:34:43.244501 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb5e8d04d_5ead_415c_8742_1656aa2eb6db.slice/crio-7ecb50458d7d6e87cbd4e93e09093871a926942e3c4df2bf875cd189c1a14ea6 WatchSource:0}: Error finding container 7ecb50458d7d6e87cbd4e93e09093871a926942e3c4df2bf875cd189c1a14ea6: Status 404 returned error can't find the container with id 7ecb50458d7d6e87cbd4e93e09093871a926942e3c4df2bf875cd189c1a14ea6 Apr 20 22:34:43.251605 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:43.251523 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" event={"ID":"b5e8d04d-5ead-415c-8742-1656aa2eb6db","Type":"ContainerStarted","Data":"7ecb50458d7d6e87cbd4e93e09093871a926942e3c4df2bf875cd189c1a14ea6"} Apr 20 22:34:44.257120 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:44.257089 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" event={"ID":"b5e8d04d-5ead-415c-8742-1656aa2eb6db","Type":"ContainerStarted","Data":"f80f5c1bd2b86007faec3a8d252abc2ea046bb98d83b0ed50825f2ee70ad394d"} Apr 20 22:34:44.257552 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:44.257214 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 22:34:44.274534 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:44.274482 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" 
podStartSLOduration=2.274464925 podStartE2EDuration="2.274464925s" podCreationTimestamp="2026-04-20 22:34:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:34:44.272813798 +0000 UTC m=+613.886970358" watchObservedRunningTime="2026-04-20 22:34:44.274464925 +0000 UTC m=+613.888621480" Apr 20 22:34:44.763535 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:44.763508 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7bc89_238c0ea5-4742-4d5c-b685-8c4aab704f3c/dns/0.log" Apr 20 22:34:44.784917 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:44.784886 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7bc89_238c0ea5-4742-4d5c-b685-8c4aab704f3c/kube-rbac-proxy/0.log" Apr 20 22:34:44.894928 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:44.894893 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sjwlz_c9389f21-c437-4990-a923-b0ff03e3ba21/dns-node-resolver/0.log" Apr 20 22:34:45.428956 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:45.428924 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-598d4bbdbc-hc4q7_eadeda0e-6eb7-49ca-aa1c-dd1002554f51/registry/0.log" Apr 20 22:34:45.491123 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:45.491094 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zf7wp_456ba91d-0822-42ce-a041-f73b13a803c5/node-ca/0.log" Apr 20 22:34:46.365862 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:46.365826 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-d5hdk_39218be9-64cc-4a56-b915-5c794ceaace0/discovery/0.log" Apr 20 22:34:46.438056 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:46.438023 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-77bs2_8622ab1c-5bef-48b8-9d17-d3844fafac5c/istio-proxy/0.log" Apr 20 22:34:46.981896 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:46.981852 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-87rj9_c3473a30-a4b9-4d21-9b2f-83594665ed99/serve-healthcheck-canary/0.log" Apr 20 22:34:47.617979 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:47.617953 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lbljv_814cbee0-89a6-4755-8d2c-bb2ca9cb16d0/kube-rbac-proxy/0.log" Apr 20 22:34:47.639008 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:47.638980 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lbljv_814cbee0-89a6-4755-8d2c-bb2ca9cb16d0/exporter/0.log" Apr 20 22:34:47.661999 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:47.661971 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lbljv_814cbee0-89a6-4755-8d2c-bb2ca9cb16d0/extractor/0.log" Apr 20 22:34:49.574610 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:49.574559 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-cbn82_b1739d41-72c6-4c98-9382-37a93d872743/manager/1.log" Apr 20 22:34:49.584350 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:49.584318 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-cbn82_b1739d41-72c6-4c98-9382-37a93d872743/manager/2.log" Apr 20 22:34:49.608167 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:49.608135 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5d8d569d47-4c2vf_8fe52c31-d1e3-4af9-92c0-4bae8f481ac3/manager/0.log" Apr 20 22:34:50.271653 ip-10-0-132-177 kubenswrapper[2575]: I0420 
22:34:50.271627 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-t8gh7" Apr 20 22:34:50.915230 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:50.915205 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-845776cd66-9p9v5_4ffb2102-620d-4c58-87e1-76e2bc0cb75b/manager/0.log" Apr 20 22:34:56.866339 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:56.866289 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5m9gf_f0c17cb1-e694-4fe6-8bfb-113e266578ab/kube-multus/0.log" Apr 20 22:34:57.046445 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:57.046409 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92xbc_3e440b6a-d5a8-43fe-af3d-a999f8dce281/kube-multus-additional-cni-plugins/0.log" Apr 20 22:34:57.067833 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:57.067798 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92xbc_3e440b6a-d5a8-43fe-af3d-a999f8dce281/egress-router-binary-copy/0.log" Apr 20 22:34:57.089551 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:57.089523 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92xbc_3e440b6a-d5a8-43fe-af3d-a999f8dce281/cni-plugins/0.log" Apr 20 22:34:57.113336 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:57.113316 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92xbc_3e440b6a-d5a8-43fe-af3d-a999f8dce281/bond-cni-plugin/0.log" Apr 20 22:34:57.134195 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:57.134138 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92xbc_3e440b6a-d5a8-43fe-af3d-a999f8dce281/routeoverride-cni/0.log" Apr 20 22:34:57.156558 
ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:57.156534 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92xbc_3e440b6a-d5a8-43fe-af3d-a999f8dce281/whereabouts-cni-bincopy/0.log" Apr 20 22:34:57.178716 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:57.178692 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92xbc_3e440b6a-d5a8-43fe-af3d-a999f8dce281/whereabouts-cni/0.log" Apr 20 22:34:57.463279 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:57.463201 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qg2mj_5add223c-497e-4cc3-863e-339b6f999506/network-metrics-daemon/0.log" Apr 20 22:34:57.482488 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:57.482443 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qg2mj_5add223c-497e-4cc3-863e-339b6f999506/kube-rbac-proxy/0.log" Apr 20 22:34:59.050589 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:59.050547 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rp7bw_2c237e12-2748-4be2-8f88-258e6064ea33/ovn-controller/0.log" Apr 20 22:34:59.067349 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:59.067318 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rp7bw_2c237e12-2748-4be2-8f88-258e6064ea33/ovn-acl-logging/0.log" Apr 20 22:34:59.072898 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:59.072866 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rp7bw_2c237e12-2748-4be2-8f88-258e6064ea33/ovn-acl-logging/1.log" Apr 20 22:34:59.094662 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:59.094627 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rp7bw_2c237e12-2748-4be2-8f88-258e6064ea33/kube-rbac-proxy-node/0.log" Apr 20 22:34:59.116684 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:59.116637 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rp7bw_2c237e12-2748-4be2-8f88-258e6064ea33/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 22:34:59.137574 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:59.137544 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rp7bw_2c237e12-2748-4be2-8f88-258e6064ea33/northd/0.log" Apr 20 22:34:59.163201 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:59.163171 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rp7bw_2c237e12-2748-4be2-8f88-258e6064ea33/nbdb/0.log" Apr 20 22:34:59.184251 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:59.184215 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rp7bw_2c237e12-2748-4be2-8f88-258e6064ea33/sbdb/0.log" Apr 20 22:34:59.394243 ip-10-0-132-177 kubenswrapper[2575]: I0420 22:34:59.394205 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rp7bw_2c237e12-2748-4be2-8f88-258e6064ea33/ovnkube-controller/0.log"