Apr 21 02:41:07.108789 ip-10-0-131-170 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 02:41:07.574938 ip-10-0-131-170 kubenswrapper[2564]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 02:41:07.574938 ip-10-0-131-170 kubenswrapper[2564]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 02:41:07.574938 ip-10-0-131-170 kubenswrapper[2564]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 02:41:07.574938 ip-10-0-131-170 kubenswrapper[2564]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 02:41:07.574938 ip-10-0-131-170 kubenswrapper[2564]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 02:41:07.576168 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.576096    2564 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 02:41:07.580396 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580381    2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 02:41:07.580396 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580395    2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580400    2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580403    2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580407    2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580410    2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580413    2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580416    2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580419    2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580429    2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580432    2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580435    2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580438    2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580441    2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580444    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580447    2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580450    2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580452    2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580455    2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580458    2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 02:41:07.580471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580461    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580464    2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580466    2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580470    2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580472    2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580475    2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580478    2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580481    2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580484    2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580486    2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580511    2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580514    2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580517    2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580520    2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580522    2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580530    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580533    2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580535    2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580538    2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580540    2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 02:41:07.580949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580543    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580545    2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580548    2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580550    2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580553    2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580557    2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580560    2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580563    2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580565    2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580568    2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580572    2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580575    2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580578    2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580582    2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580585    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580588    2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580590    2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580593    2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580595    2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 02:41:07.581433 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580598    2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580601    2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580603    2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580606    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580616    2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580619    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580621    2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580623    2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580626    2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580628    2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580631    2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580633    2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580636    2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580638    2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580641    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580644    2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580647    2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580650    2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580652    2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580655    2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 02:41:07.581910 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580657    2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580660    2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580662    2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580665    2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580667    2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580670    2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.580673    2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581094    2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581099    2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581104    2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581108    2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581111    2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581114    2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581117    2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581119    2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581122    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581131    2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581134    2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581136    2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 02:41:07.582411 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581139    2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581142    2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581144    2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581147    2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581150    2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581152    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581155    2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581157    2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581160    2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581162    2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581165    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581168    2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581171    2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581174    2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581176    2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581179    2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581182    2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581184    2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581187    2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581190    2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 02:41:07.582921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581192    2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581195    2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581197    2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581200    2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581202    2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581205    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581208    2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581210    2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581212    2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581215    2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581224    2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581227    2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581229    2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581232    2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581234    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581237    2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581239    2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581242    2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581244    2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581247    2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581249    2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 02:41:07.583513 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581252    2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581255    2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581258    2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581260    2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581263    2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581265    2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581268    2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581270    2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581273    2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581275    2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581278    2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581280    2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581283    2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581285    2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581288    2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581290    2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581293    2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581295    2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581298    2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 02:41:07.584045 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581301    2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581303    2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581305    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581314    2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581316    2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581319    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581322    2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581325    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581328    2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581330    2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581332    2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581335    2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581337    2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.581340    2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582043    2564 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582057    2564 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582067    2564 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582072    2564 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582076    2564 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582080    2564 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582084    2564 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 02:41:07.584547 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582089    2564 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582092    2564 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582095    2564 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582098    2564 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582101    2564 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582104    2564 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582107    2564 flags.go:64] FLAG: --cgroup-root=""
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582110    2564 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582113    2564 flags.go:64] FLAG: --client-ca-file=""
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582116    2564 flags.go:64] FLAG: --cloud-config=""
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582119    2564 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582122    2564 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582128    2564 flags.go:64] FLAG: --cluster-domain=""
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582131    2564 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582134    2564 flags.go:64] FLAG: --config-dir=""
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582142    2564 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582146    2564 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582150    2564 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582153    2564 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582156    2564 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582159    2564 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582162    2564 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582165    2564 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582168    2564 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582171    2564 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 02:41:07.585069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582174    2564 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582179    2564 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582182    2564 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582184    2564 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582188    2564 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582191    2564 flags.go:64] FLAG: --enable-server="true"
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582194    2564 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582201    2564 flags.go:64] FLAG: --event-burst="100"
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582204    2564 flags.go:64] FLAG: --event-qps="50"
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582207    2564 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582210    2564 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582213    2564 flags.go:64] FLAG: --eviction-hard=""
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582216    2564 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582219    2564 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582222    2564 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582225    2564 flags.go:64] FLAG: --eviction-soft=""
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582228    2564 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582230    2564 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582233    2564 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582236    2564 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582239    2564 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582242    2564 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582244    2564 flags.go:64] FLAG: --feature-gates=""
Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582257    2564 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 02:41:07.585719 ip-10-0-131-170
kubenswrapper[2564]: I0421 02:41:07.582260 2564 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 02:41:07.585719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582263 2564 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582266 2564 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582269 2564 flags.go:64] FLAG: --healthz-port="10248" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582273 2564 flags.go:64] FLAG: --help="false" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582275 2564 flags.go:64] FLAG: --hostname-override="ip-10-0-131-170.ec2.internal" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582279 2564 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582282 2564 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582284 2564 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582288 2564 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582292 2564 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582295 2564 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582298 2564 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582300 2564 flags.go:64] FLAG: 
--kernel-memcg-notification="false" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582303 2564 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582306 2564 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582309 2564 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582312 2564 flags.go:64] FLAG: --kube-reserved="" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582315 2564 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582317 2564 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582320 2564 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582323 2564 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582326 2564 flags.go:64] FLAG: --lock-file="" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582328 2564 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582331 2564 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 02:41:07.586322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582334 2564 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582339 2564 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582342 2564 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 02:41:07.586919 ip-10-0-131-170 
kubenswrapper[2564]: I0421 02:41:07.582345 2564 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582347 2564 flags.go:64] FLAG: --logging-format="text" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582350 2564 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582353 2564 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582362 2564 flags.go:64] FLAG: --manifest-url="" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582365 2564 flags.go:64] FLAG: --manifest-url-header="" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582369 2564 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582372 2564 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582376 2564 flags.go:64] FLAG: --max-pods="110" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582380 2564 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582383 2564 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582386 2564 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582388 2564 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582393 2564 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582396 2564 flags.go:64] 
FLAG: --node-ip="0.0.0.0" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582399 2564 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582406 2564 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582409 2564 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582412 2564 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582415 2564 flags.go:64] FLAG: --pod-cidr="" Apr 21 02:41:07.586919 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582418 2564 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582425 2564 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582428 2564 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582431 2564 flags.go:64] FLAG: --pods-per-core="0" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582434 2564 flags.go:64] FLAG: --port="10250" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582437 2564 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582440 2564 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b795f96425685e3b" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582443 2564 flags.go:64] FLAG: --qos-reserved="" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582446 
2564 flags.go:64] FLAG: --read-only-port="10255" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582449 2564 flags.go:64] FLAG: --register-node="true" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582451 2564 flags.go:64] FLAG: --register-schedulable="true" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582454 2564 flags.go:64] FLAG: --register-with-taints="" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582458 2564 flags.go:64] FLAG: --registry-burst="10" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582461 2564 flags.go:64] FLAG: --registry-qps="5" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582464 2564 flags.go:64] FLAG: --reserved-cpus="" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582467 2564 flags.go:64] FLAG: --reserved-memory="" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582471 2564 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582474 2564 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582477 2564 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582480 2564 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582483 2564 flags.go:64] FLAG: --runonce="false" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582486 2564 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582489 2564 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 
02:41:07.582492 2564 flags.go:64] FLAG: --seccomp-default="false" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582505 2564 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582509 2564 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 02:41:07.587481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582513 2564 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582515 2564 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582518 2564 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582521 2564 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582524 2564 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582527 2564 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582530 2564 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582533 2564 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582536 2564 flags.go:64] FLAG: --system-cgroups="" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582539 2564 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582544 2564 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582547 2564 flags.go:64] FLAG: 
--tls-cert-file="" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582550 2564 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582556 2564 flags.go:64] FLAG: --tls-min-version="" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582559 2564 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582562 2564 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582565 2564 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582568 2564 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582571 2564 flags.go:64] FLAG: --v="2" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582575 2564 flags.go:64] FLAG: --version="false" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582579 2564 flags.go:64] FLAG: --vmodule="" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582583 2564 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.582586 2564 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582714 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582719 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 02:41:07.588120 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582723 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 02:41:07.588767 ip-10-0-131-170 
kubenswrapper[2564]: W0421 02:41:07.582725 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582731 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582734 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582737 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582740 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582744 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582747 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582750 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582752 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582755 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582758 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582760 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582763 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager 
Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582766 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582768 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582771 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582774 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582777 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582781 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 02:41:07.588767 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582784 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 02:41:07.589627 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582787 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 02:41:07.589627 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582789 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 02:41:07.589627 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582792 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 02:41:07.589627 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582794 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 02:41:07.589627 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582797 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 02:41:07.589627 ip-10-0-131-170 
kubenswrapper[2564]: W0421 02:41:07.582800 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 02:41:07.589627 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582802 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 02:41:07.589627 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582806 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 02:41:07.589627 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582808 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 02:41:07.589627 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582811 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 02:41:07.589627 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582813 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 02:41:07.589627 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582817 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 02:41:07.589627 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582819 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 02:41:07.589627 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582823 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 21 02:41:07.589627 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582826 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 02:41:07.589627 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582829 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 02:41:07.589627 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582831 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 02:41:07.589627 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582835 2564 feature_gate.go:328] 
unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 02:41:07.589627 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582838 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582841 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582843 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582846 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582848 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582851 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582853 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582856 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582859 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582861 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582864 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582866 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 02:41:07.590206 ip-10-0-131-170 
kubenswrapper[2564]: W0421 02:41:07.582869 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582872 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582874 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582877 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582879 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582882 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582884 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582887 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 02:41:07.590206 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582890 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582892 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582895 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582897 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582900 2564 feature_gate.go:328] unrecognized feature gate: 
VSphereMultiDisk Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582902 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582905 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582909 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582911 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582914 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582916 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582920 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582923 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582925 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582928 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582931 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582934 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582936 2564 feature_gate.go:328] 
unrecognized feature gate: NoRegistryClusterOperations Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582939 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582942 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 02:41:07.590734 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582944 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 02:41:07.591246 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582947 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 02:41:07.591246 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582949 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 02:41:07.591246 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582953 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
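The long run of `feature_gate.go:328` warnings above comes from OpenShift-specific gate names being handed to the upstream kubelet's gate parser, which does not know them; the same names recur in later passes, so a tally is more readable than the raw stream. A minimal sketch for summarizing such warnings from a saved copy of this journal (the sample lines below are abbreviated copies of records above; reading from a file is left to the caller):

```python
import re
from collections import Counter

# One warning is logged per gate the kubelet does not recognize; count
# occurrences of each gate name across the captured log text.
GATE_RE = re.compile(r'unrecognized feature gate: ([A-Za-z0-9]+)')

def count_unrecognized_gates(log_text: str) -> Counter:
    """Tally 'unrecognized feature gate' warnings by gate name."""
    return Counter(GATE_RE.findall(log_text))

sample = (
    'W0421 02:41:07.582838 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud\n'
    'W0421 02:41:07.582841 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation\n'
    'W0421 02:41:07.582806 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation\n'
)
print(count_unrecognized_gates(sample))
```

A tally like this makes it easy to confirm that each warning pass repeats the same gate set rather than introducing new ones.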
Apr 21 02:41:07.591246 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.582957 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 02:41:07.591246 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.583671 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 02:41:07.592207 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.592189 2564 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 02:41:07.592241 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.592209 2564 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 02:41:07.592275 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592256 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 02:41:07.592275 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592262 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 02:41:07.592275 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592266 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 02:41:07.592275 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592269 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 02:41:07.592275 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592272 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 02:41:07.592275 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592275 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592278 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592281 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592284 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592286 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592289 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592292 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592294 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592297 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592300 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592302 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592305 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592307 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592311 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592315 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592319 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592322 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592325 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592328 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 02:41:07.592419 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592331 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592344 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592348 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592352 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592355 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592358 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592361 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592364 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592367 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592370 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592372 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592375 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592378 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592380 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592383 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592387 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592390 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592392 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592395 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592398 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 02:41:07.592942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592400 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592403 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592405 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592408 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592411 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592414 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592416 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592419 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592421 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592424 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592427 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592430 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592432 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592435 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592438 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592440 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592443 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592446 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592448 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592450 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 02:41:07.593436 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592453 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592455 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592458 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592460 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592463 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592465 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592468 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592471 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592473 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592476 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592479 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592482 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592485 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592488 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592490 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592493 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592509 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592512 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592515 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592517 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 02:41:07.594030 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592520 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 02:41:07.594535 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592523 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 02:41:07.594535 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.592528 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 02:41:07.594535 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592621 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 02:41:07.594535 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592626 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 02:41:07.594535 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592629 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 02:41:07.594535 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592632 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 02:41:07.594535 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592635 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 02:41:07.594535 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592638 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 02:41:07.594535 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592640 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 02:41:07.594535 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592643 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 02:41:07.594535 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592646 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 02:41:07.594535 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592648 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 02:41:07.594535 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592651 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 02:41:07.594535 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592654 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 02:41:07.594535 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592657 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 02:41:07.594535 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592659 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592662 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592664 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592667 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592670 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592673 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592676 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592679 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592681 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592685 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592688 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592691 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592693 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592696 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592698 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592701 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592703 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592705 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592708 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592710 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 02:41:07.594930 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592713 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592715 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592718 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592720 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592723 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592726 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592728 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592731 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592734 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592736 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592739 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592741 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592744 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592746 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592749 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592755 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592760 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592763 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592765 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 02:41:07.595425 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592768 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592771 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592774 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592777 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592780 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592783 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592785 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592788 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592790 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592793 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592796 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592798 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592800 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592803 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592805 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592808 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592810 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592812 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592815 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592817 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 02:41:07.595913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592820 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 02:41:07.596426 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592822 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 02:41:07.596426 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592825 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 02:41:07.596426 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592828 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 02:41:07.596426 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592830 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 02:41:07.596426 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592833 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 02:41:07.596426 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592835 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 02:41:07.596426 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592838 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 02:41:07.596426 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592841 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 02:41:07.596426 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592844 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 02:41:07.596426 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592846 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 02:41:07.596426 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592849 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 02:41:07.596426 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592852 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 02:41:07.596426 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:07.592854 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 02:41:07.596426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.592859 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 02:41:07.596426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.593680 2564 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 02:41:07.596807 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.596587 2564 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 02:41:07.597582 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.597570 2564 server.go:1019] "Starting client certificate rotation"
Apr 21 02:41:07.597679 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.597662 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 02:41:07.597711 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.597698 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 02:41:07.626277 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.626259 2564 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 02:41:07.630792 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.630771 2564 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 02:41:07.645677 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.645578 2564 log.go:25] "Validated CRI v1 runtime API"
Apr 21 02:41:07.652721 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.652700 2564 log.go:25] "Validated CRI v1 image API"
Apr 21 02:41:07.654353 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.654338 2564 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 02:41:07.662734 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.662714 2564 fs.go:135] Filesystem UUIDs: map[2b427b5a-c6a9-4391-8ac3-385bda8002c1:/dev/nvme0n1p4 42139a20-5013-45d3-837c-d1a39096c633:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 21 02:41:07.662817 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.662734 2564 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 02:41:07.668739 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.668638 2564 manager.go:217] Machine: {Timestamp:2026-04-21 02:41:07.666628154 +0000 UTC m=+0.434116834 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3102002 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2266694c3cdd8f79c4b38fe6c954da SystemUUID:ec226669-4c3c-dd8f-79c4-b38fe6c954da BootID:e23e2a62-a3f4-458e-a01a-d1f18d40ab82 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:85:8c:69:c8:2b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:85:8c:69:c8:2b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:22:3f:c8:22:a8:29 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 02:41:07.668739 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.668735 2564 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 02:41:07.668847 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.668835 2564 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 02:41:07.670005 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.669980 2564 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 02:41:07.670131 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.670007 2564 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-170.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 02:41:07.670177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.670140 2564 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 02:41:07.670177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.670147 2564 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 02:41:07.670177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.670160 2564 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 02:41:07.670905 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.670894 2564 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 02:41:07.672291 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.672280 2564 state_mem.go:36] "Initialized new in-memory state store" Apr 21 02:41:07.672388 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.672379 2564 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 02:41:07.675626 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.675616 2564 kubelet.go:491] "Attempting to sync node with API server" Apr 21 02:41:07.675667 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.675629 2564 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 02:41:07.675667 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.675641 2564 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 02:41:07.675667 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.675650 2564 kubelet.go:397] "Adding apiserver pod source" Apr 21 02:41:07.675667 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.675658 2564 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 21 02:41:07.676648 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.676636 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 02:41:07.676697 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.676654 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 02:41:07.678180 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.678161 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 02:41:07.679907 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.679891 2564 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 02:41:07.681594 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.681575 2564 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 02:41:07.682775 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.682761 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 02:41:07.682824 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.682786 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 02:41:07.682824 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.682796 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 02:41:07.682824 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.682805 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 02:41:07.682824 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.682817 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 02:41:07.682933 
ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.682852 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 02:41:07.682933 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.682858 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 02:41:07.682933 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.682866 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 02:41:07.682933 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.682874 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 02:41:07.682933 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.682879 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 02:41:07.682933 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.682893 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 02:41:07.682933 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.682903 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 02:41:07.683757 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.683746 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 02:41:07.683757 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.683757 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 02:41:07.687424 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.687408 2564 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 02:41:07.687522 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.687441 2564 server.go:1295] "Started kubelet" Apr 21 02:41:07.687607 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.687584 2564 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 02:41:07.687654 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.687579 2564 ratelimit.go:55] "Setting rate limiting for endpoint" 
service="podresources" qps=100 burstTokens=10 Apr 21 02:41:07.687654 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.687641 2564 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 02:41:07.687894 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.687876 2564 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-170.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 02:41:07.688193 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:07.688171 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-170.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 02:41:07.688321 ip-10-0-131-170 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 02:41:07.688440 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:07.688423 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 02:41:07.689434 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.689419 2564 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 02:41:07.694476 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.694456 2564 server.go:317] "Adding debug handlers to kubelet server" Apr 21 02:41:07.696766 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.696751 2564 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 02:41:07.696851 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.696812 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 02:41:07.697393 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.697373 2564 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 02:41:07.697393 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.697391 2564 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 02:41:07.697543 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.697419 2564 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 02:41:07.697543 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.697484 2564 reconstruct.go:97] "Volume reconstruction finished" Apr 21 02:41:07.697543 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.697493 2564 reconciler.go:26] "Reconciler: start to sync state" Apr 21 02:41:07.697796 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:07.697665 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-131-170.ec2.internal\" not found" Apr 21 02:41:07.697796 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:07.696709 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-170.ec2.internal.18a83ef34cff0ec8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-170.ec2.internal,UID:ip-10-0-131-170.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-170.ec2.internal,},FirstTimestamp:2026-04-21 02:41:07.687419592 +0000 UTC m=+0.454908272,LastTimestamp:2026-04-21 02:41:07.687419592 +0000 UTC m=+0.454908272,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-170.ec2.internal,}" Apr 21 02:41:07.699277 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.699255 2564 factory.go:55] Registering systemd factory Apr 21 02:41:07.699384 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.699373 2564 factory.go:223] Registration of the systemd container factory successfully Apr 21 02:41:07.699686 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.699671 2564 factory.go:153] Registering CRI-O factory Apr 21 02:41:07.699686 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.699687 2564 factory.go:223] Registration of the crio container factory successfully Apr 21 02:41:07.699900 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.699780 2564 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 02:41:07.699900 ip-10-0-131-170 
kubenswrapper[2564]: I0421 02:41:07.699804 2564 factory.go:103] Registering Raw factory Apr 21 02:41:07.699900 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.699825 2564 manager.go:1196] Started watching for new ooms in manager Apr 21 02:41:07.700612 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.700202 2564 manager.go:319] Starting recovery of all containers Apr 21 02:41:07.703510 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.703473 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5rpd2" Apr 21 02:41:07.706934 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:07.706907 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 21 02:41:07.707032 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:07.707019 2564 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-170.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 21 02:41:07.710115 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.710086 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5rpd2" Apr 21 02:41:07.711452 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.711438 2564 manager.go:324] Recovery completed Apr 21 02:41:07.715229 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.715217 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 02:41:07.717675 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.717643 2564 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeHasSufficientMemory" Apr 21 02:41:07.717735 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.717688 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 02:41:07.717735 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.717703 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeHasSufficientPID" Apr 21 02:41:07.718168 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.718156 2564 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 02:41:07.718208 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.718169 2564 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 02:41:07.718208 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.718188 2564 state_mem.go:36] "Initialized new in-memory state store" Apr 21 02:41:07.720115 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.720104 2564 policy_none.go:49] "None policy: Start" Apr 21 02:41:07.720154 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.720119 2564 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 02:41:07.720154 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.720128 2564 state_mem.go:35] "Initializing new in-memory state store" Apr 21 02:41:07.757041 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.757021 2564 manager.go:341] "Starting Device Plugin manager" Apr 21 02:41:07.765568 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:07.757063 2564 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 02:41:07.765568 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.757076 2564 server.go:85] "Starting device plugin registration server" Apr 21 02:41:07.765568 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.757286 2564 
eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 02:41:07.765568 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.757300 2564 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 02:41:07.765568 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.757387 2564 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 02:41:07.765568 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.757465 2564 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 02:41:07.765568 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.757474 2564 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 02:41:07.765568 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:07.757844 2564 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 02:41:07.765568 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:07.757871 2564 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-170.ec2.internal\" not found" Apr 21 02:41:07.838916 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.838868 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 02:41:07.840002 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.839984 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 02:41:07.840088 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.840017 2564 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 02:41:07.840088 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.840034 2564 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 02:41:07.840088 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.840040 2564 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 02:41:07.840088 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:07.840069 2564 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 02:41:07.843194 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.843175 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 02:41:07.858187 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.858164 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 02:41:07.859018 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.859003 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeHasSufficientMemory" Apr 21 02:41:07.859083 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.859031 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 02:41:07.859083 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.859042 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeHasSufficientPID" Apr 21 02:41:07.859083 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.859060 2564 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-170.ec2.internal" Apr 21 02:41:07.866615 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.866600 2564 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-170.ec2.internal" Apr 21 02:41:07.866700 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:07.866623 2564 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-170.ec2.internal\": node \"ip-10-0-131-170.ec2.internal\" not found" Apr 21 
02:41:07.885638 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:07.885617 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-170.ec2.internal\" not found" Apr 21 02:41:07.940282 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.940262 2564 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-170.ec2.internal"] Apr 21 02:41:07.940355 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.940331 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 02:41:07.941018 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.941005 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeHasSufficientMemory" Apr 21 02:41:07.941089 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.941034 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 02:41:07.941089 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.941047 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeHasSufficientPID" Apr 21 02:41:07.942098 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.942085 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 02:41:07.942256 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.942237 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal" Apr 21 02:41:07.942322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.942276 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 02:41:07.942819 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.942803 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeHasSufficientMemory" Apr 21 02:41:07.942910 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.942833 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 02:41:07.942910 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.942847 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeHasSufficientPID" Apr 21 02:41:07.942910 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.942804 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeHasSufficientMemory" Apr 21 02:41:07.942910 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.942890 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 02:41:07.942910 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.942902 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeHasSufficientPID" Apr 21 02:41:07.943904 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.943889 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-170.ec2.internal" Apr 21 02:41:07.943972 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.943911 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 02:41:07.944534 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.944519 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeHasSufficientMemory" Apr 21 02:41:07.944610 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.944543 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 02:41:07.944610 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.944554 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeHasSufficientPID" Apr 21 02:41:07.965067 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:07.965045 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-170.ec2.internal\" not found" node="ip-10-0-131-170.ec2.internal" Apr 21 02:41:07.969329 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:07.969312 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-170.ec2.internal\" not found" node="ip-10-0-131-170.ec2.internal" Apr 21 02:41:07.986692 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:07.986675 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-170.ec2.internal\" not found" Apr 21 02:41:07.999249 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.999234 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6a9357b3541c0f034bdee512e5b740bf-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal\" (UID: \"6a9357b3541c0f034bdee512e5b740bf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal" Apr 21 02:41:07.999337 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.999256 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a9357b3541c0f034bdee512e5b740bf-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal\" (UID: \"6a9357b3541c0f034bdee512e5b740bf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal" Apr 21 02:41:07.999337 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:07.999274 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/59358f97ed8d625f16c9a2fd8e43b833-config\") pod \"kube-apiserver-proxy-ip-10-0-131-170.ec2.internal\" (UID: \"59358f97ed8d625f16c9a2fd8e43b833\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-170.ec2.internal" Apr 21 02:41:08.087159 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:08.087141 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-170.ec2.internal\" not found" Apr 21 02:41:08.100248 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.100209 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a9357b3541c0f034bdee512e5b740bf-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal\" (UID: \"6a9357b3541c0f034bdee512e5b740bf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal" Apr 21 02:41:08.100248 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.100235 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/59358f97ed8d625f16c9a2fd8e43b833-config\") pod \"kube-apiserver-proxy-ip-10-0-131-170.ec2.internal\" (UID: \"59358f97ed8d625f16c9a2fd8e43b833\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-170.ec2.internal" Apr 21 02:41:08.100340 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.100250 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6a9357b3541c0f034bdee512e5b740bf-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal\" (UID: \"6a9357b3541c0f034bdee512e5b740bf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal" Apr 21 02:41:08.100340 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.100302 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6a9357b3541c0f034bdee512e5b740bf-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal\" (UID: \"6a9357b3541c0f034bdee512e5b740bf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal" Apr 21 02:41:08.100340 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.100305 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a9357b3541c0f034bdee512e5b740bf-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal\" (UID: \"6a9357b3541c0f034bdee512e5b740bf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal" Apr 21 02:41:08.100340 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.100314 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/59358f97ed8d625f16c9a2fd8e43b833-config\") pod \"kube-apiserver-proxy-ip-10-0-131-170.ec2.internal\" (UID: \"59358f97ed8d625f16c9a2fd8e43b833\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-131-170.ec2.internal"
Apr 21 02:41:08.187639 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:08.187619 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-170.ec2.internal\" not found"
Apr 21 02:41:08.267049 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.267031 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal"
Apr 21 02:41:08.271508 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.271485 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-170.ec2.internal"
Apr 21 02:41:08.287959 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:08.287942 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-170.ec2.internal\" not found"
Apr 21 02:41:08.388687 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:08.388630 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-170.ec2.internal\" not found"
Apr 21 02:41:08.489130 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:08.489101 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-170.ec2.internal\" not found"
Apr 21 02:41:08.532018 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.532000 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 02:41:08.589878 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:08.589855 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-170.ec2.internal\" not found"
Apr 21 02:41:08.596961 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.596940 2564 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 02:41:08.597081 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.597062 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 02:41:08.597132 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.597094 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 02:41:08.625434 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.625408 2564 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 02:41:08.690620 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:08.690556 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-170.ec2.internal\" not found"
Apr 21 02:41:08.697347 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.697331 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 02:41:08.706210 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.706193 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 02:41:08.711979 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.711954 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 02:36:07 +0000 UTC" deadline="2027-10-23 10:34:06.916410707 +0000 UTC"
Apr 21 02:41:08.712056 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.711980 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13207h52m58.204433989s"
Apr 21 02:41:08.727465 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.727446 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-v2zm4"
Apr 21 02:41:08.734205 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.734187 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-v2zm4"
Apr 21 02:41:08.791562 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:08.791544 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-170.ec2.internal\" not found"
Apr 21 02:41:08.799958 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:08.799933 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a9357b3541c0f034bdee512e5b740bf.slice/crio-a91d732932345e2bfdd44cce3228ab95585a36f78a5ea1ead4666acabbc0a2c2 WatchSource:0}: Error finding container a91d732932345e2bfdd44cce3228ab95585a36f78a5ea1ead4666acabbc0a2c2: Status 404 returned error can't find the container with id a91d732932345e2bfdd44cce3228ab95585a36f78a5ea1ead4666acabbc0a2c2
Apr 21 02:41:08.800286 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:08.800263 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59358f97ed8d625f16c9a2fd8e43b833.slice/crio-14dd4f5a4191944bc14e0b864caa6b6936802a98f3680397cae42519ae3b076f WatchSource:0}: Error finding container 14dd4f5a4191944bc14e0b864caa6b6936802a98f3680397cae42519ae3b076f: Status 404 returned error can't find the container with id 14dd4f5a4191944bc14e0b864caa6b6936802a98f3680397cae42519ae3b076f
Apr 21 02:41:08.803627 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.803611 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 02:41:08.841856 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.841834 2564 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 02:41:08.842608 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.842568 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-170.ec2.internal" event={"ID":"59358f97ed8d625f16c9a2fd8e43b833","Type":"ContainerStarted","Data":"14dd4f5a4191944bc14e0b864caa6b6936802a98f3680397cae42519ae3b076f"}
Apr 21 02:41:08.843443 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.843425 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal" event={"ID":"6a9357b3541c0f034bdee512e5b740bf","Type":"ContainerStarted","Data":"a91d732932345e2bfdd44cce3228ab95585a36f78a5ea1ead4666acabbc0a2c2"}
Apr 21 02:41:08.897455 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.897441 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal"
Apr 21 02:41:08.908051 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.908033 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 02:41:08.908968 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.908957 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-170.ec2.internal"
Apr 21 02:41:08.916381 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:08.916370 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 02:41:09.515864 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.515836 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 02:41:09.676900 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.676867 2564 apiserver.go:52] "Watching apiserver"
Apr 21 02:41:09.683877 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.683845 2564 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 02:41:09.685690 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.685663 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-131-170.ec2.internal","openshift-dns/node-resolver-w659n","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal","openshift-multus/multus-additional-cni-plugins-2vcn8","openshift-multus/multus-v22pg","openshift-network-diagnostics/network-check-target-5wsx7","openshift-network-operator/iptables-alerter-8qnnq","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk","openshift-cluster-node-tuning-operator/tuned-w4gln","openshift-image-registry/node-ca-f97sw","openshift-multus/network-metrics-daemon-77bqp","openshift-ovn-kubernetes/ovnkube-node-p8b6t","kube-system/konnectivity-agent-zf9vv"]
Apr 21 02:41:09.688257 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.688234 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.688357 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.688328 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w659n"
Apr 21 02:41:09.689313 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.689293 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-77bqp"
Apr 21 02:41:09.689426 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:09.689369 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-77bqp" podUID="da0d17ad-bc94-4499-bb04-b7e0df549a24"
Apr 21 02:41:09.690384 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.690354 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2vcn8"
Apr 21 02:41:09.690770 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.690745 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-4wd74\""
Apr 21 02:41:09.690926 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.690863 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 02:41:09.691026 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.690944 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-zcs4l\""
Apr 21 02:41:09.691026 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.690966 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 02:41:09.691194 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.691174 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 02:41:09.691283 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.691258 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 02:41:09.692423 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.692405 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.692540 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.692479 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 02:41:09.692540 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.692528 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8qnnq"
Apr 21 02:41:09.692993 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.692936 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 02:41:09.692993 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.692976 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rmwfk\""
Apr 21 02:41:09.693137 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.692944 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 02:41:09.693137 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.692938 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 02:41:09.693137 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.693117 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 02:41:09.693664 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.693645 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk"
Apr 21 02:41:09.694390 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.694371 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 02:41:09.694477 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.694373 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 02:41:09.694717 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.694700 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5wsx7"
Apr 21 02:41:09.694796 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:09.694758 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5wsx7" podUID="91b77933-1a1d-4dc7-8c31-f6c98c4bea6e"
Apr 21 02:41:09.694858 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.694817 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 02:41:09.694858 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.694826 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 02:41:09.694957 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.694828 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-fhqk2\""
Apr 21 02:41:09.695009 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.694959 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-v66n5\""
Apr 21 02:41:09.695636 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.695615 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 02:41:09.695709 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.695637 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 02:41:09.695766 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.695726 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-f97sw"
Apr 21 02:41:09.695816 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.695780 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 02:41:09.696060 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.696039 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9jt4z\""
Apr 21 02:41:09.697114 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.697097 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.697709 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.697683 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 02:41:09.697794 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.697718 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 02:41:09.698037 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.698020 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 02:41:09.698123 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.698046 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-88fsd\""
Apr 21 02:41:09.698710 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.698692 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zf9vv"
Apr 21 02:41:09.699199 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.699181 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 02:41:09.699592 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.699572 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 02:41:09.699684 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.699578 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-jht6t\""
Apr 21 02:41:09.699684 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.699645 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 02:41:09.699799 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.699765 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 02:41:09.699954 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.699938 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 02:41:09.700043 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.700028 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 02:41:09.701161 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.700989 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-spxj6\""
Apr 21 02:41:09.701161 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.700997 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 02:41:09.701161 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.701103 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 02:41:09.708116 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.708089 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.708116 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.708124 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5071ecdc-0d05-412f-b12d-1289b06373ec-ovnkube-script-lib\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.708251 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.708161 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ac41a7d4-62b5-4566-bcb2-cd59838f5120-konnectivity-ca\") pod \"konnectivity-agent-zf9vv\" (UID: \"ac41a7d4-62b5-4566-bcb2-cd59838f5120\") " pod="kube-system/konnectivity-agent-zf9vv"
Apr 21 02:41:09.708251 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.708194 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-lib-modules\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.708251 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.708220 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78ee76ed-3959-4c8b-8f2c-d057e4bd15db-host\") pod \"node-ca-f97sw\" (UID: \"78ee76ed-3959-4c8b-8f2c-d057e4bd15db\") " pod="openshift-image-registry/node-ca-f97sw"
Apr 21 02:41:09.708362 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.708255 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-run-systemd\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.708362 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.708270 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-var-lib-openvswitch\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.708362 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.708286 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.708362 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.708335 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-cni-netd\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.708526 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.708383 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-etc-sysconfig\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.708526 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.708408 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-run-ovn\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.708526 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.708431 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-cni-bin\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.708526 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.708454 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21474b65-5da1-4c90-a925-122f9bff65b7-host-slash\") pod \"iptables-alerter-8qnnq\" (UID: \"21474b65-5da1-4c90-a925-122f9bff65b7\") " pod="openshift-network-operator/iptables-alerter-8qnnq"
Apr 21 02:41:09.708906 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.708476 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-etc-systemd\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.708965 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.708933 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e2e0c65b-c05f-485b-b328-922109306697-tmp\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.709000 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.708972 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-host-var-lib-cni-multus\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.709032 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709006 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-multus-conf-dir\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.709063 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709039 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cd704587-7472-452b-983f-e375dbc728cc-socket-dir\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk"
Apr 21 02:41:09.709098 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709071 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4l69\" (UniqueName: \"kubernetes.io/projected/78ee76ed-3959-4c8b-8f2c-d057e4bd15db-kube-api-access-f4l69\") pod \"node-ca-f97sw\" (UID: \"78ee76ed-3959-4c8b-8f2c-d057e4bd15db\") " pod="openshift-image-registry/node-ca-f97sw"
Apr 21 02:41:09.709148 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709096 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-run-netns\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.709148 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709128 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5071ecdc-0d05-412f-b12d-1289b06373ec-ovnkube-config\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.709243 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709157 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/21474b65-5da1-4c90-a925-122f9bff65b7-iptables-alerter-script\") pod \"iptables-alerter-8qnnq\" (UID: \"21474b65-5da1-4c90-a925-122f9bff65b7\") " pod="openshift-network-operator/iptables-alerter-8qnnq"
Apr 21 02:41:09.709243 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709186 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2gnf\" (UniqueName: \"kubernetes.io/projected/21474b65-5da1-4c90-a925-122f9bff65b7-kube-api-access-t2gnf\") pod \"iptables-alerter-8qnnq\" (UID: \"21474b65-5da1-4c90-a925-122f9bff65b7\") " pod="openshift-network-operator/iptables-alerter-8qnnq"
Apr 21 02:41:09.709337 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709321 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-etc-kubernetes\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.709388 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709366 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-host-run-netns\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.709439 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709398 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9cb70894-b83f-4e21-9932-c5cb64320169-multus-daemon-config\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.709439 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709428 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-etc-kubernetes\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.709550 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709457 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-kubelet\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.709550 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709491 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87blm\" (UniqueName: \"kubernetes.io/projected/5071ecdc-0d05-412f-b12d-1289b06373ec-kube-api-access-87blm\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.709648 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709566 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-host-var-lib-cni-bin\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.709648 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709633 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cd704587-7472-452b-983f-e375dbc728cc-registration-dir\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk"
Apr 21 02:41:09.709748 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709713 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-log-socket\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.709800 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709759 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-etc-sysctl-d\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.709849 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709813 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpqmb\" (UniqueName: \"kubernetes.io/projected/4a693e96-2533-450c-a8c8-de3f4cdfcd73-kube-api-access-cpqmb\") pod \"node-resolver-w659n\" (UID: \"4a693e96-2533-450c-a8c8-de3f4cdfcd73\") " pod="openshift-dns/node-resolver-w659n"
Apr 21 02:41:09.709900 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709847 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9fa5381-4556-4f6c-91d2-c5b4580df414-cni-binary-copy\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8"
Apr 21 02:41:09.709900 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709880 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d9fa5381-4556-4f6c-91d2-c5b4580df414-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8"
Apr 21 02:41:09.710012 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709938 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxnfv\" (UniqueName: \"kubernetes.io/projected/9cb70894-b83f-4e21-9932-c5cb64320169-kube-api-access-gxnfv\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.710012 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.709998 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd704587-7472-452b-983f-e375dbc728cc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk"
Apr 21 02:41:09.710110 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710026 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cd704587-7472-452b-983f-e375dbc728cc-etc-selinux\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk"
Apr 21 02:41:09.710110 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710075 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-etc-modprobe-d\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.710210 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710110 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmlrw\" (UniqueName: \"kubernetes.io/projected/d9fa5381-4556-4f6c-91d2-c5b4580df414-kube-api-access-bmlrw\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8"
Apr 21 02:41:09.710210 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710158 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-multus-cni-dir\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.710210 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710189 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cd704587-7472-452b-983f-e375dbc728cc-device-dir\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk"
Apr 21 02:41:09.710356 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710233 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfk6t\" (UniqueName: \"kubernetes.io/projected/cd704587-7472-452b-983f-e375dbc728cc-kube-api-access-hfk6t\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk"
Apr 21 02:41:09.710356 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710266 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/78ee76ed-3959-4c8b-8f2c-d057e4bd15db-serviceca\") pod \"node-ca-f97sw\" (UID: \"78ee76ed-3959-4c8b-8f2c-d057e4bd15db\") " pod="openshift-image-registry/node-ca-f97sw"
Apr 21 02:41:09.710356 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710311 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName:
\"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-run\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 02:41:09.710356 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710346 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-multus-socket-dir-parent\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.710562 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710392 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4a693e96-2533-450c-a8c8-de3f4cdfcd73-tmp-dir\") pod \"node-resolver-w659n\" (UID: \"4a693e96-2533-450c-a8c8-de3f4cdfcd73\") " pod="openshift-dns/node-resolver-w659n" Apr 21 02:41:09.710562 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710420 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9fa5381-4556-4f6c-91d2-c5b4580df414-cnibin\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8" Apr 21 02:41:09.710562 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710467 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9cb70894-b83f-4e21-9932-c5cb64320169-cni-binary-copy\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.710562 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710521 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-host-run-k8s-cni-cncf-io\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.710754 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710559 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-run-openvswitch\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.710754 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710607 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e2e0c65b-c05f-485b-b328-922109306697-etc-tuned\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 02:41:09.710754 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710639 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-system-cni-dir\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.710754 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710687 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cd704587-7472-452b-983f-e375dbc728cc-sys-fs\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk" Apr 21 
02:41:09.710754 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710717 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5071ecdc-0d05-412f-b12d-1289b06373ec-env-overrides\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.710997 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710765 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgcqh\" (UniqueName: \"kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh\") pod \"network-check-target-5wsx7\" (UID: \"91b77933-1a1d-4dc7-8c31-f6c98c4bea6e\") " pod="openshift-network-diagnostics/network-check-target-5wsx7" Apr 21 02:41:09.710997 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710798 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ac41a7d4-62b5-4566-bcb2-cd59838f5120-agent-certs\") pod \"konnectivity-agent-zf9vv\" (UID: \"ac41a7d4-62b5-4566-bcb2-cd59838f5120\") " pod="kube-system/konnectivity-agent-zf9vv" Apr 21 02:41:09.710997 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710847 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-etc-sysctl-conf\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 02:41:09.710997 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710879 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-sys\") pod \"tuned-w4gln\" 
(UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 02:41:09.710997 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710921 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs\") pod \"network-metrics-daemon-77bqp\" (UID: \"da0d17ad-bc94-4499-bb04-b7e0df549a24\") " pod="openshift-multus/network-metrics-daemon-77bqp" Apr 21 02:41:09.710997 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.710953 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-var-lib-kubelet\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 02:41:09.711281 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.711000 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9fa5381-4556-4f6c-91d2-c5b4580df414-system-cni-dir\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8" Apr 21 02:41:09.711281 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.711032 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-cnibin\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.711281 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.711078 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" 
(UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-hostroot\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.711281 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.711106 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-systemd-units\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.711281 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.711152 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-slash\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.711281 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.711185 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvfwr\" (UniqueName: \"kubernetes.io/projected/da0d17ad-bc94-4499-bb04-b7e0df549a24-kube-api-access-dvfwr\") pod \"network-metrics-daemon-77bqp\" (UID: \"da0d17ad-bc94-4499-bb04-b7e0df549a24\") " pod="openshift-multus/network-metrics-daemon-77bqp" Apr 21 02:41:09.711281 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.711231 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4a693e96-2533-450c-a8c8-de3f4cdfcd73-hosts-file\") pod \"node-resolver-w659n\" (UID: \"4a693e96-2533-450c-a8c8-de3f4cdfcd73\") " pod="openshift-dns/node-resolver-w659n" Apr 21 02:41:09.711281 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.711264 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9fa5381-4556-4f6c-91d2-c5b4580df414-os-release\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8" Apr 21 02:41:09.711687 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.711305 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-host-run-multus-certs\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.711687 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.711337 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-node-log\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.711687 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.711386 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5071ecdc-0d05-412f-b12d-1289b06373ec-ovn-node-metrics-cert\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.711687 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.711417 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-host\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 
02:41:09.711687 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.711463 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9fa5381-4556-4f6c-91d2-c5b4580df414-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8" Apr 21 02:41:09.711687 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.711530 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d9fa5381-4556-4f6c-91d2-c5b4580df414-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8" Apr 21 02:41:09.711687 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.711567 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-etc-openvswitch\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.711687 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.711612 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbtv2\" (UniqueName: \"kubernetes.io/projected/e2e0c65b-c05f-485b-b328-922109306697-kube-api-access-fbtv2\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 02:41:09.711687 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.711644 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-os-release\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.712084 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.711691 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-host-var-lib-kubelet\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.735248 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.735225 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 02:36:08 +0000 UTC" deadline="2027-12-01 07:23:00.917437932 +0000 UTC" Apr 21 02:41:09.735248 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.735245 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14140h41m51.18219505s" Apr 21 02:41:09.798518 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.798448 2564 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 02:41:09.811882 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.811856 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9fa5381-4556-4f6c-91d2-c5b4580df414-os-release\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8" Apr 21 02:41:09.811986 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.811897 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-host-run-multus-certs\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.811986 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.811922 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-node-log\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.811986 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.811947 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5071ecdc-0d05-412f-b12d-1289b06373ec-ovn-node-metrics-cert\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.811986 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.811969 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-host\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 02:41:09.812192 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.811989 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-node-log\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.812192 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812008 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/d9fa5381-4556-4f6c-91d2-c5b4580df414-os-release\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8" Apr 21 02:41:09.812192 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.811984 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9fa5381-4556-4f6c-91d2-c5b4580df414-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8" Apr 21 02:41:09.812192 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812058 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d9fa5381-4556-4f6c-91d2-c5b4580df414-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8" Apr 21 02:41:09.812192 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812075 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-host\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 02:41:09.812192 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812095 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-etc-openvswitch\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.812192 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812112 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-host-run-multus-certs\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.812192 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812115 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9fa5381-4556-4f6c-91d2-c5b4580df414-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8" Apr 21 02:41:09.812192 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812121 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbtv2\" (UniqueName: \"kubernetes.io/projected/e2e0c65b-c05f-485b-b328-922109306697-kube-api-access-fbtv2\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 02:41:09.812612 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812245 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-os-release\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.812612 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812278 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-host-var-lib-kubelet\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.812612 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812283 2564 
swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 02:41:09.812612 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812309 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.812612 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812337 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5071ecdc-0d05-412f-b12d-1289b06373ec-ovnkube-script-lib\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.812612 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812350 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-host-var-lib-kubelet\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.812612 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812362 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ac41a7d4-62b5-4566-bcb2-cd59838f5120-konnectivity-ca\") pod \"konnectivity-agent-zf9vv\" (UID: \"ac41a7d4-62b5-4566-bcb2-cd59838f5120\") " pod="kube-system/konnectivity-agent-zf9vv" Apr 21 02:41:09.812612 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812386 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-lib-modules\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 02:41:09.812612 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812411 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78ee76ed-3959-4c8b-8f2c-d057e4bd15db-host\") pod \"node-ca-f97sw\" (UID: \"78ee76ed-3959-4c8b-8f2c-d057e4bd15db\") " pod="openshift-image-registry/node-ca-f97sw" Apr 21 02:41:09.812612 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812538 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.812612 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812581 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-run-systemd\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.812612 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812593 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-lib-modules\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 02:41:09.812612 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812610 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-var-lib-openvswitch\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.813177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812637 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.813177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812632 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-etc-openvswitch\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.813177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812652 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d9fa5381-4556-4f6c-91d2-c5b4580df414-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8"
Apr 21 02:41:09.813177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812715 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-run-systemd\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.813177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812717 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78ee76ed-3959-4c8b-8f2c-d057e4bd15db-host\") pod \"node-ca-f97sw\" (UID: \"78ee76ed-3959-4c8b-8f2c-d057e4bd15db\") " pod="openshift-image-registry/node-ca-f97sw"
Apr 21 02:41:09.813177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812742 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-cni-netd\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.813177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812766 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-var-lib-openvswitch\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.813177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812771 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.813177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812769 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-etc-sysconfig\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.813177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812807 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-cni-netd\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.813177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812824 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-run-ovn\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.813177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812828 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-os-release\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.813177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812855 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-etc-sysconfig\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.813177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812893 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-run-ovn\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.813177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812896 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-cni-bin\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.813177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812930 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21474b65-5da1-4c90-a925-122f9bff65b7-host-slash\") pod \"iptables-alerter-8qnnq\" (UID: \"21474b65-5da1-4c90-a925-122f9bff65b7\") " pod="openshift-network-operator/iptables-alerter-8qnnq"
Apr 21 02:41:09.813177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812955 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-etc-systemd\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.813177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812971 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-cni-bin\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.814014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.812998 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e2e0c65b-c05f-485b-b328-922109306697-tmp\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.814014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813019 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21474b65-5da1-4c90-a925-122f9bff65b7-host-slash\") pod \"iptables-alerter-8qnnq\" (UID: \"21474b65-5da1-4c90-a925-122f9bff65b7\") " pod="openshift-network-operator/iptables-alerter-8qnnq"
Apr 21 02:41:09.814014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813026 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-host-var-lib-cni-multus\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.814014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813048 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-etc-systemd\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.814014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813052 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-multus-conf-dir\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.814014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813083 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cd704587-7472-452b-983f-e375dbc728cc-socket-dir\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk"
Apr 21 02:41:09.814014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813106 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4l69\" (UniqueName: \"kubernetes.io/projected/78ee76ed-3959-4c8b-8f2c-d057e4bd15db-kube-api-access-f4l69\") pod \"node-ca-f97sw\" (UID: \"78ee76ed-3959-4c8b-8f2c-d057e4bd15db\") " pod="openshift-image-registry/node-ca-f97sw"
Apr 21 02:41:09.814014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813131 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-run-netns\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.814014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813146 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-multus-conf-dir\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.814014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813155 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5071ecdc-0d05-412f-b12d-1289b06373ec-ovnkube-config\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.814014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813179 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5071ecdc-0d05-412f-b12d-1289b06373ec-ovnkube-script-lib\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.814014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813190 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-host-var-lib-cni-multus\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.814014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813181 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/21474b65-5da1-4c90-a925-122f9bff65b7-iptables-alerter-script\") pod \"iptables-alerter-8qnnq\" (UID: \"21474b65-5da1-4c90-a925-122f9bff65b7\") " pod="openshift-network-operator/iptables-alerter-8qnnq"
Apr 21 02:41:09.814014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813106 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ac41a7d4-62b5-4566-bcb2-cd59838f5120-konnectivity-ca\") pod \"konnectivity-agent-zf9vv\" (UID: \"ac41a7d4-62b5-4566-bcb2-cd59838f5120\") " pod="kube-system/konnectivity-agent-zf9vv"
Apr 21 02:41:09.814014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813226 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2gnf\" (UniqueName: \"kubernetes.io/projected/21474b65-5da1-4c90-a925-122f9bff65b7-kube-api-access-t2gnf\") pod \"iptables-alerter-8qnnq\" (UID: \"21474b65-5da1-4c90-a925-122f9bff65b7\") " pod="openshift-network-operator/iptables-alerter-8qnnq"
Apr 21 02:41:09.814014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813247 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-run-netns\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.814014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813253 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-etc-kubernetes\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.814632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813279 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cd704587-7472-452b-983f-e375dbc728cc-socket-dir\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk"
Apr 21 02:41:09.814632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813293 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-host-run-netns\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.814632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813318 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9cb70894-b83f-4e21-9932-c5cb64320169-multus-daemon-config\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.814632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813340 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-etc-kubernetes\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.814632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813364 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-kubelet\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.814632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813392 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87blm\" (UniqueName: \"kubernetes.io/projected/5071ecdc-0d05-412f-b12d-1289b06373ec-kube-api-access-87blm\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.814632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813416 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-host-var-lib-cni-bin\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.814632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813427 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-host-run-netns\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.814632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813440 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cd704587-7472-452b-983f-e375dbc728cc-registration-dir\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk"
Apr 21 02:41:09.814632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813468 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-log-socket\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.814632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813518 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-etc-sysctl-d\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.814632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813544 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpqmb\" (UniqueName: \"kubernetes.io/projected/4a693e96-2533-450c-a8c8-de3f4cdfcd73-kube-api-access-cpqmb\") pod \"node-resolver-w659n\" (UID: \"4a693e96-2533-450c-a8c8-de3f4cdfcd73\") " pod="openshift-dns/node-resolver-w659n"
Apr 21 02:41:09.814632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813568 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9fa5381-4556-4f6c-91d2-c5b4580df414-cni-binary-copy\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8"
Apr 21 02:41:09.814632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813565 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-etc-kubernetes\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.814632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813596 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d9fa5381-4556-4f6c-91d2-c5b4580df414-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8"
Apr 21 02:41:09.814632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813625 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxnfv\" (UniqueName: \"kubernetes.io/projected/9cb70894-b83f-4e21-9932-c5cb64320169-kube-api-access-gxnfv\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.814632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813632 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-host-var-lib-cni-bin\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.815426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813652 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd704587-7472-452b-983f-e375dbc728cc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk"
Apr 21 02:41:09.815426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813676 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cd704587-7472-452b-983f-e375dbc728cc-etc-selinux\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk"
Apr 21 02:41:09.815426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813715 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-kubelet\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.815426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813729 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-etc-modprobe-d\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.815426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813743 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5071ecdc-0d05-412f-b12d-1289b06373ec-ovnkube-config\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.815426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813754 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmlrw\" (UniqueName: \"kubernetes.io/projected/d9fa5381-4556-4f6c-91d2-c5b4580df414-kube-api-access-bmlrw\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8"
Apr 21 02:41:09.815426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813769 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-etc-kubernetes\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.815426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813778 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-multus-cni-dir\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.815426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813803 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cd704587-7472-452b-983f-e375dbc728cc-device-dir\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk"
Apr 21 02:41:09.815426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813790 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/21474b65-5da1-4c90-a925-122f9bff65b7-iptables-alerter-script\") pod \"iptables-alerter-8qnnq\" (UID: \"21474b65-5da1-4c90-a925-122f9bff65b7\") " pod="openshift-network-operator/iptables-alerter-8qnnq"
Apr 21 02:41:09.815426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813870 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cd704587-7472-452b-983f-e375dbc728cc-registration-dir\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk"
Apr 21 02:41:09.815426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813899 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfk6t\" (UniqueName: \"kubernetes.io/projected/cd704587-7472-452b-983f-e375dbc728cc-kube-api-access-hfk6t\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk"
Apr 21 02:41:09.815426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813925 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/78ee76ed-3959-4c8b-8f2c-d057e4bd15db-serviceca\") pod \"node-ca-f97sw\" (UID: \"78ee76ed-3959-4c8b-8f2c-d057e4bd15db\") " pod="openshift-image-registry/node-ca-f97sw"
Apr 21 02:41:09.815426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813959 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-run\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.815426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813994 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-multus-socket-dir-parent\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.815426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814014 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-multus-cni-dir\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.815426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814058 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-log-socket\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.816240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814159 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-etc-sysctl-d\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.816240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814280 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4a693e96-2533-450c-a8c8-de3f4cdfcd73-tmp-dir\") pod \"node-resolver-w659n\" (UID: \"4a693e96-2533-450c-a8c8-de3f4cdfcd73\") " pod="openshift-dns/node-resolver-w659n"
Apr 21 02:41:09.816240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814372 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cd704587-7472-452b-983f-e375dbc728cc-device-dir\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk"
Apr 21 02:41:09.816240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814018 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4a693e96-2533-450c-a8c8-de3f4cdfcd73-tmp-dir\") pod \"node-resolver-w659n\" (UID: \"4a693e96-2533-450c-a8c8-de3f4cdfcd73\") " pod="openshift-dns/node-resolver-w659n"
Apr 21 02:41:09.816240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814413 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9fa5381-4556-4f6c-91d2-c5b4580df414-cnibin\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8"
Apr 21 02:41:09.816240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814424 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d9fa5381-4556-4f6c-91d2-c5b4580df414-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8"
Apr 21 02:41:09.816240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814443 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9cb70894-b83f-4e21-9932-c5cb64320169-cni-binary-copy\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.816240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814470 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-host-run-k8s-cni-cncf-io\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.816240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814517 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-run-openvswitch\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.816240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814543 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e2e0c65b-c05f-485b-b328-922109306697-etc-tuned\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.816240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814567 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-system-cni-dir\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.816240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814591 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cd704587-7472-452b-983f-e375dbc728cc-sys-fs\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk"
Apr 21 02:41:09.816240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814617 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5071ecdc-0d05-412f-b12d-1289b06373ec-env-overrides\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.816240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814609 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-host-run-k8s-cni-cncf-io\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.816240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814643 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgcqh\" (UniqueName: \"kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh\") pod \"network-check-target-5wsx7\" (UID: \"91b77933-1a1d-4dc7-8c31-f6c98c4bea6e\") " pod="openshift-network-diagnostics/network-check-target-5wsx7"
Apr 21 02:41:09.816240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814665 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9cb70894-b83f-4e21-9932-c5cb64320169-multus-daemon-config\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.816240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814693 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ac41a7d4-62b5-4566-bcb2-cd59838f5120-agent-certs\") pod \"konnectivity-agent-zf9vv\" (UID: \"ac41a7d4-62b5-4566-bcb2-cd59838f5120\") " pod="kube-system/konnectivity-agent-zf9vv"
Apr 21 02:41:09.817099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814709 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cd704587-7472-452b-983f-e375dbc728cc-etc-selinux\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk"
Apr 21 02:41:09.817099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814720 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-etc-sysctl-conf\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.817099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814743 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-sys\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.817099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814771 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs\") pod \"network-metrics-daemon-77bqp\" (UID: \"da0d17ad-bc94-4499-bb04-b7e0df549a24\") " pod="openshift-multus/network-metrics-daemon-77bqp"
Apr 21 02:41:09.817099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814785 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9fa5381-4556-4f6c-91d2-c5b4580df414-cnibin\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8"
Apr 21 02:41:09.817099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814796 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-var-lib-kubelet\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.817099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814796 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9fa5381-4556-4f6c-91d2-c5b4580df414-cni-binary-copy\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8"
Apr 21 02:41:09.817099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.813916 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-etc-modprobe-d\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.817099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814823 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9fa5381-4556-4f6c-91d2-c5b4580df414-system-cni-dir\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8"
Apr 21 02:41:09.817099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814850 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-cnibin\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.817099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814861 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-run\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln"
Apr 21 02:41:09.817099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814867 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-run-openvswitch\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.817099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814873 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-hostroot\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.817099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814899 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-systemd-units\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.817099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814915 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-multus-socket-dir-parent\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg"
Apr 21 02:41:09.817099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814941 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-slash\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
Apr 21 02:41:09.817099 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:09.814953 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 02:41:09.817099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814970 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvfwr\" (UniqueName: \"kubernetes.io/projected/da0d17ad-bc94-4499-bb04-b7e0df549a24-kube-api-access-dvfwr\") pod
\"network-metrics-daemon-77bqp\" (UID: \"da0d17ad-bc94-4499-bb04-b7e0df549a24\") " pod="openshift-multus/network-metrics-daemon-77bqp" Apr 21 02:41:09.817706 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.814997 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4a693e96-2533-450c-a8c8-de3f4cdfcd73-hosts-file\") pod \"node-resolver-w659n\" (UID: \"4a693e96-2533-450c-a8c8-de3f4cdfcd73\") " pod="openshift-dns/node-resolver-w659n" Apr 21 02:41:09.817706 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:09.815049 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs podName:da0d17ad-bc94-4499-bb04-b7e0df549a24 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:10.314999136 +0000 UTC m=+3.082487837 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs") pod "network-metrics-daemon-77bqp" (UID: "da0d17ad-bc94-4499-bb04-b7e0df549a24") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:41:09.817706 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.815059 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/78ee76ed-3959-4c8b-8f2c-d057e4bd15db-serviceca\") pod \"node-ca-f97sw\" (UID: \"78ee76ed-3959-4c8b-8f2c-d057e4bd15db\") " pod="openshift-image-registry/node-ca-f97sw" Apr 21 02:41:09.817706 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.815084 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4a693e96-2533-450c-a8c8-de3f4cdfcd73-hosts-file\") pod \"node-resolver-w659n\" (UID: \"4a693e96-2533-450c-a8c8-de3f4cdfcd73\") " pod="openshift-dns/node-resolver-w659n" Apr 21 02:41:09.817706 
ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.815099 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-sys\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 02:41:09.817706 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.815101 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-etc-sysctl-conf\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 02:41:09.817706 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.815134 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2e0c65b-c05f-485b-b328-922109306697-var-lib-kubelet\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 02:41:09.817706 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.815181 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9fa5381-4556-4f6c-91d2-c5b4580df414-system-cni-dir\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8" Apr 21 02:41:09.817706 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.815221 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-host-slash\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.817706 ip-10-0-131-170 
kubenswrapper[2564]: I0421 02:41:09.815283 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5071ecdc-0d05-412f-b12d-1289b06373ec-systemd-units\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.817706 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.815314 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-cnibin\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.817706 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.815341 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-hostroot\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.817706 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.815368 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cd704587-7472-452b-983f-e375dbc728cc-sys-fs\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk" Apr 21 02:41:09.817706 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.815395 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9cb70894-b83f-4e21-9932-c5cb64320169-system-cni-dir\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.817706 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.815440 2564 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd704587-7472-452b-983f-e375dbc728cc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk" Apr 21 02:41:09.817706 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.815547 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9cb70894-b83f-4e21-9932-c5cb64320169-cni-binary-copy\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.817706 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.815768 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5071ecdc-0d05-412f-b12d-1289b06373ec-env-overrides\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.818462 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.815998 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e2e0c65b-c05f-485b-b328-922109306697-tmp\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 02:41:09.818462 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.816146 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5071ecdc-0d05-412f-b12d-1289b06373ec-ovn-node-metrics-cert\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.818462 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.817154 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" 
(UniqueName: \"kubernetes.io/empty-dir/e2e0c65b-c05f-485b-b328-922109306697-etc-tuned\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 02:41:09.818462 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.817726 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ac41a7d4-62b5-4566-bcb2-cd59838f5120-agent-certs\") pod \"konnectivity-agent-zf9vv\" (UID: \"ac41a7d4-62b5-4566-bcb2-cd59838f5120\") " pod="kube-system/konnectivity-agent-zf9vv" Apr 21 02:41:09.819913 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:09.819477 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 02:41:09.819913 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:09.819518 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 02:41:09.819913 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:09.819533 2564 projected.go:194] Error preparing data for projected volume kube-api-access-vgcqh for pod openshift-network-diagnostics/network-check-target-5wsx7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:41:09.819913 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:09.819587 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh podName:91b77933-1a1d-4dc7-8c31-f6c98c4bea6e nodeName:}" failed. No retries permitted until 2026-04-21 02:41:10.319569923 +0000 UTC m=+3.087058600 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vgcqh" (UniqueName: "kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh") pod "network-check-target-5wsx7" (UID: "91b77933-1a1d-4dc7-8c31-f6c98c4bea6e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:41:09.819913 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.819874 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbtv2\" (UniqueName: \"kubernetes.io/projected/e2e0c65b-c05f-485b-b328-922109306697-kube-api-access-fbtv2\") pod \"tuned-w4gln\" (UID: \"e2e0c65b-c05f-485b-b328-922109306697\") " pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 02:41:09.820892 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.820770 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4l69\" (UniqueName: \"kubernetes.io/projected/78ee76ed-3959-4c8b-8f2c-d057e4bd15db-kube-api-access-f4l69\") pod \"node-ca-f97sw\" (UID: \"78ee76ed-3959-4c8b-8f2c-d057e4bd15db\") " pod="openshift-image-registry/node-ca-f97sw" Apr 21 02:41:09.821612 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.821580 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2gnf\" (UniqueName: \"kubernetes.io/projected/21474b65-5da1-4c90-a925-122f9bff65b7-kube-api-access-t2gnf\") pod \"iptables-alerter-8qnnq\" (UID: \"21474b65-5da1-4c90-a925-122f9bff65b7\") " pod="openshift-network-operator/iptables-alerter-8qnnq" Apr 21 02:41:09.822392 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.822359 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87blm\" (UniqueName: \"kubernetes.io/projected/5071ecdc-0d05-412f-b12d-1289b06373ec-kube-api-access-87blm\") pod \"ovnkube-node-p8b6t\" (UID: \"5071ecdc-0d05-412f-b12d-1289b06373ec\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:09.822476 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.822456 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmlrw\" (UniqueName: \"kubernetes.io/projected/d9fa5381-4556-4f6c-91d2-c5b4580df414-kube-api-access-bmlrw\") pod \"multus-additional-cni-plugins-2vcn8\" (UID: \"d9fa5381-4556-4f6c-91d2-c5b4580df414\") " pod="openshift-multus/multus-additional-cni-plugins-2vcn8" Apr 21 02:41:09.823241 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.823222 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpqmb\" (UniqueName: \"kubernetes.io/projected/4a693e96-2533-450c-a8c8-de3f4cdfcd73-kube-api-access-cpqmb\") pod \"node-resolver-w659n\" (UID: \"4a693e96-2533-450c-a8c8-de3f4cdfcd73\") " pod="openshift-dns/node-resolver-w659n" Apr 21 02:41:09.823550 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.823531 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxnfv\" (UniqueName: \"kubernetes.io/projected/9cb70894-b83f-4e21-9932-c5cb64320169-kube-api-access-gxnfv\") pod \"multus-v22pg\" (UID: \"9cb70894-b83f-4e21-9932-c5cb64320169\") " pod="openshift-multus/multus-v22pg" Apr 21 02:41:09.824178 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.824149 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvfwr\" (UniqueName: \"kubernetes.io/projected/da0d17ad-bc94-4499-bb04-b7e0df549a24-kube-api-access-dvfwr\") pod \"network-metrics-daemon-77bqp\" (UID: \"da0d17ad-bc94-4499-bb04-b7e0df549a24\") " pod="openshift-multus/network-metrics-daemon-77bqp" Apr 21 02:41:09.824178 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:09.824173 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfk6t\" (UniqueName: \"kubernetes.io/projected/cd704587-7472-452b-983f-e375dbc728cc-kube-api-access-hfk6t\") pod 
\"aws-ebs-csi-driver-node-nw7mk\" (UID: \"cd704587-7472-452b-983f-e375dbc728cc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk" Apr 21 02:41:10.000650 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.000614 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-w4gln" Apr 21 02:41:10.014218 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.014184 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w659n" Apr 21 02:41:10.022885 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.022864 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2vcn8" Apr 21 02:41:10.027472 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.027455 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-v22pg" Apr 21 02:41:10.035054 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.035036 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8qnnq" Apr 21 02:41:10.042614 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.042596 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk" Apr 21 02:41:10.048162 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.048145 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-f97sw" Apr 21 02:41:10.054684 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.054667 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:10.059276 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.059258 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-zf9vv" Apr 21 02:41:10.317914 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.317840 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs\") pod \"network-metrics-daemon-77bqp\" (UID: \"da0d17ad-bc94-4499-bb04-b7e0df549a24\") " pod="openshift-multus/network-metrics-daemon-77bqp" Apr 21 02:41:10.318057 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:10.317975 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:41:10.318057 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:10.318043 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs podName:da0d17ad-bc94-4499-bb04-b7e0df549a24 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:11.318021331 +0000 UTC m=+4.085510041 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs") pod "network-metrics-daemon-77bqp" (UID: "da0d17ad-bc94-4499-bb04-b7e0df549a24") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:41:10.386471 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:10.386253 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd704587_7472_452b_983f_e375dbc728cc.slice/crio-c289308c61e93213107293d90a65f6d0ba4dd2339953359cd0dc754358ba026e WatchSource:0}: Error finding container c289308c61e93213107293d90a65f6d0ba4dd2339953359cd0dc754358ba026e: Status 404 returned error can't find the container with id c289308c61e93213107293d90a65f6d0ba4dd2339953359cd0dc754358ba026e Apr 21 02:41:10.387721 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:10.387682 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cb70894_b83f_4e21_9932_c5cb64320169.slice/crio-842f302daba82d5ba2c459b2c720a5a33b7c1e59cf4493673f40928e50799a96 WatchSource:0}: Error finding container 842f302daba82d5ba2c459b2c720a5a33b7c1e59cf4493673f40928e50799a96: Status 404 returned error can't find the container with id 842f302daba82d5ba2c459b2c720a5a33b7c1e59cf4493673f40928e50799a96 Apr 21 02:41:10.388469 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:10.388441 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9fa5381_4556_4f6c_91d2_c5b4580df414.slice/crio-4d57cc2032e6c5a9bfc341e379c6e304d3249001a004fceb366a77bf340de8ce WatchSource:0}: Error finding container 4d57cc2032e6c5a9bfc341e379c6e304d3249001a004fceb366a77bf340de8ce: Status 404 returned error can't find the container with id 4d57cc2032e6c5a9bfc341e379c6e304d3249001a004fceb366a77bf340de8ce Apr 21 02:41:10.389039 
ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:10.389017 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac41a7d4_62b5_4566_bcb2_cd59838f5120.slice/crio-fce92b821aaad3b1d44704c4c742fde8943c660865e7ad4a405a29beb82f1043 WatchSource:0}: Error finding container fce92b821aaad3b1d44704c4c742fde8943c660865e7ad4a405a29beb82f1043: Status 404 returned error can't find the container with id fce92b821aaad3b1d44704c4c742fde8943c660865e7ad4a405a29beb82f1043 Apr 21 02:41:10.391008 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:10.390689 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21474b65_5da1_4c90_a925_122f9bff65b7.slice/crio-71d74bd3101776ed5097af59ae233b685be6f6a04abc8b028c86c9404fe96546 WatchSource:0}: Error finding container 71d74bd3101776ed5097af59ae233b685be6f6a04abc8b028c86c9404fe96546: Status 404 returned error can't find the container with id 71d74bd3101776ed5097af59ae233b685be6f6a04abc8b028c86c9404fe96546 Apr 21 02:41:10.392949 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:10.392927 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a693e96_2533_450c_a8c8_de3f4cdfcd73.slice/crio-4981cfd1ce06df0768546dfce1946a17bf0b74225f2ed3871c75244a488d91ca WatchSource:0}: Error finding container 4981cfd1ce06df0768546dfce1946a17bf0b74225f2ed3871c75244a488d91ca: Status 404 returned error can't find the container with id 4981cfd1ce06df0768546dfce1946a17bf0b74225f2ed3871c75244a488d91ca Apr 21 02:41:10.393777 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:10.393753 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2e0c65b_c05f_485b_b328_922109306697.slice/crio-ad5060960468cb79708aa76f7a738cc4cc9a1e54dc2a3e9345d1135f8389d50b WatchSource:0}: 
Error finding container ad5060960468cb79708aa76f7a738cc4cc9a1e54dc2a3e9345d1135f8389d50b: Status 404 returned error can't find the container with id ad5060960468cb79708aa76f7a738cc4cc9a1e54dc2a3e9345d1135f8389d50b Apr 21 02:41:10.395443 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:10.395327 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5071ecdc_0d05_412f_b12d_1289b06373ec.slice/crio-84e95e484252994b4991d98f0e2e81949239841d4c29f7354d68c49106e4a1ee WatchSource:0}: Error finding container 84e95e484252994b4991d98f0e2e81949239841d4c29f7354d68c49106e4a1ee: Status 404 returned error can't find the container with id 84e95e484252994b4991d98f0e2e81949239841d4c29f7354d68c49106e4a1ee Apr 21 02:41:10.396430 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:10.396406 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78ee76ed_3959_4c8b_8f2c_d057e4bd15db.slice/crio-f722000eb3c74804a4eee6190c9dee7e1687bc567358b8b0ef71f1522e0c62b3 WatchSource:0}: Error finding container f722000eb3c74804a4eee6190c9dee7e1687bc567358b8b0ef71f1522e0c62b3: Status 404 returned error can't find the container with id f722000eb3c74804a4eee6190c9dee7e1687bc567358b8b0ef71f1522e0c62b3 Apr 21 02:41:10.418759 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.418736 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgcqh\" (UniqueName: \"kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh\") pod \"network-check-target-5wsx7\" (UID: \"91b77933-1a1d-4dc7-8c31-f6c98c4bea6e\") " pod="openshift-network-diagnostics/network-check-target-5wsx7" Apr 21 02:41:10.418862 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:10.418851 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 02:41:10.418899 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:10.418865 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 02:41:10.418899 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:10.418873 2564 projected.go:194] Error preparing data for projected volume kube-api-access-vgcqh for pod openshift-network-diagnostics/network-check-target-5wsx7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:41:10.418961 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:10.418910 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh podName:91b77933-1a1d-4dc7-8c31-f6c98c4bea6e nodeName:}" failed. No retries permitted until 2026-04-21 02:41:11.418898576 +0000 UTC m=+4.186387243 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vgcqh" (UniqueName: "kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh") pod "network-check-target-5wsx7" (UID: "91b77933-1a1d-4dc7-8c31-f6c98c4bea6e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:41:10.736426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.736324 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 02:36:08 +0000 UTC" deadline="2027-11-26 01:30:00.408203951 +0000 UTC" Apr 21 02:41:10.736426 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.736358 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14014h48m49.671849174s" Apr 21 02:41:10.851191 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.851122 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" event={"ID":"5071ecdc-0d05-412f-b12d-1289b06373ec","Type":"ContainerStarted","Data":"84e95e484252994b4991d98f0e2e81949239841d4c29f7354d68c49106e4a1ee"} Apr 21 02:41:10.856474 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.856409 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-w4gln" event={"ID":"e2e0c65b-c05f-485b-b328-922109306697","Type":"ContainerStarted","Data":"ad5060960468cb79708aa76f7a738cc4cc9a1e54dc2a3e9345d1135f8389d50b"} Apr 21 02:41:10.863931 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.863905 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8qnnq" event={"ID":"21474b65-5da1-4c90-a925-122f9bff65b7","Type":"ContainerStarted","Data":"71d74bd3101776ed5097af59ae233b685be6f6a04abc8b028c86c9404fe96546"} Apr 21 02:41:10.868302 ip-10-0-131-170 
kubenswrapper[2564]: I0421 02:41:10.868275 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zf9vv" event={"ID":"ac41a7d4-62b5-4566-bcb2-cd59838f5120","Type":"ContainerStarted","Data":"fce92b821aaad3b1d44704c4c742fde8943c660865e7ad4a405a29beb82f1043"}
Apr 21 02:41:10.870799 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.870775 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2vcn8" event={"ID":"d9fa5381-4556-4f6c-91d2-c5b4580df414","Type":"ContainerStarted","Data":"4d57cc2032e6c5a9bfc341e379c6e304d3249001a004fceb366a77bf340de8ce"}
Apr 21 02:41:10.873739 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.873715 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v22pg" event={"ID":"9cb70894-b83f-4e21-9932-c5cb64320169","Type":"ContainerStarted","Data":"842f302daba82d5ba2c459b2c720a5a33b7c1e59cf4493673f40928e50799a96"}
Apr 21 02:41:10.881836 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.881812 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w659n" event={"ID":"4a693e96-2533-450c-a8c8-de3f4cdfcd73","Type":"ContainerStarted","Data":"4981cfd1ce06df0768546dfce1946a17bf0b74225f2ed3871c75244a488d91ca"}
Apr 21 02:41:10.887588 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.887562 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk" event={"ID":"cd704587-7472-452b-983f-e375dbc728cc","Type":"ContainerStarted","Data":"c289308c61e93213107293d90a65f6d0ba4dd2339953359cd0dc754358ba026e"}
Apr 21 02:41:10.905401 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.904811 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-170.ec2.internal" event={"ID":"59358f97ed8d625f16c9a2fd8e43b833","Type":"ContainerStarted","Data":"18fbde1da28acceaceeb542735c3bc01a93d3356e0f5d26353e621db9b9b9568"}
Apr 21 02:41:10.911334 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:10.911181 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-f97sw" event={"ID":"78ee76ed-3959-4c8b-8f2c-d057e4bd15db","Type":"ContainerStarted","Data":"f722000eb3c74804a4eee6190c9dee7e1687bc567358b8b0ef71f1522e0c62b3"}
Apr 21 02:41:11.117293 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:11.117258 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 02:41:11.329014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:11.327071 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs\") pod \"network-metrics-daemon-77bqp\" (UID: \"da0d17ad-bc94-4499-bb04-b7e0df549a24\") " pod="openshift-multus/network-metrics-daemon-77bqp"
Apr 21 02:41:11.329014 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:11.327209 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 02:41:11.329014 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:11.327272 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs podName:da0d17ad-bc94-4499-bb04-b7e0df549a24 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:13.327254169 +0000 UTC m=+6.094742841 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs") pod "network-metrics-daemon-77bqp" (UID: "da0d17ad-bc94-4499-bb04-b7e0df549a24") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 02:41:11.428664 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:11.427968 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgcqh\" (UniqueName: \"kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh\") pod \"network-check-target-5wsx7\" (UID: \"91b77933-1a1d-4dc7-8c31-f6c98c4bea6e\") " pod="openshift-network-diagnostics/network-check-target-5wsx7"
Apr 21 02:41:11.428664 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:11.428151 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 02:41:11.428664 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:11.428169 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 02:41:11.428664 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:11.428181 2564 projected.go:194] Error preparing data for projected volume kube-api-access-vgcqh for pod openshift-network-diagnostics/network-check-target-5wsx7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 02:41:11.428664 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:11.428237 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh podName:91b77933-1a1d-4dc7-8c31-f6c98c4bea6e nodeName:}" failed. No retries permitted until 2026-04-21 02:41:13.428218587 +0000 UTC m=+6.195707261 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-vgcqh" (UniqueName: "kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh") pod "network-check-target-5wsx7" (UID: "91b77933-1a1d-4dc7-8c31-f6c98c4bea6e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 02:41:11.842376 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:11.841637 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5wsx7"
Apr 21 02:41:11.842376 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:11.841753 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5wsx7" podUID="91b77933-1a1d-4dc7-8c31-f6c98c4bea6e"
Apr 21 02:41:11.842376 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:11.842209 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-77bqp"
Apr 21 02:41:11.842376 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:11.842309 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-77bqp" podUID="da0d17ad-bc94-4499-bb04-b7e0df549a24"
Apr 21 02:41:11.923168 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:11.922403 2564 generic.go:358] "Generic (PLEG): container finished" podID="6a9357b3541c0f034bdee512e5b740bf" containerID="cd477e496ea77bb0b984552d8bb78601f684e4e9c14ab005111a83993f4aa7f1" exitCode=0
Apr 21 02:41:11.923168 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:11.923125 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal" event={"ID":"6a9357b3541c0f034bdee512e5b740bf","Type":"ContainerDied","Data":"cd477e496ea77bb0b984552d8bb78601f684e4e9c14ab005111a83993f4aa7f1"}
Apr 21 02:41:11.936490 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:11.935720 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-170.ec2.internal" podStartSLOduration=3.935704649 podStartE2EDuration="3.935704649s" podCreationTimestamp="2026-04-21 02:41:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:41:10.921347527 +0000 UTC m=+3.688836217" watchObservedRunningTime="2026-04-21 02:41:11.935704649 +0000 UTC m=+4.703193340"
Apr 21 02:41:12.933455 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:12.933413 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal" event={"ID":"6a9357b3541c0f034bdee512e5b740bf","Type":"ContainerStarted","Data":"31fda71fed9d6af482428b2e9fa345aaf308661d4f6af5bd67e2a3b2f20a7e41"}
Apr 21 02:41:13.342763 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:13.342730 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs\") pod \"network-metrics-daemon-77bqp\" (UID: \"da0d17ad-bc94-4499-bb04-b7e0df549a24\") " pod="openshift-multus/network-metrics-daemon-77bqp"
Apr 21 02:41:13.342926 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:13.342856 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 02:41:13.342926 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:13.342908 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs podName:da0d17ad-bc94-4499-bb04-b7e0df549a24 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:17.342890259 +0000 UTC m=+10.110378940 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs") pod "network-metrics-daemon-77bqp" (UID: "da0d17ad-bc94-4499-bb04-b7e0df549a24") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 02:41:13.443936 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:13.443898 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgcqh\" (UniqueName: \"kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh\") pod \"network-check-target-5wsx7\" (UID: \"91b77933-1a1d-4dc7-8c31-f6c98c4bea6e\") " pod="openshift-network-diagnostics/network-check-target-5wsx7"
Apr 21 02:41:13.444113 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:13.444069 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 02:41:13.444113 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:13.444089 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 02:41:13.444113 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:13.444104 2564 projected.go:194] Error preparing data for projected volume kube-api-access-vgcqh for pod openshift-network-diagnostics/network-check-target-5wsx7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 02:41:13.444269 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:13.444172 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh podName:91b77933-1a1d-4dc7-8c31-f6c98c4bea6e nodeName:}" failed. No retries permitted until 2026-04-21 02:41:17.444150952 +0000 UTC m=+10.211639621 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-vgcqh" (UniqueName: "kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh") pod "network-check-target-5wsx7" (UID: "91b77933-1a1d-4dc7-8c31-f6c98c4bea6e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 02:41:13.841522 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:13.840805 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-77bqp"
Apr 21 02:41:13.841522 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:13.840828 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5wsx7"
Apr 21 02:41:13.841522 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:13.840953 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-77bqp" podUID="da0d17ad-bc94-4499-bb04-b7e0df549a24"
Apr 21 02:41:13.841522 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:13.841093 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5wsx7" podUID="91b77933-1a1d-4dc7-8c31-f6c98c4bea6e"
Apr 21 02:41:15.841080 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:15.841047 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-77bqp"
Apr 21 02:41:15.841522 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:15.841189 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-77bqp" podUID="da0d17ad-bc94-4499-bb04-b7e0df549a24"
Apr 21 02:41:15.841522 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:15.841432 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5wsx7"
Apr 21 02:41:15.841634 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:15.841551 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5wsx7" podUID="91b77933-1a1d-4dc7-8c31-f6c98c4bea6e"
Apr 21 02:41:17.376252 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:17.375690 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs\") pod \"network-metrics-daemon-77bqp\" (UID: \"da0d17ad-bc94-4499-bb04-b7e0df549a24\") " pod="openshift-multus/network-metrics-daemon-77bqp"
Apr 21 02:41:17.376252 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:17.375831 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 02:41:17.376252 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:17.375892 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs podName:da0d17ad-bc94-4499-bb04-b7e0df549a24 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:25.375872859 +0000 UTC m=+18.143361530 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs") pod "network-metrics-daemon-77bqp" (UID: "da0d17ad-bc94-4499-bb04-b7e0df549a24") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 02:41:17.476893 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:17.476819 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgcqh\" (UniqueName: \"kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh\") pod \"network-check-target-5wsx7\" (UID: \"91b77933-1a1d-4dc7-8c31-f6c98c4bea6e\") " pod="openshift-network-diagnostics/network-check-target-5wsx7"
Apr 21 02:41:17.477047 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:17.476976 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 02:41:17.477047 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:17.476997 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 02:41:17.477047 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:17.477009 2564 projected.go:194] Error preparing data for projected volume kube-api-access-vgcqh for pod openshift-network-diagnostics/network-check-target-5wsx7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 02:41:17.477213 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:17.477065 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh podName:91b77933-1a1d-4dc7-8c31-f6c98c4bea6e nodeName:}" failed. No retries permitted until 2026-04-21 02:41:25.47704804 +0000 UTC m=+18.244536707 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-vgcqh" (UniqueName: "kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh") pod "network-check-target-5wsx7" (UID: "91b77933-1a1d-4dc7-8c31-f6c98c4bea6e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 02:41:17.841995 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:17.841537 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-77bqp"
Apr 21 02:41:17.841995 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:17.841652 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-77bqp" podUID="da0d17ad-bc94-4499-bb04-b7e0df549a24"
Apr 21 02:41:17.842248 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:17.842118 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5wsx7"
Apr 21 02:41:17.842248 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:17.842217 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5wsx7" podUID="91b77933-1a1d-4dc7-8c31-f6c98c4bea6e"
Apr 21 02:41:19.840838 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:19.840794 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-77bqp"
Apr 21 02:41:19.841226 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:19.840945 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-77bqp" podUID="da0d17ad-bc94-4499-bb04-b7e0df549a24"
Apr 21 02:41:19.841226 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:19.841000 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5wsx7"
Apr 21 02:41:19.841226 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:19.841100 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5wsx7" podUID="91b77933-1a1d-4dc7-8c31-f6c98c4bea6e"
Apr 21 02:41:21.841128 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:21.841091 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5wsx7"
Apr 21 02:41:21.841537 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:21.841091 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-77bqp"
Apr 21 02:41:21.841537 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:21.841239 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5wsx7" podUID="91b77933-1a1d-4dc7-8c31-f6c98c4bea6e"
Apr 21 02:41:21.841537 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:21.841309 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-77bqp" podUID="da0d17ad-bc94-4499-bb04-b7e0df549a24"
Apr 21 02:41:23.841213 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:23.841183 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-77bqp"
Apr 21 02:41:23.841752 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:23.841312 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-77bqp" podUID="da0d17ad-bc94-4499-bb04-b7e0df549a24"
Apr 21 02:41:23.841752 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:23.841379 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5wsx7"
Apr 21 02:41:23.841752 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:23.841458 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5wsx7" podUID="91b77933-1a1d-4dc7-8c31-f6c98c4bea6e"
Apr 21 02:41:25.432176 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:25.432141 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs\") pod \"network-metrics-daemon-77bqp\" (UID: \"da0d17ad-bc94-4499-bb04-b7e0df549a24\") " pod="openshift-multus/network-metrics-daemon-77bqp"
Apr 21 02:41:25.432603 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:25.432276 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 02:41:25.432603 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:25.432338 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs podName:da0d17ad-bc94-4499-bb04-b7e0df549a24 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:41.432323503 +0000 UTC m=+34.199812170 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs") pod "network-metrics-daemon-77bqp" (UID: "da0d17ad-bc94-4499-bb04-b7e0df549a24") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 02:41:25.532851 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:25.532820 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgcqh\" (UniqueName: \"kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh\") pod \"network-check-target-5wsx7\" (UID: \"91b77933-1a1d-4dc7-8c31-f6c98c4bea6e\") " pod="openshift-network-diagnostics/network-check-target-5wsx7"
Apr 21 02:41:25.533014 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:25.532983 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 02:41:25.533073 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:25.533029 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 02:41:25.533073 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:25.533043 2564 projected.go:194] Error preparing data for projected volume kube-api-access-vgcqh for pod openshift-network-diagnostics/network-check-target-5wsx7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 02:41:25.533179 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:25.533104 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh podName:91b77933-1a1d-4dc7-8c31-f6c98c4bea6e nodeName:}" failed. No retries permitted until 2026-04-21 02:41:41.533084772 +0000 UTC m=+34.300573451 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-vgcqh" (UniqueName: "kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh") pod "network-check-target-5wsx7" (UID: "91b77933-1a1d-4dc7-8c31-f6c98c4bea6e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 02:41:25.844270 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:25.844241 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5wsx7"
Apr 21 02:41:25.844425 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:25.844241 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-77bqp"
Apr 21 02:41:25.844425 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:25.844357 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5wsx7" podUID="91b77933-1a1d-4dc7-8c31-f6c98c4bea6e"
Apr 21 02:41:25.844567 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:25.844485 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-77bqp" podUID="da0d17ad-bc94-4499-bb04-b7e0df549a24"
Apr 21 02:41:27.843425 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:27.843283 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-77bqp"
Apr 21 02:41:27.843897 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:27.843283 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5wsx7"
Apr 21 02:41:27.843897 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:27.843522 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-77bqp" podUID="da0d17ad-bc94-4499-bb04-b7e0df549a24"
Apr 21 02:41:27.843897 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:27.843583 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5wsx7" podUID="91b77933-1a1d-4dc7-8c31-f6c98c4bea6e"
Apr 21 02:41:27.958390 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:27.958365 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-f97sw" event={"ID":"78ee76ed-3959-4c8b-8f2c-d057e4bd15db","Type":"ContainerStarted","Data":"995d3853d2b25439e9b11c1141787b584fe5384d4a1028a9f35e4af30005e4c6"}
Apr 21 02:41:27.959967 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:27.959941 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" event={"ID":"5071ecdc-0d05-412f-b12d-1289b06373ec","Type":"ContainerStarted","Data":"2086c54bd9030176e525754b523fc790ae7eb63dc183466c6cc00c367706236f"}
Apr 21 02:41:27.960051 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:27.959971 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" event={"ID":"5071ecdc-0d05-412f-b12d-1289b06373ec","Type":"ContainerStarted","Data":"89cad99d195e98aee47b6a4ca507d020c448d24e52b0b5c55806e1e41e24d863"}
Apr 21 02:41:27.961018 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:27.960999 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-w4gln" event={"ID":"e2e0c65b-c05f-485b-b328-922109306697","Type":"ContainerStarted","Data":"d81adc4e04ebada42fe1a81564090388e14e42bb93b77152b781e5ce258b259f"}
Apr 21 02:41:27.962157 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:27.962131 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zf9vv" event={"ID":"ac41a7d4-62b5-4566-bcb2-cd59838f5120","Type":"ContainerStarted","Data":"f79249c270f7da6407638ececdc6317c99e4c18d7951696940649521c5e8154b"}
Apr 21 02:41:27.963244 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:27.963223 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2vcn8" event={"ID":"d9fa5381-4556-4f6c-91d2-c5b4580df414","Type":"ContainerStarted","Data":"179370889f95ce1b6ceca5aed8f9b06d2a552a1c4eed28e0375cbb64d9d8b268"}
Apr 21 02:41:27.964341 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:27.964319 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v22pg" event={"ID":"9cb70894-b83f-4e21-9932-c5cb64320169","Type":"ContainerStarted","Data":"759a0165d770e723fd6839d73a21cec0e8fbdaf3b1b35cc8688508685c8b4d8b"}
Apr 21 02:41:27.965343 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:27.965323 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w659n" event={"ID":"4a693e96-2533-450c-a8c8-de3f4cdfcd73","Type":"ContainerStarted","Data":"22e10f0cea92a0d081896f87d620bd45632457f9817b89ed99c137e3ee472622"}
Apr 21 02:41:27.966572 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:27.966551 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk" event={"ID":"cd704587-7472-452b-983f-e375dbc728cc","Type":"ContainerStarted","Data":"39498c531fd498a677bce146985a69b8b6d2b6bd4e57177d7f2b5401e1570c97"}
Apr 21 02:41:27.971538 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:27.971489 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-f97sw" podStartSLOduration=2.897291573 podStartE2EDuration="19.971479417s" podCreationTimestamp="2026-04-21 02:41:08 +0000 UTC" firstStartedPulling="2026-04-21 02:41:10.398102953 +0000 UTC m=+3.165591619" lastFinishedPulling="2026-04-21 02:41:27.47229078 +0000 UTC m=+20.239779463" observedRunningTime="2026-04-21 02:41:27.971050666 +0000 UTC m=+20.738539355" watchObservedRunningTime="2026-04-21 02:41:27.971479417 +0000 UTC m=+20.738968105"
Apr 21 02:41:27.971799 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:27.971780 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-170.ec2.internal" podStartSLOduration=19.971775924 podStartE2EDuration="19.971775924s" podCreationTimestamp="2026-04-21 02:41:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:41:12.964855033 +0000 UTC m=+5.732343724" watchObservedRunningTime="2026-04-21 02:41:27.971775924 +0000 UTC m=+20.739264613"
Apr 21 02:41:27.985526 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:27.985466 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-w4gln" podStartSLOduration=3.909394749 podStartE2EDuration="20.985451992s" podCreationTimestamp="2026-04-21 02:41:07 +0000 UTC" firstStartedPulling="2026-04-21 02:41:10.396462514 +0000 UTC m=+3.163951195" lastFinishedPulling="2026-04-21 02:41:27.472519762 +0000 UTC m=+20.240008438" observedRunningTime="2026-04-21 02:41:27.985214554 +0000 UTC m=+20.752703237" watchObservedRunningTime="2026-04-21 02:41:27.985451992 +0000 UTC m=+20.752940683"
Apr 21 02:41:27.999767 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:27.999735 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w659n" podStartSLOduration=3.921946028 podStartE2EDuration="20.999727119s" podCreationTimestamp="2026-04-21 02:41:07 +0000 UTC" firstStartedPulling="2026-04-21 02:41:10.394578341 +0000 UTC m=+3.162067023" lastFinishedPulling="2026-04-21 02:41:27.472359446 +0000 UTC m=+20.239848114" observedRunningTime="2026-04-21 02:41:27.999628867 +0000 UTC m=+20.767117556" watchObservedRunningTime="2026-04-21 02:41:27.999727119 +0000 UTC m=+20.767215804"
Apr 21 02:41:28.017392 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:28.017358 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-v22pg" podStartSLOduration=3.9153934809999997 podStartE2EDuration="21.017348405s" podCreationTimestamp="2026-04-21 02:41:07 +0000 UTC" firstStartedPulling="2026-04-21 02:41:10.389614333 +0000 UTC m=+3.157103015" lastFinishedPulling="2026-04-21 02:41:27.491569272 +0000 UTC m=+20.259057939" observedRunningTime="2026-04-21 02:41:28.016893377 +0000 UTC m=+20.784382087" watchObservedRunningTime="2026-04-21 02:41:28.017348405 +0000 UTC m=+20.784837094"
Apr 21 02:41:28.047593 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:28.047565 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-zf9vv" podStartSLOduration=2.966152582 podStartE2EDuration="20.047555552s" podCreationTimestamp="2026-04-21 02:41:08 +0000 UTC" firstStartedPulling="2026-04-21 02:41:10.390957504 +0000 UTC m=+3.158446175" lastFinishedPulling="2026-04-21 02:41:27.472360474 +0000 UTC m=+20.239849145" observedRunningTime="2026-04-21 02:41:28.047466838 +0000 UTC m=+20.814955527" watchObservedRunningTime="2026-04-21 02:41:28.047555552 +0000 UTC m=+20.815044240"
Apr 21 02:41:28.768069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:28.767920 2564 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 21 02:41:28.769547 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:28.769459 2564 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T02:41:28.768066352Z","UUID":"fdcdb19c-f298-4a99-9c0c-59d74a94f42c","Handler":null,"Name":"","Endpoint":""}
Apr 21 02:41:28.772272 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:28.772149 2564 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 21 02:41:28.772358 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:28.772279 2564 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 21 02:41:28.969959 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:28.969931 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk" event={"ID":"cd704587-7472-452b-983f-e375dbc728cc","Type":"ContainerStarted","Data":"6c08f69f05a9a089225847a083f3eeea8840ef10f7f9f83870472a9d7ad7b7a5"}
Apr 21 02:41:28.972291 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:28.972274 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/ovn-acl-logging/0.log"
Apr 21 02:41:28.972613 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:28.972593 2564 generic.go:358] "Generic (PLEG): container finished" podID="5071ecdc-0d05-412f-b12d-1289b06373ec" containerID="2086c54bd9030176e525754b523fc790ae7eb63dc183466c6cc00c367706236f" exitCode=1
Apr 21 02:41:28.972681 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:28.972661 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" event={"ID":"5071ecdc-0d05-412f-b12d-1289b06373ec","Type":"ContainerDied","Data":"2086c54bd9030176e525754b523fc790ae7eb63dc183466c6cc00c367706236f"}
Apr 21 02:41:28.972729 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:28.972691 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" event={"ID":"5071ecdc-0d05-412f-b12d-1289b06373ec","Type":"ContainerStarted","Data":"973b0bcdef4a74f8c570ab63386a20e095c6ba8488a2578e73205aa6e598ca1d"}
Apr 21 02:41:28.972729 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:28.972703 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t"
event={"ID":"5071ecdc-0d05-412f-b12d-1289b06373ec","Type":"ContainerStarted","Data":"dacb2b2b30ef20fea8627b9401d0afbb1d3fa8915160f772107a44333223584c"} Apr 21 02:41:28.972729 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:28.972713 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" event={"ID":"5071ecdc-0d05-412f-b12d-1289b06373ec","Type":"ContainerStarted","Data":"d49f8ab5c83a2e6ff8671f7fee88810d6c0180c865c831b3daa5593c1ad16dcc"} Apr 21 02:41:28.972729 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:28.972721 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" event={"ID":"5071ecdc-0d05-412f-b12d-1289b06373ec","Type":"ContainerStarted","Data":"7ba3c6926b035b3fee65d6ed1688336c3d08680a5027176de3d480f6620f8e4c"} Apr 21 02:41:28.973819 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:28.973796 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8qnnq" event={"ID":"21474b65-5da1-4c90-a925-122f9bff65b7","Type":"ContainerStarted","Data":"e50c77f3f0a12db2f13f172d945cba3ed53335fbcae29778d57d57ecb88383ad"} Apr 21 02:41:28.975114 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:28.975092 2564 generic.go:358] "Generic (PLEG): container finished" podID="d9fa5381-4556-4f6c-91d2-c5b4580df414" containerID="179370889f95ce1b6ceca5aed8f9b06d2a552a1c4eed28e0375cbb64d9d8b268" exitCode=0 Apr 21 02:41:28.975208 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:28.975116 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2vcn8" event={"ID":"d9fa5381-4556-4f6c-91d2-c5b4580df414","Type":"ContainerDied","Data":"179370889f95ce1b6ceca5aed8f9b06d2a552a1c4eed28e0375cbb64d9d8b268"} Apr 21 02:41:28.987086 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:28.987047 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-operator/iptables-alerter-8qnnq" podStartSLOduration=3.904537448 podStartE2EDuration="20.987037797s" podCreationTimestamp="2026-04-21 02:41:08 +0000 UTC" firstStartedPulling="2026-04-21 02:41:10.392075694 +0000 UTC m=+3.159564370" lastFinishedPulling="2026-04-21 02:41:27.474576044 +0000 UTC m=+20.242064719" observedRunningTime="2026-04-21 02:41:28.986620846 +0000 UTC m=+21.754109535" watchObservedRunningTime="2026-04-21 02:41:28.987037797 +0000 UTC m=+21.754526522" Apr 21 02:41:29.840508 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:29.840473 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-77bqp" Apr 21 02:41:29.840647 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:29.840612 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-77bqp" podUID="da0d17ad-bc94-4499-bb04-b7e0df549a24" Apr 21 02:41:29.840711 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:29.840653 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5wsx7" Apr 21 02:41:29.840766 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:29.840728 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5wsx7" podUID="91b77933-1a1d-4dc7-8c31-f6c98c4bea6e" Apr 21 02:41:29.978544 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:29.978450 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk" event={"ID":"cd704587-7472-452b-983f-e375dbc728cc","Type":"ContainerStarted","Data":"8e948e3ebc7251256dcf60d7237998f256d3d724d8de4ea93b0b41f303a21250"} Apr 21 02:41:29.996657 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:29.996620 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nw7mk" podStartSLOduration=2.713871796 podStartE2EDuration="21.996607841s" podCreationTimestamp="2026-04-21 02:41:08 +0000 UTC" firstStartedPulling="2026-04-21 02:41:10.387990936 +0000 UTC m=+3.155479616" lastFinishedPulling="2026-04-21 02:41:29.670726988 +0000 UTC m=+22.438215661" observedRunningTime="2026-04-21 02:41:29.996304543 +0000 UTC m=+22.763793237" watchObservedRunningTime="2026-04-21 02:41:29.996607841 +0000 UTC m=+22.764096550" Apr 21 02:41:30.983922 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:30.983893 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/ovn-acl-logging/0.log" Apr 21 02:41:30.984436 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:30.984277 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" event={"ID":"5071ecdc-0d05-412f-b12d-1289b06373ec","Type":"ContainerStarted","Data":"5af0e03a41526dce76a4a9424151f34053e4f2f70439de5ae968748e2f409e1f"} Apr 21 02:41:31.840982 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:31.840952 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5wsx7" Apr 21 02:41:31.841164 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:31.841062 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5wsx7" podUID="91b77933-1a1d-4dc7-8c31-f6c98c4bea6e" Apr 21 02:41:31.841267 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:31.840952 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-77bqp" Apr 21 02:41:31.841404 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:31.841377 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-77bqp" podUID="da0d17ad-bc94-4499-bb04-b7e0df549a24" Apr 21 02:41:32.768716 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:32.768683 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-zf9vv" Apr 21 02:41:32.769305 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:32.769239 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-zf9vv" Apr 21 02:41:32.992672 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:32.992288 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/ovn-acl-logging/0.log" Apr 21 02:41:32.993015 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:32.992887 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" event={"ID":"5071ecdc-0d05-412f-b12d-1289b06373ec","Type":"ContainerStarted","Data":"e735e73c0ebc202cc5a2280609f42429e079147480559d97ee71b42115ea0529"} Apr 21 02:41:32.993255 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:32.993083 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-zf9vv" Apr 21 02:41:32.993640 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:32.993440 2564 scope.go:117] "RemoveContainer" containerID="2086c54bd9030176e525754b523fc790ae7eb63dc183466c6cc00c367706236f" Apr 21 02:41:32.993975 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:32.993823 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-zf9vv" Apr 21 02:41:33.840451 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:33.840413 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5wsx7" Apr 21 02:41:33.841070 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:33.840413 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-77bqp" Apr 21 02:41:33.841070 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:33.840548 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5wsx7" podUID="91b77933-1a1d-4dc7-8c31-f6c98c4bea6e" Apr 21 02:41:33.841070 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:33.840602 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-77bqp" podUID="da0d17ad-bc94-4499-bb04-b7e0df549a24" Apr 21 02:41:33.995880 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:33.995841 2564 generic.go:358] "Generic (PLEG): container finished" podID="d9fa5381-4556-4f6c-91d2-c5b4580df414" containerID="dc30450bebcd331e3f4b18959d57bc16a806a20d90ea9a716e208c61e8a9ae6d" exitCode=0 Apr 21 02:41:33.995993 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:33.995876 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2vcn8" event={"ID":"d9fa5381-4556-4f6c-91d2-c5b4580df414","Type":"ContainerDied","Data":"dc30450bebcd331e3f4b18959d57bc16a806a20d90ea9a716e208c61e8a9ae6d"} Apr 21 02:41:33.999020 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:33.999006 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/ovn-acl-logging/0.log" Apr 21 02:41:33.999382 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:33.999362 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" event={"ID":"5071ecdc-0d05-412f-b12d-1289b06373ec","Type":"ContainerStarted","Data":"ccc950df855613199c10506150387645ebf49b55c31ffba183040cd1a6fbd258"} Apr 21 02:41:34.000130 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:33.999620 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:34.000130 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:33.999642 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:34.000130 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:33.999651 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:34.013808 ip-10-0-131-170 kubenswrapper[2564]: 
I0421 02:41:34.013783 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:34.013882 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:34.013856 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:41:34.037399 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:34.037365 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" podStartSLOduration=8.916527913 podStartE2EDuration="26.037353195s" podCreationTimestamp="2026-04-21 02:41:08 +0000 UTC" firstStartedPulling="2026-04-21 02:41:10.39735188 +0000 UTC m=+3.164840547" lastFinishedPulling="2026-04-21 02:41:27.518177157 +0000 UTC m=+20.285665829" observedRunningTime="2026-04-21 02:41:34.036800978 +0000 UTC m=+26.804289693" watchObservedRunningTime="2026-04-21 02:41:34.037353195 +0000 UTC m=+26.804841883" Apr 21 02:41:34.881364 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:34.881102 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-77bqp"] Apr 21 02:41:34.881784 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:34.881445 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-77bqp" Apr 21 02:41:34.881784 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:34.881573 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-77bqp" podUID="da0d17ad-bc94-4499-bb04-b7e0df549a24" Apr 21 02:41:34.881784 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:34.881696 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5wsx7"] Apr 21 02:41:34.881946 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:34.881788 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5wsx7" Apr 21 02:41:34.881946 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:34.881864 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5wsx7" podUID="91b77933-1a1d-4dc7-8c31-f6c98c4bea6e" Apr 21 02:41:35.003309 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:35.003283 2564 generic.go:358] "Generic (PLEG): container finished" podID="d9fa5381-4556-4f6c-91d2-c5b4580df414" containerID="f601f2f0a595c454fa8af0bd5626582a22cc2b38423850b849f8b2aa16f3c7f5" exitCode=0 Apr 21 02:41:35.003444 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:35.003372 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2vcn8" event={"ID":"d9fa5381-4556-4f6c-91d2-c5b4580df414","Type":"ContainerDied","Data":"f601f2f0a595c454fa8af0bd5626582a22cc2b38423850b849f8b2aa16f3c7f5"} Apr 21 02:41:36.007514 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:36.007461 2564 generic.go:358] "Generic (PLEG): container finished" podID="d9fa5381-4556-4f6c-91d2-c5b4580df414" containerID="d095c5441ec9113a03eab08940ed67c8016e1be62e9c6d2fd921ae9310ad9cec" exitCode=0 Apr 21 02:41:36.007941 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:36.007531 2564 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2vcn8" event={"ID":"d9fa5381-4556-4f6c-91d2-c5b4580df414","Type":"ContainerDied","Data":"d095c5441ec9113a03eab08940ed67c8016e1be62e9c6d2fd921ae9310ad9cec"} Apr 21 02:41:36.840651 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:36.840623 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5wsx7" Apr 21 02:41:36.840799 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:36.840627 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-77bqp" Apr 21 02:41:36.840799 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:36.840735 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5wsx7" podUID="91b77933-1a1d-4dc7-8c31-f6c98c4bea6e" Apr 21 02:41:36.840916 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:36.840810 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-77bqp" podUID="da0d17ad-bc94-4499-bb04-b7e0df549a24" Apr 21 02:41:38.840509 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:38.840480 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5wsx7" Apr 21 02:41:38.841252 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:38.840525 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-77bqp" Apr 21 02:41:38.841252 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:38.840602 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5wsx7" podUID="91b77933-1a1d-4dc7-8c31-f6c98c4bea6e" Apr 21 02:41:38.841252 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:38.840759 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-77bqp" podUID="da0d17ad-bc94-4499-bb04-b7e0df549a24" Apr 21 02:41:40.553906 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.553876 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-170.ec2.internal" event="NodeReady" Apr 21 02:41:40.554384 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.554052 2564 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 02:41:40.612812 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.612659 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4mh7f"] Apr 21 02:41:40.616995 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.616940 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wlfmp"] Apr 21 02:41:40.617137 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.617114 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4mh7f" Apr 21 02:41:40.619474 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.619352 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 02:41:40.619474 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.619362 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6wzzj\"" Apr 21 02:41:40.619474 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.619377 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 02:41:40.619853 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.619829 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wlfmp" Apr 21 02:41:40.621931 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.621912 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 02:41:40.621931 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.621917 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 02:41:40.622095 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.621952 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 02:41:40.622301 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.622283 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lm7rd\"" Apr 21 02:41:40.627330 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.627310 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wlfmp"] Apr 21 02:41:40.627422 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.627343 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4mh7f"] Apr 21 02:41:40.753301 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.753271 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65rrl\" (UniqueName: \"kubernetes.io/projected/44863844-48fa-48ba-82c7-863e1d932c72-kube-api-access-65rrl\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 02:41:40.753467 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.753316 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvl8p\" (UniqueName: 
\"kubernetes.io/projected/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-kube-api-access-qvl8p\") pod \"ingress-canary-wlfmp\" (UID: \"c2a0c6c7-f298-43d3-a0c2-a33cca035a70\") " pod="openshift-ingress-canary/ingress-canary-wlfmp" Apr 21 02:41:40.753467 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.753354 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 02:41:40.753467 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.753397 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44863844-48fa-48ba-82c7-863e1d932c72-tmp-dir\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 02:41:40.753467 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.753417 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44863844-48fa-48ba-82c7-863e1d932c72-config-volume\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 02:41:40.753714 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.753482 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert\") pod \"ingress-canary-wlfmp\" (UID: \"c2a0c6c7-f298-43d3-a0c2-a33cca035a70\") " pod="openshift-ingress-canary/ingress-canary-wlfmp" Apr 21 02:41:40.840835 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.840807 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5wsx7" Apr 21 02:41:40.841003 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.840807 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-77bqp" Apr 21 02:41:40.843266 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.843241 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 02:41:40.843406 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.843349 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-j2fpz\"" Apr 21 02:41:40.843625 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.843467 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 02:41:40.843625 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.843516 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 02:41:40.843625 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.843581 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nsvjd\"" Apr 21 02:41:40.854199 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.854176 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44863844-48fa-48ba-82c7-863e1d932c72-config-volume\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 02:41:40.854310 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.854234 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert\") pod \"ingress-canary-wlfmp\" (UID: \"c2a0c6c7-f298-43d3-a0c2-a33cca035a70\") " pod="openshift-ingress-canary/ingress-canary-wlfmp" Apr 21 02:41:40.854310 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.854257 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65rrl\" (UniqueName: \"kubernetes.io/projected/44863844-48fa-48ba-82c7-863e1d932c72-kube-api-access-65rrl\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 02:41:40.854310 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.854276 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvl8p\" (UniqueName: \"kubernetes.io/projected/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-kube-api-access-qvl8p\") pod \"ingress-canary-wlfmp\" (UID: \"c2a0c6c7-f298-43d3-a0c2-a33cca035a70\") " pod="openshift-ingress-canary/ingress-canary-wlfmp" Apr 21 02:41:40.854310 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.854295 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 02:41:40.854556 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:40.854334 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:40.854556 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.854355 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44863844-48fa-48ba-82c7-863e1d932c72-tmp-dir\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 
02:41:40.854656 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:40.854608 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert podName:c2a0c6c7-f298-43d3-a0c2-a33cca035a70 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:41.354581849 +0000 UTC m=+34.122070535 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert") pod "ingress-canary-wlfmp" (UID: "c2a0c6c7-f298-43d3-a0c2-a33cca035a70") : secret "canary-serving-cert" not found Apr 21 02:41:40.854656 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:40.854617 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:40.854741 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:40.854660 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls podName:44863844-48fa-48ba-82c7-863e1d932c72 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:41.354646386 +0000 UTC m=+34.122135058 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls") pod "dns-default-4mh7f" (UID: "44863844-48fa-48ba-82c7-863e1d932c72") : secret "dns-default-metrics-tls" not found Apr 21 02:41:40.854904 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.854882 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44863844-48fa-48ba-82c7-863e1d932c72-tmp-dir\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 02:41:40.854994 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.854977 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44863844-48fa-48ba-82c7-863e1d932c72-config-volume\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 02:41:40.865548 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.865527 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65rrl\" (UniqueName: \"kubernetes.io/projected/44863844-48fa-48ba-82c7-863e1d932c72-kube-api-access-65rrl\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 02:41:40.865714 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:40.865698 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvl8p\" (UniqueName: \"kubernetes.io/projected/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-kube-api-access-qvl8p\") pod \"ingress-canary-wlfmp\" (UID: \"c2a0c6c7-f298-43d3-a0c2-a33cca035a70\") " pod="openshift-ingress-canary/ingress-canary-wlfmp" Apr 21 02:41:41.357834 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:41.357802 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert\") pod \"ingress-canary-wlfmp\" (UID: \"c2a0c6c7-f298-43d3-a0c2-a33cca035a70\") " pod="openshift-ingress-canary/ingress-canary-wlfmp" Apr 21 02:41:41.358027 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:41.357854 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 02:41:41.358027 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:41.357979 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:41.358155 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:41.357980 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:41.358155 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:41.358057 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls podName:44863844-48fa-48ba-82c7-863e1d932c72 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:42.358037478 +0000 UTC m=+35.125526151 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls") pod "dns-default-4mh7f" (UID: "44863844-48fa-48ba-82c7-863e1d932c72") : secret "dns-default-metrics-tls" not found Apr 21 02:41:41.358155 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:41.358131 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert podName:c2a0c6c7-f298-43d3-a0c2-a33cca035a70 nodeName:}" failed. 
No retries permitted until 2026-04-21 02:41:42.358105004 +0000 UTC m=+35.125593682 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert") pod "ingress-canary-wlfmp" (UID: "c2a0c6c7-f298-43d3-a0c2-a33cca035a70") : secret "canary-serving-cert" not found Apr 21 02:41:41.458303 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:41.458273 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs\") pod \"network-metrics-daemon-77bqp\" (UID: \"da0d17ad-bc94-4499-bb04-b7e0df549a24\") " pod="openshift-multus/network-metrics-daemon-77bqp" Apr 21 02:41:41.458478 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:41.458427 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 02:41:41.458565 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:41.458517 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs podName:da0d17ad-bc94-4499-bb04-b7e0df549a24 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:13.458482121 +0000 UTC m=+66.225970791 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs") pod "network-metrics-daemon-77bqp" (UID: "da0d17ad-bc94-4499-bb04-b7e0df549a24") : secret "metrics-daemon-secret" not found Apr 21 02:41:41.559352 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:41.559321 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgcqh\" (UniqueName: \"kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh\") pod \"network-check-target-5wsx7\" (UID: \"91b77933-1a1d-4dc7-8c31-f6c98c4bea6e\") " pod="openshift-network-diagnostics/network-check-target-5wsx7" Apr 21 02:41:41.562486 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:41.562459 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgcqh\" (UniqueName: \"kubernetes.io/projected/91b77933-1a1d-4dc7-8c31-f6c98c4bea6e-kube-api-access-vgcqh\") pod \"network-check-target-5wsx7\" (UID: \"91b77933-1a1d-4dc7-8c31-f6c98c4bea6e\") " pod="openshift-network-diagnostics/network-check-target-5wsx7" Apr 21 02:41:41.752907 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:41.752884 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5wsx7" Apr 21 02:41:41.910198 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:41.910014 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5wsx7"] Apr 21 02:41:41.984294 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:41:41.984271 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91b77933_1a1d_4dc7_8c31_f6c98c4bea6e.slice/crio-ef65a7e173ea60ad60b1c0e66262c76887a817e94000524cf139547d3f3d3e43 WatchSource:0}: Error finding container ef65a7e173ea60ad60b1c0e66262c76887a817e94000524cf139547d3f3d3e43: Status 404 returned error can't find the container with id ef65a7e173ea60ad60b1c0e66262c76887a817e94000524cf139547d3f3d3e43 Apr 21 02:41:42.021101 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:42.021071 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5wsx7" event={"ID":"91b77933-1a1d-4dc7-8c31-f6c98c4bea6e","Type":"ContainerStarted","Data":"ef65a7e173ea60ad60b1c0e66262c76887a817e94000524cf139547d3f3d3e43"} Apr 21 02:41:42.367716 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:42.367691 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert\") pod \"ingress-canary-wlfmp\" (UID: \"c2a0c6c7-f298-43d3-a0c2-a33cca035a70\") " pod="openshift-ingress-canary/ingress-canary-wlfmp" Apr 21 02:41:42.367833 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:42.367738 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 02:41:42.367883 
ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:42.367860 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:42.367914 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:42.367896 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:42.367946 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:42.367923 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert podName:c2a0c6c7-f298-43d3-a0c2-a33cca035a70 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:44.367908072 +0000 UTC m=+37.135396743 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert") pod "ingress-canary-wlfmp" (UID: "c2a0c6c7-f298-43d3-a0c2-a33cca035a70") : secret "canary-serving-cert" not found Apr 21 02:41:42.367985 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:42.367947 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls podName:44863844-48fa-48ba-82c7-863e1d932c72 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:44.367934838 +0000 UTC m=+37.135423510 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls") pod "dns-default-4mh7f" (UID: "44863844-48fa-48ba-82c7-863e1d932c72") : secret "dns-default-metrics-tls" not found Apr 21 02:41:43.025775 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:43.025740 2564 generic.go:358] "Generic (PLEG): container finished" podID="d9fa5381-4556-4f6c-91d2-c5b4580df414" containerID="728a9fdd663cc790f5934b66dc67addab930286f36c51bd5d0a3bb40090acd2b" exitCode=0 Apr 21 02:41:43.026288 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:43.025793 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2vcn8" event={"ID":"d9fa5381-4556-4f6c-91d2-c5b4580df414","Type":"ContainerDied","Data":"728a9fdd663cc790f5934b66dc67addab930286f36c51bd5d0a3bb40090acd2b"} Apr 21 02:41:44.030937 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:44.030888 2564 generic.go:358] "Generic (PLEG): container finished" podID="d9fa5381-4556-4f6c-91d2-c5b4580df414" containerID="dbe69da17907169b2a3ba3acffd8ad4356a15e1ff57f0ad003c7f737337e262e" exitCode=0 Apr 21 02:41:44.031343 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:44.030941 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2vcn8" event={"ID":"d9fa5381-4556-4f6c-91d2-c5b4580df414","Type":"ContainerDied","Data":"dbe69da17907169b2a3ba3acffd8ad4356a15e1ff57f0ad003c7f737337e262e"} Apr 21 02:41:44.383893 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:44.383815 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert\") pod \"ingress-canary-wlfmp\" (UID: \"c2a0c6c7-f298-43d3-a0c2-a33cca035a70\") " pod="openshift-ingress-canary/ingress-canary-wlfmp" Apr 21 02:41:44.383893 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:44.383856 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 02:41:44.384080 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:44.383972 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:44.384080 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:44.384044 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert podName:c2a0c6c7-f298-43d3-a0c2-a33cca035a70 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:48.384025975 +0000 UTC m=+41.151514644 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert") pod "ingress-canary-wlfmp" (UID: "c2a0c6c7-f298-43d3-a0c2-a33cca035a70") : secret "canary-serving-cert" not found Apr 21 02:41:44.384214 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:44.383976 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:44.384214 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:44.384154 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls podName:44863844-48fa-48ba-82c7-863e1d932c72 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:48.384134215 +0000 UTC m=+41.151622898 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls") pod "dns-default-4mh7f" (UID: "44863844-48fa-48ba-82c7-863e1d932c72") : secret "dns-default-metrics-tls" not found Apr 21 02:41:45.036412 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:45.036232 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2vcn8" event={"ID":"d9fa5381-4556-4f6c-91d2-c5b4580df414","Type":"ContainerStarted","Data":"66f294674ae718378d86d758aa2d3c3a14a7379c99fea940e91c6d0aed904776"} Apr 21 02:41:45.037552 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:45.037530 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5wsx7" event={"ID":"91b77933-1a1d-4dc7-8c31-f6c98c4bea6e","Type":"ContainerStarted","Data":"5c1d62debd44413bb109498292abd9f383257767cea2d3c25d39ce9751095761"} Apr 21 02:41:45.037651 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:45.037641 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5wsx7" Apr 21 02:41:45.056848 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:45.056790 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2vcn8" podStartSLOduration=6.415871986 podStartE2EDuration="38.056778755s" podCreationTimestamp="2026-04-21 02:41:07 +0000 UTC" firstStartedPulling="2026-04-21 02:41:10.390965212 +0000 UTC m=+3.158453883" lastFinishedPulling="2026-04-21 02:41:42.03187198 +0000 UTC m=+34.799360652" observedRunningTime="2026-04-21 02:41:45.055137279 +0000 UTC m=+37.822625968" watchObservedRunningTime="2026-04-21 02:41:45.056778755 +0000 UTC m=+37.824267443" Apr 21 02:41:48.410436 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:48.410404 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert\") pod \"ingress-canary-wlfmp\" (UID: \"c2a0c6c7-f298-43d3-a0c2-a33cca035a70\") " pod="openshift-ingress-canary/ingress-canary-wlfmp" Apr 21 02:41:48.410436 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:48.410440 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 02:41:48.410883 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:48.410569 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:48.410883 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:48.410623 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls podName:44863844-48fa-48ba-82c7-863e1d932c72 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:56.410609598 +0000 UTC m=+49.178098265 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls") pod "dns-default-4mh7f" (UID: "44863844-48fa-48ba-82c7-863e1d932c72") : secret "dns-default-metrics-tls" not found Apr 21 02:41:48.410883 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:48.410568 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:48.410883 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:48.410700 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert podName:c2a0c6c7-f298-43d3-a0c2-a33cca035a70 nodeName:}" failed. 
No retries permitted until 2026-04-21 02:41:56.410686661 +0000 UTC m=+49.178175334 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert") pod "ingress-canary-wlfmp" (UID: "c2a0c6c7-f298-43d3-a0c2-a33cca035a70") : secret "canary-serving-cert" not found Apr 21 02:41:56.459732 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:56.459700 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert\") pod \"ingress-canary-wlfmp\" (UID: \"c2a0c6c7-f298-43d3-a0c2-a33cca035a70\") " pod="openshift-ingress-canary/ingress-canary-wlfmp" Apr 21 02:41:56.459732 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:41:56.459743 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 02:41:56.460322 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:56.459841 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:56.460322 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:56.459856 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:56.460322 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:56.459900 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert podName:c2a0c6c7-f298-43d3-a0c2-a33cca035a70 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:12.459886545 +0000 UTC m=+65.227375211 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert") pod "ingress-canary-wlfmp" (UID: "c2a0c6c7-f298-43d3-a0c2-a33cca035a70") : secret "canary-serving-cert" not found Apr 21 02:41:56.460322 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:41:56.459914 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls podName:44863844-48fa-48ba-82c7-863e1d932c72 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:12.459908191 +0000 UTC m=+65.227396858 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls") pod "dns-default-4mh7f" (UID: "44863844-48fa-48ba-82c7-863e1d932c72") : secret "dns-default-metrics-tls" not found Apr 21 02:42:06.018931 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:06.018902 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8b6t" Apr 21 02:42:06.043486 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:06.043438 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5wsx7" podStartSLOduration=55.144962342 podStartE2EDuration="58.043426011s" podCreationTimestamp="2026-04-21 02:41:08 +0000 UTC" firstStartedPulling="2026-04-21 02:41:42.010259778 +0000 UTC m=+34.777748444" lastFinishedPulling="2026-04-21 02:41:44.908723445 +0000 UTC m=+37.676212113" observedRunningTime="2026-04-21 02:41:45.067591157 +0000 UTC m=+37.835079843" watchObservedRunningTime="2026-04-21 02:42:06.043426011 +0000 UTC m=+58.810914699" Apr 21 02:42:12.559923 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:12.559887 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert\") pod \"ingress-canary-wlfmp\" (UID: \"c2a0c6c7-f298-43d3-a0c2-a33cca035a70\") " pod="openshift-ingress-canary/ingress-canary-wlfmp" Apr 21 02:42:12.559923 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:12.559924 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 02:42:12.560315 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:12.560025 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:42:12.560315 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:12.560091 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls podName:44863844-48fa-48ba-82c7-863e1d932c72 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:44.560076564 +0000 UTC m=+97.327565231 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls") pod "dns-default-4mh7f" (UID: "44863844-48fa-48ba-82c7-863e1d932c72") : secret "dns-default-metrics-tls" not found Apr 21 02:42:12.560315 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:12.560025 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:42:12.560315 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:12.560143 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert podName:c2a0c6c7-f298-43d3-a0c2-a33cca035a70 nodeName:}" failed. 
No retries permitted until 2026-04-21 02:42:44.56013239 +0000 UTC m=+97.327621103 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert") pod "ingress-canary-wlfmp" (UID: "c2a0c6c7-f298-43d3-a0c2-a33cca035a70") : secret "canary-serving-cert" not found Apr 21 02:42:13.466306 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:13.466277 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs\") pod \"network-metrics-daemon-77bqp\" (UID: \"da0d17ad-bc94-4499-bb04-b7e0df549a24\") " pod="openshift-multus/network-metrics-daemon-77bqp" Apr 21 02:42:13.466472 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:13.466391 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 02:42:13.466472 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:13.466445 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs podName:da0d17ad-bc94-4499-bb04-b7e0df549a24 nodeName:}" failed. No retries permitted until 2026-04-21 02:43:17.466428885 +0000 UTC m=+130.233917551 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs") pod "network-metrics-daemon-77bqp" (UID: "da0d17ad-bc94-4499-bb04-b7e0df549a24") : secret "metrics-daemon-secret" not found Apr 21 02:42:16.041790 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:16.041757 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5wsx7" Apr 21 02:42:17.093811 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.093778 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8rwzw"] Apr 21 02:42:17.100240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.100220 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-8rwzw" Apr 21 02:42:17.102698 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.102671 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 02:42:17.102787 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.102721 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 21 02:42:17.103438 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.103420 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 02:42:17.103550 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.103456 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-z55z6\"" Apr 21 02:42:17.103550 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.103473 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 21 02:42:17.107300 
ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.106788 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8rwzw"] Apr 21 02:42:17.108042 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.108018 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 21 02:42:17.187539 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.187515 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d5f030-7365-4e7d-92ef-15593dbe87f9-service-ca-bundle\") pod \"insights-operator-585dfdc468-8rwzw\" (UID: \"75d5f030-7365-4e7d-92ef-15593dbe87f9\") " pod="openshift-insights/insights-operator-585dfdc468-8rwzw" Apr 21 02:42:17.187624 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.187565 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbk8q\" (UniqueName: \"kubernetes.io/projected/75d5f030-7365-4e7d-92ef-15593dbe87f9-kube-api-access-dbk8q\") pod \"insights-operator-585dfdc468-8rwzw\" (UID: \"75d5f030-7365-4e7d-92ef-15593dbe87f9\") " pod="openshift-insights/insights-operator-585dfdc468-8rwzw" Apr 21 02:42:17.187624 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.187593 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d5f030-7365-4e7d-92ef-15593dbe87f9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8rwzw\" (UID: \"75d5f030-7365-4e7d-92ef-15593dbe87f9\") " pod="openshift-insights/insights-operator-585dfdc468-8rwzw" Apr 21 02:42:17.187624 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.187614 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: 
\"kubernetes.io/empty-dir/75d5f030-7365-4e7d-92ef-15593dbe87f9-snapshots\") pod \"insights-operator-585dfdc468-8rwzw\" (UID: \"75d5f030-7365-4e7d-92ef-15593dbe87f9\") " pod="openshift-insights/insights-operator-585dfdc468-8rwzw" Apr 21 02:42:17.187722 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.187631 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75d5f030-7365-4e7d-92ef-15593dbe87f9-serving-cert\") pod \"insights-operator-585dfdc468-8rwzw\" (UID: \"75d5f030-7365-4e7d-92ef-15593dbe87f9\") " pod="openshift-insights/insights-operator-585dfdc468-8rwzw" Apr 21 02:42:17.187722 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.187670 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/75d5f030-7365-4e7d-92ef-15593dbe87f9-tmp\") pod \"insights-operator-585dfdc468-8rwzw\" (UID: \"75d5f030-7365-4e7d-92ef-15593dbe87f9\") " pod="openshift-insights/insights-operator-585dfdc468-8rwzw" Apr 21 02:42:17.194355 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.194333 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw"] Apr 21 02:42:17.197270 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.197257 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw"
Apr 21 02:42:17.199690 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.199670 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 02:42:17.199808 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.199780 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 21 02:42:17.199908 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.199837 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-gnr2k\""
Apr 21 02:42:17.199908 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.199835 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 02:42:17.199999 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.199949 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 21 02:42:17.203408 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.203390 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw"]
Apr 21 02:42:17.288128 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.288106 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/75d5f030-7365-4e7d-92ef-15593dbe87f9-snapshots\") pod \"insights-operator-585dfdc468-8rwzw\" (UID: \"75d5f030-7365-4e7d-92ef-15593dbe87f9\") " pod="openshift-insights/insights-operator-585dfdc468-8rwzw"
Apr 21 02:42:17.288128 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.288135 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75d5f030-7365-4e7d-92ef-15593dbe87f9-serving-cert\") pod \"insights-operator-585dfdc468-8rwzw\" (UID: \"75d5f030-7365-4e7d-92ef-15593dbe87f9\") " pod="openshift-insights/insights-operator-585dfdc468-8rwzw"
Apr 21 02:42:17.288306 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.288157 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsnmn\" (UniqueName: \"kubernetes.io/projected/d2caecf8-8d57-4c32-a979-ebd9271526a5-kube-api-access-fsnmn\") pod \"cluster-monitoring-operator-75587bd455-fh8kw\" (UID: \"d2caecf8-8d57-4c32-a979-ebd9271526a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw"
Apr 21 02:42:17.288306 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.288188 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/75d5f030-7365-4e7d-92ef-15593dbe87f9-tmp\") pod \"insights-operator-585dfdc468-8rwzw\" (UID: \"75d5f030-7365-4e7d-92ef-15593dbe87f9\") " pod="openshift-insights/insights-operator-585dfdc468-8rwzw"
Apr 21 02:42:17.288306 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.288208 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d5f030-7365-4e7d-92ef-15593dbe87f9-service-ca-bundle\") pod \"insights-operator-585dfdc468-8rwzw\" (UID: \"75d5f030-7365-4e7d-92ef-15593dbe87f9\") " pod="openshift-insights/insights-operator-585dfdc468-8rwzw"
Apr 21 02:42:17.288306 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.288226 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fh8kw\" (UID: \"d2caecf8-8d57-4c32-a979-ebd9271526a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw"
Apr 21 02:42:17.288306 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.288291 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d2caecf8-8d57-4c32-a979-ebd9271526a5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fh8kw\" (UID: \"d2caecf8-8d57-4c32-a979-ebd9271526a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw"
Apr 21 02:42:17.288585 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.288409 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbk8q\" (UniqueName: \"kubernetes.io/projected/75d5f030-7365-4e7d-92ef-15593dbe87f9-kube-api-access-dbk8q\") pod \"insights-operator-585dfdc468-8rwzw\" (UID: \"75d5f030-7365-4e7d-92ef-15593dbe87f9\") " pod="openshift-insights/insights-operator-585dfdc468-8rwzw"
Apr 21 02:42:17.288585 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.288458 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d5f030-7365-4e7d-92ef-15593dbe87f9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8rwzw\" (UID: \"75d5f030-7365-4e7d-92ef-15593dbe87f9\") " pod="openshift-insights/insights-operator-585dfdc468-8rwzw"
Apr 21 02:42:17.288784 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.288766 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/75d5f030-7365-4e7d-92ef-15593dbe87f9-tmp\") pod \"insights-operator-585dfdc468-8rwzw\" (UID: \"75d5f030-7365-4e7d-92ef-15593dbe87f9\") " pod="openshift-insights/insights-operator-585dfdc468-8rwzw"
Apr 21 02:42:17.288880 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.288861 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/75d5f030-7365-4e7d-92ef-15593dbe87f9-snapshots\") pod \"insights-operator-585dfdc468-8rwzw\" (UID: \"75d5f030-7365-4e7d-92ef-15593dbe87f9\") " pod="openshift-insights/insights-operator-585dfdc468-8rwzw"
Apr 21 02:42:17.288934 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.288886 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d5f030-7365-4e7d-92ef-15593dbe87f9-service-ca-bundle\") pod \"insights-operator-585dfdc468-8rwzw\" (UID: \"75d5f030-7365-4e7d-92ef-15593dbe87f9\") " pod="openshift-insights/insights-operator-585dfdc468-8rwzw"
Apr 21 02:42:17.289182 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.289165 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75d5f030-7365-4e7d-92ef-15593dbe87f9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8rwzw\" (UID: \"75d5f030-7365-4e7d-92ef-15593dbe87f9\") " pod="openshift-insights/insights-operator-585dfdc468-8rwzw"
Apr 21 02:42:17.292058 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.292043 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75d5f030-7365-4e7d-92ef-15593dbe87f9-serving-cert\") pod \"insights-operator-585dfdc468-8rwzw\" (UID: \"75d5f030-7365-4e7d-92ef-15593dbe87f9\") " pod="openshift-insights/insights-operator-585dfdc468-8rwzw"
Apr 21 02:42:17.296259 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.296240 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbk8q\" (UniqueName: \"kubernetes.io/projected/75d5f030-7365-4e7d-92ef-15593dbe87f9-kube-api-access-dbk8q\") pod \"insights-operator-585dfdc468-8rwzw\" (UID: \"75d5f030-7365-4e7d-92ef-15593dbe87f9\") " pod="openshift-insights/insights-operator-585dfdc468-8rwzw"
Apr 21 02:42:17.388731 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.388690 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsnmn\" (UniqueName: \"kubernetes.io/projected/d2caecf8-8d57-4c32-a979-ebd9271526a5-kube-api-access-fsnmn\") pod \"cluster-monitoring-operator-75587bd455-fh8kw\" (UID: \"d2caecf8-8d57-4c32-a979-ebd9271526a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw"
Apr 21 02:42:17.388731 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.388722 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fh8kw\" (UID: \"d2caecf8-8d57-4c32-a979-ebd9271526a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw"
Apr 21 02:42:17.388835 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.388751 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d2caecf8-8d57-4c32-a979-ebd9271526a5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fh8kw\" (UID: \"d2caecf8-8d57-4c32-a979-ebd9271526a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw"
Apr 21 02:42:17.388835 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:17.388821 2564 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 02:42:17.388901 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:17.388896 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls podName:d2caecf8-8d57-4c32-a979-ebd9271526a5 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:17.88888165 +0000 UTC m=+70.656370317 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fh8kw" (UID: "d2caecf8-8d57-4c32-a979-ebd9271526a5") : secret "cluster-monitoring-operator-tls" not found
Apr 21 02:42:17.389310 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.389294 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d2caecf8-8d57-4c32-a979-ebd9271526a5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fh8kw\" (UID: \"d2caecf8-8d57-4c32-a979-ebd9271526a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw"
Apr 21 02:42:17.396499 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.396483 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsnmn\" (UniqueName: \"kubernetes.io/projected/d2caecf8-8d57-4c32-a979-ebd9271526a5-kube-api-access-fsnmn\") pod \"cluster-monitoring-operator-75587bd455-fh8kw\" (UID: \"d2caecf8-8d57-4c32-a979-ebd9271526a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw"
Apr 21 02:42:17.409352 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.409335 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-8rwzw"
Apr 21 02:42:17.534309 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.534277 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8rwzw"]
Apr 21 02:42:17.538169 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:42:17.538141 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75d5f030_7365_4e7d_92ef_15593dbe87f9.slice/crio-c1aab5112023c2cdb0c1a422e47593ac27fb8ca77d16beded20f807a6aaa53f4 WatchSource:0}: Error finding container c1aab5112023c2cdb0c1a422e47593ac27fb8ca77d16beded20f807a6aaa53f4: Status 404 returned error can't find the container with id c1aab5112023c2cdb0c1a422e47593ac27fb8ca77d16beded20f807a6aaa53f4
Apr 21 02:42:17.895207 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:17.895180 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fh8kw\" (UID: \"d2caecf8-8d57-4c32-a979-ebd9271526a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw"
Apr 21 02:42:17.895330 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:17.895314 2564 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 02:42:17.895375 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:17.895370 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls podName:d2caecf8-8d57-4c32-a979-ebd9271526a5 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:18.895353805 +0000 UTC m=+71.662842475 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fh8kw" (UID: "d2caecf8-8d57-4c32-a979-ebd9271526a5") : secret "cluster-monitoring-operator-tls" not found
Apr 21 02:42:18.100178 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:18.100151 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8rwzw" event={"ID":"75d5f030-7365-4e7d-92ef-15593dbe87f9","Type":"ContainerStarted","Data":"c1aab5112023c2cdb0c1a422e47593ac27fb8ca77d16beded20f807a6aaa53f4"}
Apr 21 02:42:18.901216 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:18.901182 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fh8kw\" (UID: \"d2caecf8-8d57-4c32-a979-ebd9271526a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw"
Apr 21 02:42:18.901393 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:18.901343 2564 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 02:42:18.901451 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:18.901413 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls podName:d2caecf8-8d57-4c32-a979-ebd9271526a5 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:20.901397443 +0000 UTC m=+73.668886109 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fh8kw" (UID: "d2caecf8-8d57-4c32-a979-ebd9271526a5") : secret "cluster-monitoring-operator-tls" not found
Apr 21 02:42:20.105361 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:20.105328 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8rwzw" event={"ID":"75d5f030-7365-4e7d-92ef-15593dbe87f9","Type":"ContainerStarted","Data":"71bd3b189c4d3556a48c9bb699fea76c02e8cf127ce9f7f7444a179782cc058e"}
Apr 21 02:42:20.120422 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:20.120381 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-8rwzw" podStartSLOduration=0.986354466 podStartE2EDuration="3.120370012s" podCreationTimestamp="2026-04-21 02:42:17 +0000 UTC" firstStartedPulling="2026-04-21 02:42:17.541225452 +0000 UTC m=+70.308714132" lastFinishedPulling="2026-04-21 02:42:19.675240995 +0000 UTC m=+72.442729678" observedRunningTime="2026-04-21 02:42:20.119694168 +0000 UTC m=+72.887182858" watchObservedRunningTime="2026-04-21 02:42:20.120370012 +0000 UTC m=+72.887858700"
Apr 21 02:42:20.913150 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:20.913112 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fh8kw\" (UID: \"d2caecf8-8d57-4c32-a979-ebd9271526a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw"
Apr 21 02:42:20.913310 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:20.913224 2564 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 02:42:20.913310 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:20.913281 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls podName:d2caecf8-8d57-4c32-a979-ebd9271526a5 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:24.913266527 +0000 UTC m=+77.680755194 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fh8kw" (UID: "d2caecf8-8d57-4c32-a979-ebd9271526a5") : secret "cluster-monitoring-operator-tls" not found
Apr 21 02:42:22.955348 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:22.955230 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w659n_4a693e96-2533-450c-a8c8-de3f4cdfcd73/dns-node-resolver/0.log"
Apr 21 02:42:23.355275 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:23.355248 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-f97sw_78ee76ed-3959-4c8b-8f2c-d057e4bd15db/node-ca/0.log"
Apr 21 02:42:24.939609 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:24.939561 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fh8kw\" (UID: \"d2caecf8-8d57-4c32-a979-ebd9271526a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw"
Apr 21 02:42:24.940011 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:24.939700 2564 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 02:42:24.940011 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:24.939766 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls podName:d2caecf8-8d57-4c32-a979-ebd9271526a5 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:32.939750471 +0000 UTC m=+85.707239138 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fh8kw" (UID: "d2caecf8-8d57-4c32-a979-ebd9271526a5") : secret "cluster-monitoring-operator-tls" not found
Apr 21 02:42:27.025834 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.025792 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gq5w7"]
Apr 21 02:42:27.028883 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.028863 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gq5w7"
Apr 21 02:42:27.031170 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.031139 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 21 02:42:27.031979 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.031960 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-z5rrx\""
Apr 21 02:42:27.032097 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.031989 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 21 02:42:27.035672 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.035644 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gq5w7"]
Apr 21 02:42:27.054901 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.054875 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd6f9\" (UniqueName: \"kubernetes.io/projected/f19baa1b-c32f-481f-bc3f-7a7906d3049e-kube-api-access-hd6f9\") pod \"volume-data-source-validator-7c6cbb6c87-gq5w7\" (UID: \"f19baa1b-c32f-481f-bc3f-7a7906d3049e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gq5w7"
Apr 21 02:42:27.133236 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.133208 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz"]
Apr 21 02:42:27.135953 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.135935 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz"
Apr 21 02:42:27.137995 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.137974 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 21 02:42:27.138124 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.138106 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 21 02:42:27.138174 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.138151 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 21 02:42:27.138221 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.138180 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 21 02:42:27.138373 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.138360 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-5m7zd\""
Apr 21 02:42:27.145820 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.145800 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz"]
Apr 21 02:42:27.156142 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.156123 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cqv7\" (UniqueName: \"kubernetes.io/projected/385cbf98-cb51-4378-92b4-0aa2cdebc70f-kube-api-access-9cqv7\") pod \"service-ca-operator-d6fc45fc5-j7pzz\" (UID: \"385cbf98-cb51-4378-92b4-0aa2cdebc70f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz"
Apr 21 02:42:27.156221 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.156187 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hd6f9\" (UniqueName: \"kubernetes.io/projected/f19baa1b-c32f-481f-bc3f-7a7906d3049e-kube-api-access-hd6f9\") pod \"volume-data-source-validator-7c6cbb6c87-gq5w7\" (UID: \"f19baa1b-c32f-481f-bc3f-7a7906d3049e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gq5w7"
Apr 21 02:42:27.156260 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.156228 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/385cbf98-cb51-4378-92b4-0aa2cdebc70f-config\") pod \"service-ca-operator-d6fc45fc5-j7pzz\" (UID: \"385cbf98-cb51-4378-92b4-0aa2cdebc70f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz"
Apr 21 02:42:27.156297 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.156258 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/385cbf98-cb51-4378-92b4-0aa2cdebc70f-serving-cert\") pod \"service-ca-operator-d6fc45fc5-j7pzz\" (UID: \"385cbf98-cb51-4378-92b4-0aa2cdebc70f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz"
Apr 21 02:42:27.174654 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.174635 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd6f9\" (UniqueName: \"kubernetes.io/projected/f19baa1b-c32f-481f-bc3f-7a7906d3049e-kube-api-access-hd6f9\") pod \"volume-data-source-validator-7c6cbb6c87-gq5w7\" (UID: \"f19baa1b-c32f-481f-bc3f-7a7906d3049e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gq5w7"
Apr 21 02:42:27.257332 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.257310 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/385cbf98-cb51-4378-92b4-0aa2cdebc70f-config\") pod \"service-ca-operator-d6fc45fc5-j7pzz\" (UID: \"385cbf98-cb51-4378-92b4-0aa2cdebc70f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz"
Apr 21 02:42:27.257412 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.257339 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/385cbf98-cb51-4378-92b4-0aa2cdebc70f-serving-cert\") pod \"service-ca-operator-d6fc45fc5-j7pzz\" (UID: \"385cbf98-cb51-4378-92b4-0aa2cdebc70f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz"
Apr 21 02:42:27.257412 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.257378 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cqv7\" (UniqueName: \"kubernetes.io/projected/385cbf98-cb51-4378-92b4-0aa2cdebc70f-kube-api-access-9cqv7\") pod \"service-ca-operator-d6fc45fc5-j7pzz\" (UID: \"385cbf98-cb51-4378-92b4-0aa2cdebc70f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz"
Apr 21 02:42:27.257877 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.257854 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/385cbf98-cb51-4378-92b4-0aa2cdebc70f-config\") pod \"service-ca-operator-d6fc45fc5-j7pzz\" (UID: \"385cbf98-cb51-4378-92b4-0aa2cdebc70f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz"
Apr 21 02:42:27.259423 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.259404 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/385cbf98-cb51-4378-92b4-0aa2cdebc70f-serving-cert\") pod \"service-ca-operator-d6fc45fc5-j7pzz\" (UID: \"385cbf98-cb51-4378-92b4-0aa2cdebc70f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz"
Apr 21 02:42:27.264694 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.264672 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cqv7\" (UniqueName: \"kubernetes.io/projected/385cbf98-cb51-4378-92b4-0aa2cdebc70f-kube-api-access-9cqv7\") pod \"service-ca-operator-d6fc45fc5-j7pzz\" (UID: \"385cbf98-cb51-4378-92b4-0aa2cdebc70f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz"
Apr 21 02:42:27.338620 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.338566 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gq5w7"
Apr 21 02:42:27.444237 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.444209 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz"
Apr 21 02:42:27.445383 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.445362 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gq5w7"]
Apr 21 02:42:27.449156 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:42:27.449131 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf19baa1b_c32f_481f_bc3f_7a7906d3049e.slice/crio-b9224eff078b82db6125c6d2783fa095a6f0d71a80e07816ccb67a2322dbe801 WatchSource:0}: Error finding container b9224eff078b82db6125c6d2783fa095a6f0d71a80e07816ccb67a2322dbe801: Status 404 returned error can't find the container with id b9224eff078b82db6125c6d2783fa095a6f0d71a80e07816ccb67a2322dbe801
Apr 21 02:42:27.553882 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:27.553852 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz"]
Apr 21 02:42:27.556669 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:42:27.556641 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod385cbf98_cb51_4378_92b4_0aa2cdebc70f.slice/crio-cc90474bd1f08968194e029b0f8ff99610df5ef28aaaee4c1ac667d976367039 WatchSource:0}: Error finding container cc90474bd1f08968194e029b0f8ff99610df5ef28aaaee4c1ac667d976367039: Status 404 returned error can't find the container with id cc90474bd1f08968194e029b0f8ff99610df5ef28aaaee4c1ac667d976367039
Apr 21 02:42:28.120958 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:28.120916 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gq5w7" event={"ID":"f19baa1b-c32f-481f-bc3f-7a7906d3049e","Type":"ContainerStarted","Data":"b9224eff078b82db6125c6d2783fa095a6f0d71a80e07816ccb67a2322dbe801"}
Apr 21 02:42:28.122043 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:28.122014 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz" event={"ID":"385cbf98-cb51-4378-92b4-0aa2cdebc70f","Type":"ContainerStarted","Data":"cc90474bd1f08968194e029b0f8ff99610df5ef28aaaee4c1ac667d976367039"}
Apr 21 02:42:30.126327 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:30.126293 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gq5w7" event={"ID":"f19baa1b-c32f-481f-bc3f-7a7906d3049e","Type":"ContainerStarted","Data":"1cf58369999c832ee366f1458dd599913dd96e6eda090bcb041ba7d918843891"}
Apr 21 02:42:30.127595 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:30.127553 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz" event={"ID":"385cbf98-cb51-4378-92b4-0aa2cdebc70f","Type":"ContainerStarted","Data":"76b6ce6ed12368734a82e9bc222a02acd9ba212213635c375986a5d30b2af92b"}
Apr 21 02:42:30.139056 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:30.139014 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gq5w7" podStartSLOduration=1.7289187849999998 podStartE2EDuration="3.139002204s" podCreationTimestamp="2026-04-21 02:42:27 +0000 UTC" firstStartedPulling="2026-04-21 02:42:27.450966342 +0000 UTC m=+80.218455008" lastFinishedPulling="2026-04-21 02:42:28.861049743 +0000 UTC m=+81.628538427" observedRunningTime="2026-04-21 02:42:30.138547992 +0000 UTC m=+82.906036681" watchObservedRunningTime="2026-04-21 02:42:30.139002204 +0000 UTC m=+82.906490894"
Apr 21 02:42:30.151587 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:30.151552 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz" podStartSLOduration=1.3225224789999999 podStartE2EDuration="3.151541452s" podCreationTimestamp="2026-04-21 02:42:27 +0000 UTC" firstStartedPulling="2026-04-21 02:42:27.5585955 +0000 UTC m=+80.326084170" lastFinishedPulling="2026-04-21 02:42:29.387614461 +0000 UTC m=+82.155103143" observedRunningTime="2026-04-21 02:42:30.15058514 +0000 UTC m=+82.918073829" watchObservedRunningTime="2026-04-21 02:42:30.151541452 +0000 UTC m=+82.919030137"
Apr 21 02:42:32.374179 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:32.374146 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-6lqvm"]
Apr 21 02:42:32.377133 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:32.377118 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6lqvm"
Apr 21 02:42:32.379369 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:32.379346 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 21 02:42:32.379369 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:32.379346 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 21 02:42:32.380175 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:32.380158 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-v2bm7\""
Apr 21 02:42:32.384191 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:32.384173 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-6lqvm"]
Apr 21 02:42:32.496126 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:32.496103 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7whs6\" (UniqueName: \"kubernetes.io/projected/874628bd-c537-4903-b0dd-c66cce097f9e-kube-api-access-7whs6\") pod \"migrator-74bb7799d9-6lqvm\" (UID: \"874628bd-c537-4903-b0dd-c66cce097f9e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6lqvm"
Apr 21 02:42:32.596818 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:32.596796 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7whs6\" (UniqueName: \"kubernetes.io/projected/874628bd-c537-4903-b0dd-c66cce097f9e-kube-api-access-7whs6\") pod \"migrator-74bb7799d9-6lqvm\" (UID: \"874628bd-c537-4903-b0dd-c66cce097f9e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6lqvm"
Apr 21 02:42:32.604310 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:32.604287 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7whs6\" (UniqueName: \"kubernetes.io/projected/874628bd-c537-4903-b0dd-c66cce097f9e-kube-api-access-7whs6\") pod \"migrator-74bb7799d9-6lqvm\" (UID: \"874628bd-c537-4903-b0dd-c66cce097f9e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6lqvm"
Apr 21 02:42:32.686391 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:32.686336 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6lqvm"
Apr 21 02:42:32.793176 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:32.793149 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-6lqvm"]
Apr 21 02:42:32.797852 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:42:32.797825 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod874628bd_c537_4903_b0dd_c66cce097f9e.slice/crio-da3c85f43b0611b67a84eb28cedfbc3f5b36cd0292b4495fa3cf6d6a26ae6e0d WatchSource:0}: Error finding container da3c85f43b0611b67a84eb28cedfbc3f5b36cd0292b4495fa3cf6d6a26ae6e0d: Status 404 returned error can't find the container with id da3c85f43b0611b67a84eb28cedfbc3f5b36cd0292b4495fa3cf6d6a26ae6e0d
Apr 21 02:42:33.000980 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:33.000959 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fh8kw\" (UID: \"d2caecf8-8d57-4c32-a979-ebd9271526a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw"
Apr 21 02:42:33.001134 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:33.001115 2564 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 02:42:33.001215 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:33.001203 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls podName:d2caecf8-8d57-4c32-a979-ebd9271526a5 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:49.001181191 +0000 UTC m=+101.768669891 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fh8kw" (UID: "d2caecf8-8d57-4c32-a979-ebd9271526a5") : secret "cluster-monitoring-operator-tls" not found
Apr 21 02:42:33.134229 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:33.134198 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6lqvm" event={"ID":"874628bd-c537-4903-b0dd-c66cce097f9e","Type":"ContainerStarted","Data":"da3c85f43b0611b67a84eb28cedfbc3f5b36cd0292b4495fa3cf6d6a26ae6e0d"}
Apr 21 02:42:34.138568 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:34.138537 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6lqvm" event={"ID":"874628bd-c537-4903-b0dd-c66cce097f9e","Type":"ContainerStarted","Data":"8d08ce249b957cc8493363429f00eac931a266b8e061013d8af2344f01279901"}
Apr 21 02:42:34.138568 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:34.138571 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6lqvm" event={"ID":"874628bd-c537-4903-b0dd-c66cce097f9e","Type":"ContainerStarted","Data":"97e2fb815edaa681ac3cd18ae2ea5f8bcebf4c8faf571e4b77197e4b659e79bd"}
Apr 21 02:42:34.153071 ip-10-0-131-170 kubenswrapper[2564]:
I0421 02:42:34.153030 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6lqvm" podStartSLOduration=1.036240166 podStartE2EDuration="2.153017172s" podCreationTimestamp="2026-04-21 02:42:32 +0000 UTC" firstStartedPulling="2026-04-21 02:42:32.800098963 +0000 UTC m=+85.567587630" lastFinishedPulling="2026-04-21 02:42:33.916875955 +0000 UTC m=+86.684364636" observedRunningTime="2026-04-21 02:42:34.151993886 +0000 UTC m=+86.919482575" watchObservedRunningTime="2026-04-21 02:42:34.153017172 +0000 UTC m=+86.920505858" Apr 21 02:42:44.580837 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:44.580794 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 02:42:44.581182 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:44.580877 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert\") pod \"ingress-canary-wlfmp\" (UID: \"c2a0c6c7-f298-43d3-a0c2-a33cca035a70\") " pod="openshift-ingress-canary/ingress-canary-wlfmp" Apr 21 02:42:44.583237 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:44.583212 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2a0c6c7-f298-43d3-a0c2-a33cca035a70-cert\") pod \"ingress-canary-wlfmp\" (UID: \"c2a0c6c7-f298-43d3-a0c2-a33cca035a70\") " pod="openshift-ingress-canary/ingress-canary-wlfmp" Apr 21 02:42:44.583237 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:44.583232 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/44863844-48fa-48ba-82c7-863e1d932c72-metrics-tls\") pod \"dns-default-4mh7f\" (UID: \"44863844-48fa-48ba-82c7-863e1d932c72\") " pod="openshift-dns/dns-default-4mh7f" Apr 21 02:42:44.834269 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:44.834240 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6wzzj\"" Apr 21 02:42:44.838771 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:44.838743 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lm7rd\"" Apr 21 02:42:44.842662 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:44.842641 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4mh7f" Apr 21 02:42:44.847278 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:44.847256 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wlfmp" Apr 21 02:42:44.969164 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:44.969138 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wlfmp"] Apr 21 02:42:44.973060 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:42:44.973028 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2a0c6c7_f298_43d3_a0c2_a33cca035a70.slice/crio-6ffc4cf2c7c82cb875821ff2fca57202027e00eedd7b1c1c9afa609b25513dbc WatchSource:0}: Error finding container 6ffc4cf2c7c82cb875821ff2fca57202027e00eedd7b1c1c9afa609b25513dbc: Status 404 returned error can't find the container with id 6ffc4cf2c7c82cb875821ff2fca57202027e00eedd7b1c1c9afa609b25513dbc Apr 21 02:42:44.980197 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:44.980178 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4mh7f"] Apr 21 02:42:44.982979 ip-10-0-131-170 kubenswrapper[2564]: 
W0421 02:42:44.982948 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44863844_48fa_48ba_82c7_863e1d932c72.slice/crio-1e9dda88e4c32b1b90da0ca38937d09ea50f4e4943b6ceda45fc91225dd2dddb WatchSource:0}: Error finding container 1e9dda88e4c32b1b90da0ca38937d09ea50f4e4943b6ceda45fc91225dd2dddb: Status 404 returned error can't find the container with id 1e9dda88e4c32b1b90da0ca38937d09ea50f4e4943b6ceda45fc91225dd2dddb Apr 21 02:42:45.164841 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:45.164786 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wlfmp" event={"ID":"c2a0c6c7-f298-43d3-a0c2-a33cca035a70","Type":"ContainerStarted","Data":"6ffc4cf2c7c82cb875821ff2fca57202027e00eedd7b1c1c9afa609b25513dbc"} Apr 21 02:42:45.165641 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:45.165619 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4mh7f" event={"ID":"44863844-48fa-48ba-82c7-863e1d932c72","Type":"ContainerStarted","Data":"1e9dda88e4c32b1b90da0ca38937d09ea50f4e4943b6ceda45fc91225dd2dddb"} Apr 21 02:42:47.172860 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:47.172783 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4mh7f" event={"ID":"44863844-48fa-48ba-82c7-863e1d932c72","Type":"ContainerStarted","Data":"39cb9ed11eb897350c775ea44a723dafc401420cbe936906e7fecb92d87df19d"} Apr 21 02:42:47.172860 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:47.172822 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4mh7f" event={"ID":"44863844-48fa-48ba-82c7-863e1d932c72","Type":"ContainerStarted","Data":"093cf525a1fda29feb69855d5f451fa8ac6d166a5e3cd16252dc26657ba3659e"} Apr 21 02:42:47.173261 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:47.172883 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-dns/dns-default-4mh7f" Apr 21 02:42:47.174549 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:47.174522 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wlfmp" event={"ID":"c2a0c6c7-f298-43d3-a0c2-a33cca035a70","Type":"ContainerStarted","Data":"af423528c3d5db6a6145b52513b9c1709f458711faa979e8cc5b954067e35cdd"} Apr 21 02:42:47.188022 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:47.187982 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4mh7f" podStartSLOduration=65.3076217 podStartE2EDuration="1m7.187956804s" podCreationTimestamp="2026-04-21 02:41:40 +0000 UTC" firstStartedPulling="2026-04-21 02:42:44.98451516 +0000 UTC m=+97.752003841" lastFinishedPulling="2026-04-21 02:42:46.864850278 +0000 UTC m=+99.632338945" observedRunningTime="2026-04-21 02:42:47.187226253 +0000 UTC m=+99.954714941" watchObservedRunningTime="2026-04-21 02:42:47.187956804 +0000 UTC m=+99.955445493" Apr 21 02:42:47.199860 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:47.199823 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wlfmp" podStartSLOduration=65.306222088 podStartE2EDuration="1m7.199813014s" podCreationTimestamp="2026-04-21 02:41:40 +0000 UTC" firstStartedPulling="2026-04-21 02:42:44.974821205 +0000 UTC m=+97.742309872" lastFinishedPulling="2026-04-21 02:42:46.868412125 +0000 UTC m=+99.635900798" observedRunningTime="2026-04-21 02:42:47.199625499 +0000 UTC m=+99.967114180" watchObservedRunningTime="2026-04-21 02:42:47.199813014 +0000 UTC m=+99.967301702" Apr 21 02:42:49.010599 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:49.010563 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-fh8kw\" (UID: \"d2caecf8-8d57-4c32-a979-ebd9271526a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw" Apr 21 02:42:49.012888 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:49.012867 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2caecf8-8d57-4c32-a979-ebd9271526a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fh8kw\" (UID: \"d2caecf8-8d57-4c32-a979-ebd9271526a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw" Apr 21 02:42:49.306123 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:49.306047 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw" Apr 21 02:42:49.413823 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:49.413799 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw"] Apr 21 02:42:49.417802 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:42:49.417775 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2caecf8_8d57_4c32_a979_ebd9271526a5.slice/crio-118145bb09c1002d118b80dfa5bac123535f109007212be9f44a57f5ecbc1ba6 WatchSource:0}: Error finding container 118145bb09c1002d118b80dfa5bac123535f109007212be9f44a57f5ecbc1ba6: Status 404 returned error can't find the container with id 118145bb09c1002d118b80dfa5bac123535f109007212be9f44a57f5ecbc1ba6 Apr 21 02:42:50.184564 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:50.184530 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw" 
event={"ID":"d2caecf8-8d57-4c32-a979-ebd9271526a5","Type":"ContainerStarted","Data":"118145bb09c1002d118b80dfa5bac123535f109007212be9f44a57f5ecbc1ba6"} Apr 21 02:42:51.188711 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:51.188671 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw" event={"ID":"d2caecf8-8d57-4c32-a979-ebd9271526a5","Type":"ContainerStarted","Data":"77ff92f6d41c6c132e0b77f1f3afc728c7396c7d938d19b54f17538baf7d7722"} Apr 21 02:42:51.203243 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:51.203191 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fh8kw" podStartSLOduration=32.625970808 podStartE2EDuration="34.203176451s" podCreationTimestamp="2026-04-21 02:42:17 +0000 UTC" firstStartedPulling="2026-04-21 02:42:49.419697511 +0000 UTC m=+102.187186191" lastFinishedPulling="2026-04-21 02:42:50.996903155 +0000 UTC m=+103.764391834" observedRunningTime="2026-04-21 02:42:51.202292267 +0000 UTC m=+103.969780957" watchObservedRunningTime="2026-04-21 02:42:51.203176451 +0000 UTC m=+103.970665141" Apr 21 02:42:54.882406 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.882371 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-555fd6467b-sbrs4"] Apr 21 02:42:54.885397 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.885381 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:54.887877 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.887851 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 02:42:54.887877 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.887872 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 02:42:54.888085 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.887879 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 02:42:54.888085 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.887932 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nlcjx\"" Apr 21 02:42:54.892677 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.892658 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 02:42:54.895179 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.895156 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-555fd6467b-sbrs4"] Apr 21 02:42:54.936512 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.936472 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-555fd6467b-sbrs4"] Apr 21 02:42:54.936639 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:54.936619 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[bound-sa-token ca-trust-extracted image-registry-private-configuration installation-pull-secrets kube-api-access-vj485 registry-certificates registry-tls trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" 
pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" podUID="1009b217-f91e-4a17-8332-bf2d0ebc75aa" Apr 21 02:42:54.949888 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.949868 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1009b217-f91e-4a17-8332-bf2d0ebc75aa-bound-sa-token\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:54.949982 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.949907 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1009b217-f91e-4a17-8332-bf2d0ebc75aa-trusted-ca\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:54.949982 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.949926 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1009b217-f91e-4a17-8332-bf2d0ebc75aa-image-registry-private-configuration\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:54.949982 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.949943 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1009b217-f91e-4a17-8332-bf2d0ebc75aa-installation-pull-secrets\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:54.950110 ip-10-0-131-170 
kubenswrapper[2564]: I0421 02:42:54.950075 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj485\" (UniqueName: \"kubernetes.io/projected/1009b217-f91e-4a17-8332-bf2d0ebc75aa-kube-api-access-vj485\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:54.950110 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.950106 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1009b217-f91e-4a17-8332-bf2d0ebc75aa-registry-tls\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:54.950201 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.950120 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1009b217-f91e-4a17-8332-bf2d0ebc75aa-registry-certificates\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:54.950201 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.950144 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1009b217-f91e-4a17-8332-bf2d0ebc75aa-ca-trust-extracted\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:54.978029 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.977994 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-crw5h"] Apr 21 
02:42:54.980994 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.980978 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-crw5h" Apr 21 02:42:54.986167 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.986148 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-rs5mv\"" Apr 21 02:42:54.988003 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.987987 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-8zx8l"] Apr 21 02:42:54.990770 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.990754 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-8lx74"] Apr 21 02:42:54.990894 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.990881 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-8zx8l" Apr 21 02:42:54.993368 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.993354 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-z6lqg"] Apr 21 02:42:54.993478 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.993466 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8lx74" Apr 21 02:42:54.996386 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.996237 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-z6lqg" Apr 21 02:42:54.998985 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.998967 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-tr6ml\"" Apr 21 02:42:54.999413 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.999221 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 21 02:42:54.999413 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.999405 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 02:42:54.999632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.999599 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-crw5h"] Apr 21 02:42:54.999736 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:54.999684 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 21 02:42:55.000832 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.000754 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-s6bnd\"" Apr 21 02:42:55.000832 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.000778 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-vvtz4\"" Apr 21 02:42:55.000832 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.000793 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 02:42:55.001059 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.001041 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 02:42:55.001687 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.001673 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 02:42:55.013106 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.013089 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-8lx74"] Apr 21 02:42:55.019421 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.019400 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-8zx8l"] Apr 21 02:42:55.038965 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.038944 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-z6lqg"] Apr 21 02:42:55.051108 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.051082 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7f22dea5-13ae-41ea-8a44-0677d956ef0b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-z6lqg\" (UID: \"7f22dea5-13ae-41ea-8a44-0677d956ef0b\") " pod="openshift-insights/insights-runtime-extractor-z6lqg" Apr 21 02:42:55.051261 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.051117 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1009b217-f91e-4a17-8332-bf2d0ebc75aa-bound-sa-token\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:55.051261 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.051146 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/1009b217-f91e-4a17-8332-bf2d0ebc75aa-installation-pull-secrets\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:55.051261 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.051185 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1009b217-f91e-4a17-8332-bf2d0ebc75aa-registry-tls\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:55.051261 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.051214 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c9cd2e96-ef84-49dd-b7ca-9fb0cc36aab2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8lx74\" (UID: \"c9cd2e96-ef84-49dd-b7ca-9fb0cc36aab2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8lx74" Apr 21 02:42:55.051261 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.051243 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c9cd2e96-ef84-49dd-b7ca-9fb0cc36aab2-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-8lx74\" (UID: \"c9cd2e96-ef84-49dd-b7ca-9fb0cc36aab2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8lx74" Apr 21 02:42:55.051457 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.051274 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1009b217-f91e-4a17-8332-bf2d0ebc75aa-ca-trust-extracted\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " 
pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:55.051457 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.051329 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7f22dea5-13ae-41ea-8a44-0677d956ef0b-data-volume\") pod \"insights-runtime-extractor-z6lqg\" (UID: \"7f22dea5-13ae-41ea-8a44-0677d956ef0b\") " pod="openshift-insights/insights-runtime-extractor-z6lqg" Apr 21 02:42:55.051457 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.051387 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5t84\" (UniqueName: \"kubernetes.io/projected/56684ff1-c656-4b66-8fa3-541d09278ff9-kube-api-access-f5t84\") pod \"downloads-6bcc868b7-8zx8l\" (UID: \"56684ff1-c656-4b66-8fa3-541d09278ff9\") " pod="openshift-console/downloads-6bcc868b7-8zx8l" Apr 21 02:42:55.051457 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.051431 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1009b217-f91e-4a17-8332-bf2d0ebc75aa-trusted-ca\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:55.051672 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.051512 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7f22dea5-13ae-41ea-8a44-0677d956ef0b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-z6lqg\" (UID: \"7f22dea5-13ae-41ea-8a44-0677d956ef0b\") " pod="openshift-insights/insights-runtime-extractor-z6lqg" Apr 21 02:42:55.051672 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.051561 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1009b217-f91e-4a17-8332-bf2d0ebc75aa-image-registry-private-configuration\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:55.051672 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.051592 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vj485\" (UniqueName: \"kubernetes.io/projected/1009b217-f91e-4a17-8332-bf2d0ebc75aa-kube-api-access-vj485\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:55.051672 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.051597 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1009b217-f91e-4a17-8332-bf2d0ebc75aa-ca-trust-extracted\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:55.051672 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.051626 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1009b217-f91e-4a17-8332-bf2d0ebc75aa-registry-certificates\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:55.051672 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.051653 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4lxj\" (UniqueName: \"kubernetes.io/projected/e9dd6f13-133f-4d41-ae44-e3221c8d6b70-kube-api-access-n4lxj\") pod \"network-check-source-8894fc9bd-crw5h\" (UID: 
\"e9dd6f13-133f-4d41-ae44-e3221c8d6b70\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-crw5h" Apr 21 02:42:55.051672 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.051672 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsxck\" (UniqueName: \"kubernetes.io/projected/7f22dea5-13ae-41ea-8a44-0677d956ef0b-kube-api-access-nsxck\") pod \"insights-runtime-extractor-z6lqg\" (UID: \"7f22dea5-13ae-41ea-8a44-0677d956ef0b\") " pod="openshift-insights/insights-runtime-extractor-z6lqg" Apr 21 02:42:55.052015 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.051710 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7f22dea5-13ae-41ea-8a44-0677d956ef0b-crio-socket\") pod \"insights-runtime-extractor-z6lqg\" (UID: \"7f22dea5-13ae-41ea-8a44-0677d956ef0b\") " pod="openshift-insights/insights-runtime-extractor-z6lqg" Apr 21 02:42:55.052373 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.052352 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1009b217-f91e-4a17-8332-bf2d0ebc75aa-trusted-ca\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:55.052443 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.052418 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1009b217-f91e-4a17-8332-bf2d0ebc75aa-registry-certificates\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:55.053574 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.053553 2564 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1009b217-f91e-4a17-8332-bf2d0ebc75aa-installation-pull-secrets\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:55.053678 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.053658 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1009b217-f91e-4a17-8332-bf2d0ebc75aa-registry-tls\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:55.053900 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.053884 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1009b217-f91e-4a17-8332-bf2d0ebc75aa-image-registry-private-configuration\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:55.071148 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.071126 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1009b217-f91e-4a17-8332-bf2d0ebc75aa-bound-sa-token\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:55.071267 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.071242 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj485\" (UniqueName: \"kubernetes.io/projected/1009b217-f91e-4a17-8332-bf2d0ebc75aa-kube-api-access-vj485\") pod \"image-registry-555fd6467b-sbrs4\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " 
pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:55.152643 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.152591 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4lxj\" (UniqueName: \"kubernetes.io/projected/e9dd6f13-133f-4d41-ae44-e3221c8d6b70-kube-api-access-n4lxj\") pod \"network-check-source-8894fc9bd-crw5h\" (UID: \"e9dd6f13-133f-4d41-ae44-e3221c8d6b70\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-crw5h" Apr 21 02:42:55.152643 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.152620 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsxck\" (UniqueName: \"kubernetes.io/projected/7f22dea5-13ae-41ea-8a44-0677d956ef0b-kube-api-access-nsxck\") pod \"insights-runtime-extractor-z6lqg\" (UID: \"7f22dea5-13ae-41ea-8a44-0677d956ef0b\") " pod="openshift-insights/insights-runtime-extractor-z6lqg" Apr 21 02:42:55.152813 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.152644 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7f22dea5-13ae-41ea-8a44-0677d956ef0b-crio-socket\") pod \"insights-runtime-extractor-z6lqg\" (UID: \"7f22dea5-13ae-41ea-8a44-0677d956ef0b\") " pod="openshift-insights/insights-runtime-extractor-z6lqg" Apr 21 02:42:55.152813 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.152661 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7f22dea5-13ae-41ea-8a44-0677d956ef0b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-z6lqg\" (UID: \"7f22dea5-13ae-41ea-8a44-0677d956ef0b\") " pod="openshift-insights/insights-runtime-extractor-z6lqg" Apr 21 02:42:55.152813 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.152685 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c9cd2e96-ef84-49dd-b7ca-9fb0cc36aab2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8lx74\" (UID: \"c9cd2e96-ef84-49dd-b7ca-9fb0cc36aab2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8lx74" Apr 21 02:42:55.152813 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.152708 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c9cd2e96-ef84-49dd-b7ca-9fb0cc36aab2-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-8lx74\" (UID: \"c9cd2e96-ef84-49dd-b7ca-9fb0cc36aab2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8lx74" Apr 21 02:42:55.152813 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.152737 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7f22dea5-13ae-41ea-8a44-0677d956ef0b-data-volume\") pod \"insights-runtime-extractor-z6lqg\" (UID: \"7f22dea5-13ae-41ea-8a44-0677d956ef0b\") " pod="openshift-insights/insights-runtime-extractor-z6lqg" Apr 21 02:42:55.152813 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.152750 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7f22dea5-13ae-41ea-8a44-0677d956ef0b-crio-socket\") pod \"insights-runtime-extractor-z6lqg\" (UID: \"7f22dea5-13ae-41ea-8a44-0677d956ef0b\") " pod="openshift-insights/insights-runtime-extractor-z6lqg" Apr 21 02:42:55.152813 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.152769 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5t84\" (UniqueName: \"kubernetes.io/projected/56684ff1-c656-4b66-8fa3-541d09278ff9-kube-api-access-f5t84\") pod \"downloads-6bcc868b7-8zx8l\" (UID: \"56684ff1-c656-4b66-8fa3-541d09278ff9\") " 
pod="openshift-console/downloads-6bcc868b7-8zx8l" Apr 21 02:42:55.152813 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.152809 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7f22dea5-13ae-41ea-8a44-0677d956ef0b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-z6lqg\" (UID: \"7f22dea5-13ae-41ea-8a44-0677d956ef0b\") " pod="openshift-insights/insights-runtime-extractor-z6lqg" Apr 21 02:42:55.153191 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.153071 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7f22dea5-13ae-41ea-8a44-0677d956ef0b-data-volume\") pod \"insights-runtime-extractor-z6lqg\" (UID: \"7f22dea5-13ae-41ea-8a44-0677d956ef0b\") " pod="openshift-insights/insights-runtime-extractor-z6lqg" Apr 21 02:42:55.153279 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.153257 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7f22dea5-13ae-41ea-8a44-0677d956ef0b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-z6lqg\" (UID: \"7f22dea5-13ae-41ea-8a44-0677d956ef0b\") " pod="openshift-insights/insights-runtime-extractor-z6lqg" Apr 21 02:42:55.153467 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.153447 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c9cd2e96-ef84-49dd-b7ca-9fb0cc36aab2-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-8lx74\" (UID: \"c9cd2e96-ef84-49dd-b7ca-9fb0cc36aab2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8lx74" Apr 21 02:42:55.154882 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.154863 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/7f22dea5-13ae-41ea-8a44-0677d956ef0b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-z6lqg\" (UID: \"7f22dea5-13ae-41ea-8a44-0677d956ef0b\") " pod="openshift-insights/insights-runtime-extractor-z6lqg" Apr 21 02:42:55.154882 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.154872 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c9cd2e96-ef84-49dd-b7ca-9fb0cc36aab2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8lx74\" (UID: \"c9cd2e96-ef84-49dd-b7ca-9fb0cc36aab2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8lx74" Apr 21 02:42:55.161551 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.161529 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsxck\" (UniqueName: \"kubernetes.io/projected/7f22dea5-13ae-41ea-8a44-0677d956ef0b-kube-api-access-nsxck\") pod \"insights-runtime-extractor-z6lqg\" (UID: \"7f22dea5-13ae-41ea-8a44-0677d956ef0b\") " pod="openshift-insights/insights-runtime-extractor-z6lqg" Apr 21 02:42:55.161854 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.161838 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5t84\" (UniqueName: \"kubernetes.io/projected/56684ff1-c656-4b66-8fa3-541d09278ff9-kube-api-access-f5t84\") pod \"downloads-6bcc868b7-8zx8l\" (UID: \"56684ff1-c656-4b66-8fa3-541d09278ff9\") " pod="openshift-console/downloads-6bcc868b7-8zx8l" Apr 21 02:42:55.161902 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.161886 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4lxj\" (UniqueName: \"kubernetes.io/projected/e9dd6f13-133f-4d41-ae44-e3221c8d6b70-kube-api-access-n4lxj\") pod \"network-check-source-8894fc9bd-crw5h\" (UID: \"e9dd6f13-133f-4d41-ae44-e3221c8d6b70\") " 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-crw5h" Apr 21 02:42:55.197160 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.197137 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:55.201114 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.201098 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:55.253244 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.253224 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1009b217-f91e-4a17-8332-bf2d0ebc75aa-ca-trust-extracted\") pod \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " Apr 21 02:42:55.253330 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.253252 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1009b217-f91e-4a17-8332-bf2d0ebc75aa-trusted-ca\") pod \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " Apr 21 02:42:55.253330 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.253272 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1009b217-f91e-4a17-8332-bf2d0ebc75aa-installation-pull-secrets\") pod \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " Apr 21 02:42:55.253330 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.253304 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1009b217-f91e-4a17-8332-bf2d0ebc75aa-registry-tls\") pod \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\" (UID: 
\"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " Apr 21 02:42:55.253447 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.253345 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj485\" (UniqueName: \"kubernetes.io/projected/1009b217-f91e-4a17-8332-bf2d0ebc75aa-kube-api-access-vj485\") pod \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " Apr 21 02:42:55.253447 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.253375 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1009b217-f91e-4a17-8332-bf2d0ebc75aa-image-registry-private-configuration\") pod \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " Apr 21 02:42:55.253447 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.253409 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1009b217-f91e-4a17-8332-bf2d0ebc75aa-bound-sa-token\") pod \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " Apr 21 02:42:55.253615 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.253445 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1009b217-f91e-4a17-8332-bf2d0ebc75aa-registry-certificates\") pod \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\" (UID: \"1009b217-f91e-4a17-8332-bf2d0ebc75aa\") " Apr 21 02:42:55.253615 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.253481 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1009b217-f91e-4a17-8332-bf2d0ebc75aa-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1009b217-f91e-4a17-8332-bf2d0ebc75aa" (UID: "1009b217-f91e-4a17-8332-bf2d0ebc75aa"). 
InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:42:55.253710 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.253682 2564 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1009b217-f91e-4a17-8332-bf2d0ebc75aa-ca-trust-extracted\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:42:55.253790 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.253766 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1009b217-f91e-4a17-8332-bf2d0ebc75aa-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1009b217-f91e-4a17-8332-bf2d0ebc75aa" (UID: "1009b217-f91e-4a17-8332-bf2d0ebc75aa"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:42:55.253869 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.253857 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1009b217-f91e-4a17-8332-bf2d0ebc75aa-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1009b217-f91e-4a17-8332-bf2d0ebc75aa" (UID: "1009b217-f91e-4a17-8332-bf2d0ebc75aa"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:42:55.255490 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.255456 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1009b217-f91e-4a17-8332-bf2d0ebc75aa-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "1009b217-f91e-4a17-8332-bf2d0ebc75aa" (UID: "1009b217-f91e-4a17-8332-bf2d0ebc75aa"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:42:55.255490 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.255480 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1009b217-f91e-4a17-8332-bf2d0ebc75aa-kube-api-access-vj485" (OuterVolumeSpecName: "kube-api-access-vj485") pod "1009b217-f91e-4a17-8332-bf2d0ebc75aa" (UID: "1009b217-f91e-4a17-8332-bf2d0ebc75aa"). InnerVolumeSpecName "kube-api-access-vj485". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:42:55.255637 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.255470 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1009b217-f91e-4a17-8332-bf2d0ebc75aa-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1009b217-f91e-4a17-8332-bf2d0ebc75aa" (UID: "1009b217-f91e-4a17-8332-bf2d0ebc75aa"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:42:55.255732 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.255714 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1009b217-f91e-4a17-8332-bf2d0ebc75aa-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1009b217-f91e-4a17-8332-bf2d0ebc75aa" (UID: "1009b217-f91e-4a17-8332-bf2d0ebc75aa"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:42:55.255779 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.255761 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1009b217-f91e-4a17-8332-bf2d0ebc75aa-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1009b217-f91e-4a17-8332-bf2d0ebc75aa" (UID: "1009b217-f91e-4a17-8332-bf2d0ebc75aa"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:42:55.288959 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.288942 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-crw5h" Apr 21 02:42:55.299569 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.299551 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-8zx8l" Apr 21 02:42:55.307270 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.307252 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8lx74" Apr 21 02:42:55.311833 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.311807 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-z6lqg" Apr 21 02:42:55.354305 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.354268 2564 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1009b217-f91e-4a17-8332-bf2d0ebc75aa-image-registry-private-configuration\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:42:55.354556 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.354310 2564 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1009b217-f91e-4a17-8332-bf2d0ebc75aa-bound-sa-token\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:42:55.354556 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.354329 2564 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1009b217-f91e-4a17-8332-bf2d0ebc75aa-registry-certificates\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:42:55.354556 ip-10-0-131-170 kubenswrapper[2564]: I0421 
02:42:55.354345 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1009b217-f91e-4a17-8332-bf2d0ebc75aa-trusted-ca\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:42:55.354556 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.354358 2564 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1009b217-f91e-4a17-8332-bf2d0ebc75aa-installation-pull-secrets\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:42:55.354556 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.354371 2564 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1009b217-f91e-4a17-8332-bf2d0ebc75aa-registry-tls\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:42:55.354556 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.354385 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vj485\" (UniqueName: \"kubernetes.io/projected/1009b217-f91e-4a17-8332-bf2d0ebc75aa-kube-api-access-vj485\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:42:55.453267 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.453209 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-crw5h"] Apr 21 02:42:55.457749 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:42:55.457720 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9dd6f13_133f_4d41_ae44_e3221c8d6b70.slice/crio-98a9320620b6567d44adaeb25f4aaccb24c2b9d276b4546e6f30f92dfe9e96af WatchSource:0}: Error finding container 98a9320620b6567d44adaeb25f4aaccb24c2b9d276b4546e6f30f92dfe9e96af: Status 404 returned error can't find the container with id 98a9320620b6567d44adaeb25f4aaccb24c2b9d276b4546e6f30f92dfe9e96af Apr 21 02:42:55.468250 
ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.467799 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-8zx8l"] Apr 21 02:42:55.472379 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:42:55.472355 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56684ff1_c656_4b66_8fa3_541d09278ff9.slice/crio-ccc926d821c2786a91afcf4e9fe47f224859dd1c5b3733be889da7061331b534 WatchSource:0}: Error finding container ccc926d821c2786a91afcf4e9fe47f224859dd1c5b3733be889da7061331b534: Status 404 returned error can't find the container with id ccc926d821c2786a91afcf4e9fe47f224859dd1c5b3733be889da7061331b534 Apr 21 02:42:55.481402 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.481346 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-8lx74"] Apr 21 02:42:55.490692 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:42:55.490664 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9cd2e96_ef84_49dd_b7ca_9fb0cc36aab2.slice/crio-f10c06b35036457e6715e0a9f9e2568ef2ba1c8d534a1267b6b0635baa59329e WatchSource:0}: Error finding container f10c06b35036457e6715e0a9f9e2568ef2ba1c8d534a1267b6b0635baa59329e: Status 404 returned error can't find the container with id f10c06b35036457e6715e0a9f9e2568ef2ba1c8d534a1267b6b0635baa59329e Apr 21 02:42:55.493224 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:55.493208 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-z6lqg"] Apr 21 02:42:55.496482 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:42:55.496460 2564 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f22dea5_13ae_41ea_8a44_0677d956ef0b.slice/crio-c737fd31847bdc5d1716bfc30d72a0e24cd04a23a404ab7ff3769c29bc6d680d WatchSource:0}: Error finding container c737fd31847bdc5d1716bfc30d72a0e24cd04a23a404ab7ff3769c29bc6d680d: Status 404 returned error can't find the container with id c737fd31847bdc5d1716bfc30d72a0e24cd04a23a404ab7ff3769c29bc6d680d Apr 21 02:42:56.203069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:56.202998 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z6lqg" event={"ID":"7f22dea5-13ae-41ea-8a44-0677d956ef0b","Type":"ContainerStarted","Data":"dc811c6cbf12c22da302a9ee064f7e261fb654c0328893679397e9942d38f394"} Apr 21 02:42:56.203069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:56.203038 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z6lqg" event={"ID":"7f22dea5-13ae-41ea-8a44-0677d956ef0b","Type":"ContainerStarted","Data":"efb752613a6e42df09dac8de99cdbcd61b27ac0ddc5b89425baeb8a6fd52e20b"} Apr 21 02:42:56.203069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:56.203051 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z6lqg" event={"ID":"7f22dea5-13ae-41ea-8a44-0677d956ef0b","Type":"ContainerStarted","Data":"c737fd31847bdc5d1716bfc30d72a0e24cd04a23a404ab7ff3769c29bc6d680d"} Apr 21 02:42:56.204108 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:56.204085 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8lx74" event={"ID":"c9cd2e96-ef84-49dd-b7ca-9fb0cc36aab2","Type":"ContainerStarted","Data":"f10c06b35036457e6715e0a9f9e2568ef2ba1c8d534a1267b6b0635baa59329e"} Apr 21 02:42:56.205695 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:56.205573 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-crw5h" event={"ID":"e9dd6f13-133f-4d41-ae44-e3221c8d6b70","Type":"ContainerStarted","Data":"201807a5df7f5fe405a0fe65a7f01322da28e47458b9cbe013c96ec9659d4e99"} Apr 21 02:42:56.205695 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:56.205601 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-crw5h" event={"ID":"e9dd6f13-133f-4d41-ae44-e3221c8d6b70","Type":"ContainerStarted","Data":"98a9320620b6567d44adaeb25f4aaccb24c2b9d276b4546e6f30f92dfe9e96af"} Apr 21 02:42:56.206948 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:56.206908 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-8zx8l" event={"ID":"56684ff1-c656-4b66-8fa3-541d09278ff9","Type":"ContainerStarted","Data":"ccc926d821c2786a91afcf4e9fe47f224859dd1c5b3733be889da7061331b534"} Apr 21 02:42:56.207051 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:56.206947 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-555fd6467b-sbrs4" Apr 21 02:42:56.220169 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:56.220123 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-crw5h" podStartSLOduration=2.220109735 podStartE2EDuration="2.220109735s" podCreationTimestamp="2026-04-21 02:42:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:42:56.218525166 +0000 UTC m=+108.986013860" watchObservedRunningTime="2026-04-21 02:42:56.220109735 +0000 UTC m=+108.987598424" Apr 21 02:42:56.242756 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:56.242726 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-555fd6467b-sbrs4"] Apr 21 02:42:56.246595 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:56.246550 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-555fd6467b-sbrs4"] Apr 21 02:42:57.179964 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.179938 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4mh7f" Apr 21 02:42:57.212380 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.212343 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8lx74" event={"ID":"c9cd2e96-ef84-49dd-b7ca-9fb0cc36aab2","Type":"ContainerStarted","Data":"c82635ecb62ba59acc0a141f24aaf56223de34208133a504209c73f1d5ce92ea"} Apr 21 02:42:57.530594 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.528031 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8lx74" podStartSLOduration=2.519925499 podStartE2EDuration="3.528012668s" podCreationTimestamp="2026-04-21 02:42:54 
+0000 UTC" firstStartedPulling="2026-04-21 02:42:55.492590381 +0000 UTC m=+108.260079048" lastFinishedPulling="2026-04-21 02:42:56.500677534 +0000 UTC m=+109.268166217" observedRunningTime="2026-04-21 02:42:57.226249859 +0000 UTC m=+109.993738548" watchObservedRunningTime="2026-04-21 02:42:57.528012668 +0000 UTC m=+110.295501356" Apr 21 02:42:57.530594 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.528949 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9dm6g"] Apr 21 02:42:57.534322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.534294 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" Apr 21 02:42:57.536713 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.536686 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 21 02:42:57.536859 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.536693 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 21 02:42:57.536859 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.536818 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 02:42:57.537025 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.536918 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-bnpxw\"" Apr 21 02:42:57.540873 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.540853 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9dm6g"] Apr 21 02:42:57.671640 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.671416 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/241f3ec9-1e35-4441-b5c9-a053f2a3307c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9dm6g\" (UID: \"241f3ec9-1e35-4441-b5c9-a053f2a3307c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" Apr 21 02:42:57.671640 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.671468 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/241f3ec9-1e35-4441-b5c9-a053f2a3307c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9dm6g\" (UID: \"241f3ec9-1e35-4441-b5c9-a053f2a3307c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" Apr 21 02:42:57.671640 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.671524 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xrrj\" (UniqueName: \"kubernetes.io/projected/241f3ec9-1e35-4441-b5c9-a053f2a3307c-kube-api-access-5xrrj\") pod \"prometheus-operator-5676c8c784-9dm6g\" (UID: \"241f3ec9-1e35-4441-b5c9-a053f2a3307c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" Apr 21 02:42:57.671640 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.671557 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/241f3ec9-1e35-4441-b5c9-a053f2a3307c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9dm6g\" (UID: \"241f3ec9-1e35-4441-b5c9-a053f2a3307c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" Apr 21 02:42:57.772461 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.772435 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/241f3ec9-1e35-4441-b5c9-a053f2a3307c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9dm6g\" (UID: \"241f3ec9-1e35-4441-b5c9-a053f2a3307c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" Apr 21 02:42:57.772596 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.772480 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/241f3ec9-1e35-4441-b5c9-a053f2a3307c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9dm6g\" (UID: \"241f3ec9-1e35-4441-b5c9-a053f2a3307c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" Apr 21 02:42:57.772596 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.772523 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xrrj\" (UniqueName: \"kubernetes.io/projected/241f3ec9-1e35-4441-b5c9-a053f2a3307c-kube-api-access-5xrrj\") pod \"prometheus-operator-5676c8c784-9dm6g\" (UID: \"241f3ec9-1e35-4441-b5c9-a053f2a3307c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" Apr 21 02:42:57.772596 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.772555 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/241f3ec9-1e35-4441-b5c9-a053f2a3307c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9dm6g\" (UID: \"241f3ec9-1e35-4441-b5c9-a053f2a3307c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" Apr 21 02:42:57.772748 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:57.772618 2564 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 21 02:42:57.772748 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:42:57.772683 2564 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/241f3ec9-1e35-4441-b5c9-a053f2a3307c-prometheus-operator-tls podName:241f3ec9-1e35-4441-b5c9-a053f2a3307c nodeName:}" failed. No retries permitted until 2026-04-21 02:42:58.272662763 +0000 UTC m=+111.040151434 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/241f3ec9-1e35-4441-b5c9-a053f2a3307c-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-9dm6g" (UID: "241f3ec9-1e35-4441-b5c9-a053f2a3307c") : secret "prometheus-operator-tls" not found Apr 21 02:42:57.773863 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.773845 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/241f3ec9-1e35-4441-b5c9-a053f2a3307c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9dm6g\" (UID: \"241f3ec9-1e35-4441-b5c9-a053f2a3307c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" Apr 21 02:42:57.775127 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.775106 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/241f3ec9-1e35-4441-b5c9-a053f2a3307c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9dm6g\" (UID: \"241f3ec9-1e35-4441-b5c9-a053f2a3307c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" Apr 21 02:42:57.781211 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.781139 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xrrj\" (UniqueName: \"kubernetes.io/projected/241f3ec9-1e35-4441-b5c9-a053f2a3307c-kube-api-access-5xrrj\") pod \"prometheus-operator-5676c8c784-9dm6g\" (UID: \"241f3ec9-1e35-4441-b5c9-a053f2a3307c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" Apr 21 02:42:57.845224 ip-10-0-131-170 kubenswrapper[2564]: I0421 
02:42:57.845187 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1009b217-f91e-4a17-8332-bf2d0ebc75aa" path="/var/lib/kubelet/pods/1009b217-f91e-4a17-8332-bf2d0ebc75aa/volumes" Apr 21 02:42:57.949692 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.949667 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-554b4c9b8b-xbtp7"] Apr 21 02:42:57.953213 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.953191 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:57.955531 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.955375 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 02:42:57.955531 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.955382 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 02:42:57.955531 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.955482 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 02:42:57.955740 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.955386 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 02:42:57.956112 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.956090 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-9whnt\"" Apr 21 02:42:57.956214 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.956095 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 02:42:57.964967 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:57.964893 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-554b4c9b8b-xbtp7"] Apr 21 02:42:58.074588 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.074511 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/146fade0-ee27-4353-b36c-7e31717edebb-console-serving-cert\") pod \"console-554b4c9b8b-xbtp7\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") " pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.074755 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.074609 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/146fade0-ee27-4353-b36c-7e31717edebb-console-config\") pod \"console-554b4c9b8b-xbtp7\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") " pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.074755 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.074638 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/146fade0-ee27-4353-b36c-7e31717edebb-console-oauth-config\") pod \"console-554b4c9b8b-xbtp7\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") " pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.074755 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.074661 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/146fade0-ee27-4353-b36c-7e31717edebb-service-ca\") pod \"console-554b4c9b8b-xbtp7\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") " pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.074755 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.074688 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/146fade0-ee27-4353-b36c-7e31717edebb-oauth-serving-cert\") pod \"console-554b4c9b8b-xbtp7\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") " pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.075003 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.074793 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jj74\" (UniqueName: \"kubernetes.io/projected/146fade0-ee27-4353-b36c-7e31717edebb-kube-api-access-6jj74\") pod \"console-554b4c9b8b-xbtp7\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") " pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.176115 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.176085 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jj74\" (UniqueName: \"kubernetes.io/projected/146fade0-ee27-4353-b36c-7e31717edebb-kube-api-access-6jj74\") pod \"console-554b4c9b8b-xbtp7\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") " pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.176270 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.176127 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/146fade0-ee27-4353-b36c-7e31717edebb-console-serving-cert\") pod \"console-554b4c9b8b-xbtp7\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") " pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.176270 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.176244 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/146fade0-ee27-4353-b36c-7e31717edebb-console-config\") pod \"console-554b4c9b8b-xbtp7\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") " pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.176396 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.176279 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/146fade0-ee27-4353-b36c-7e31717edebb-console-oauth-config\") pod \"console-554b4c9b8b-xbtp7\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") " pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.176396 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.176306 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/146fade0-ee27-4353-b36c-7e31717edebb-service-ca\") pod \"console-554b4c9b8b-xbtp7\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") " pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.176396 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.176339 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/146fade0-ee27-4353-b36c-7e31717edebb-oauth-serving-cert\") pod \"console-554b4c9b8b-xbtp7\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") " pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.177004 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.176977 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/146fade0-ee27-4353-b36c-7e31717edebb-console-config\") pod \"console-554b4c9b8b-xbtp7\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") " pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.177100 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.177060 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/146fade0-ee27-4353-b36c-7e31717edebb-oauth-serving-cert\") pod \"console-554b4c9b8b-xbtp7\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") " pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.177181 ip-10-0-131-170 
kubenswrapper[2564]: I0421 02:42:58.177144 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/146fade0-ee27-4353-b36c-7e31717edebb-service-ca\") pod \"console-554b4c9b8b-xbtp7\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") " pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.178859 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.178832 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/146fade0-ee27-4353-b36c-7e31717edebb-console-serving-cert\") pod \"console-554b4c9b8b-xbtp7\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") " pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.179027 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.179010 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/146fade0-ee27-4353-b36c-7e31717edebb-console-oauth-config\") pod \"console-554b4c9b8b-xbtp7\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") " pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.183553 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.183532 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jj74\" (UniqueName: \"kubernetes.io/projected/146fade0-ee27-4353-b36c-7e31717edebb-kube-api-access-6jj74\") pod \"console-554b4c9b8b-xbtp7\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") " pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.217993 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.217958 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z6lqg" event={"ID":"7f22dea5-13ae-41ea-8a44-0677d956ef0b","Type":"ContainerStarted","Data":"ff3eb0e9f57f104f73ebde59607930cdc59a9bda7c171deb5572de20a6dbd832"} Apr 21 02:42:58.234908 ip-10-0-131-170 kubenswrapper[2564]: I0421 
02:42:58.234866 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-z6lqg" podStartSLOduration=2.146628142 podStartE2EDuration="4.234854182s" podCreationTimestamp="2026-04-21 02:42:54 +0000 UTC" firstStartedPulling="2026-04-21 02:42:55.538234858 +0000 UTC m=+108.305723527" lastFinishedPulling="2026-04-21 02:42:57.626460888 +0000 UTC m=+110.393949567" observedRunningTime="2026-04-21 02:42:58.23340121 +0000 UTC m=+111.000889904" watchObservedRunningTime="2026-04-21 02:42:58.234854182 +0000 UTC m=+111.002342870" Apr 21 02:42:58.265142 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.265116 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:42:58.277020 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.276993 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/241f3ec9-1e35-4441-b5c9-a053f2a3307c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9dm6g\" (UID: \"241f3ec9-1e35-4441-b5c9-a053f2a3307c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" Apr 21 02:42:58.279677 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.279654 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/241f3ec9-1e35-4441-b5c9-a053f2a3307c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9dm6g\" (UID: \"241f3ec9-1e35-4441-b5c9-a053f2a3307c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" Apr 21 02:42:58.393490 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.393463 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-554b4c9b8b-xbtp7"] Apr 21 02:42:58.395821 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:42:58.395792 2564 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod146fade0_ee27_4353_b36c_7e31717edebb.slice/crio-edb4d80d957a4c24c187b683cede32d5cbc30a24817f5565889547c3fefe1d55 WatchSource:0}: Error finding container edb4d80d957a4c24c187b683cede32d5cbc30a24817f5565889547c3fefe1d55: Status 404 returned error can't find the container with id edb4d80d957a4c24c187b683cede32d5cbc30a24817f5565889547c3fefe1d55 Apr 21 02:42:58.446313 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.446290 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" Apr 21 02:42:58.577636 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:58.577605 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9dm6g"] Apr 21 02:42:58.581421 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:42:58.581393 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod241f3ec9_1e35_4441_b5c9_a053f2a3307c.slice/crio-d357d57190cbc9c4e5616cfa2ebbb80255a562a02df56e6640257078c0f83399 WatchSource:0}: Error finding container d357d57190cbc9c4e5616cfa2ebbb80255a562a02df56e6640257078c0f83399: Status 404 returned error can't find the container with id d357d57190cbc9c4e5616cfa2ebbb80255a562a02df56e6640257078c0f83399 Apr 21 02:42:59.226196 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:59.226150 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" event={"ID":"241f3ec9-1e35-4441-b5c9-a053f2a3307c","Type":"ContainerStarted","Data":"d357d57190cbc9c4e5616cfa2ebbb80255a562a02df56e6640257078c0f83399"} Apr 21 02:42:59.230148 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:42:59.230117 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-554b4c9b8b-xbtp7" 
event={"ID":"146fade0-ee27-4353-b36c-7e31717edebb","Type":"ContainerStarted","Data":"edb4d80d957a4c24c187b683cede32d5cbc30a24817f5565889547c3fefe1d55"} Apr 21 02:43:00.236110 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:00.235344 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" event={"ID":"241f3ec9-1e35-4441-b5c9-a053f2a3307c","Type":"ContainerStarted","Data":"a95056756bd5df4d31d3d87d088f8db707222276efe640813d6f9e80489f381d"} Apr 21 02:43:00.236110 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:00.235391 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" event={"ID":"241f3ec9-1e35-4441-b5c9-a053f2a3307c","Type":"ContainerStarted","Data":"dcb6275606cb3873958bfcb3e3ce8c84556a7f96894d16c91c7608bbcf6acbba"} Apr 21 02:43:00.252100 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:00.252050 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-9dm6g" podStartSLOduration=1.752574396 podStartE2EDuration="3.252031407s" podCreationTimestamp="2026-04-21 02:42:57 +0000 UTC" firstStartedPulling="2026-04-21 02:42:58.583544078 +0000 UTC m=+111.351032756" lastFinishedPulling="2026-04-21 02:43:00.083001088 +0000 UTC m=+112.850489767" observedRunningTime="2026-04-21 02:43:00.250473965 +0000 UTC m=+113.017962655" watchObservedRunningTime="2026-04-21 02:43:00.252031407 +0000 UTC m=+113.019520099" Apr 21 02:43:01.858610 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.858586 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7"] Apr 21 02:43:01.863037 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.863015 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" Apr 21 02:43:01.865955 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.865932 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 21 02:43:01.866059 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.866045 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-lgcws\"" Apr 21 02:43:01.866118 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.866105 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 21 02:43:01.887182 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.887141 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7"] Apr 21 02:43:01.888287 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.888265 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xzcxn"] Apr 21 02:43:01.891406 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.891387 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-6v2w9"] Apr 21 02:43:01.891608 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.891590 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:01.894483 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.894459 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-bznrr\"" Apr 21 02:43:01.894607 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.894567 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:01.894670 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.894653 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 21 02:43:01.894823 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.894802 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 21 02:43:01.895398 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.895383 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 21 02:43:01.897795 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.897596 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 02:43:01.897795 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.897623 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-blx9s\"" Apr 21 02:43:01.898644 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.898625 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 02:43:01.898968 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.898949 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 02:43:01.908172 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:01.908121 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xzcxn"] Apr 21 02:43:02.009520 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.009474 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xzcxn\" (UID: \"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.009680 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.009579 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhzf2\" (UniqueName: \"kubernetes.io/projected/bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8-kube-api-access-bhzf2\") pod \"kube-state-metrics-69db897b98-xzcxn\" (UID: \"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.009680 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.009605 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a19ff527-8170-4894-b42e-82cf578a53b8-node-exporter-wtmp\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.009680 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.009655 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a19ff527-8170-4894-b42e-82cf578a53b8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.009825 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.009740 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" 
(UniqueName: \"kubernetes.io/secret/b71279ec-544a-4616-926a-e46f30b44e79-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zjcs7\" (UID: \"b71279ec-544a-4616-926a-e46f30b44e79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" Apr 21 02:43:02.009825 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.009788 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a19ff527-8170-4894-b42e-82cf578a53b8-metrics-client-ca\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.009911 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.009833 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bthcv\" (UniqueName: \"kubernetes.io/projected/a19ff527-8170-4894-b42e-82cf578a53b8-kube-api-access-bthcv\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.009911 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.009873 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xzcxn\" (UID: \"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.009911 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.009897 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xzcxn\" (UID: 
\"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.010023 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.009928 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a19ff527-8170-4894-b42e-82cf578a53b8-root\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.010023 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.009946 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b71279ec-544a-4616-926a-e46f30b44e79-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zjcs7\" (UID: \"b71279ec-544a-4616-926a-e46f30b44e79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" Apr 21 02:43:02.010142 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.010022 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xzcxn\" (UID: \"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.010142 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.010082 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a19ff527-8170-4894-b42e-82cf578a53b8-node-exporter-tls\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.010142 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.010114 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a19ff527-8170-4894-b42e-82cf578a53b8-node-exporter-textfile\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.010565 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.010154 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xzcxn\" (UID: \"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.010634 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.010605 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a19ff527-8170-4894-b42e-82cf578a53b8-sys\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.010700 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.010662 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a19ff527-8170-4894-b42e-82cf578a53b8-node-exporter-accelerators-collector-config\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.010754 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.010701 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/b71279ec-544a-4616-926a-e46f30b44e79-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zjcs7\" (UID: \"b71279ec-544a-4616-926a-e46f30b44e79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" Apr 21 02:43:02.010828 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.010768 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wcmh\" (UniqueName: \"kubernetes.io/projected/b71279ec-544a-4616-926a-e46f30b44e79-kube-api-access-4wcmh\") pod \"openshift-state-metrics-9d44df66c-zjcs7\" (UID: \"b71279ec-544a-4616-926a-e46f30b44e79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" Apr 21 02:43:02.111583 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.111488 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b71279ec-544a-4616-926a-e46f30b44e79-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zjcs7\" (UID: \"b71279ec-544a-4616-926a-e46f30b44e79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" Apr 21 02:43:02.111583 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.111545 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a19ff527-8170-4894-b42e-82cf578a53b8-metrics-client-ca\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.111583 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.111578 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bthcv\" (UniqueName: \"kubernetes.io/projected/a19ff527-8170-4894-b42e-82cf578a53b8-kube-api-access-bthcv\") pod \"node-exporter-6v2w9\" (UID: 
\"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.111842 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.111607 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xzcxn\" (UID: \"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.111842 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.111633 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xzcxn\" (UID: \"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.111842 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.111710 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a19ff527-8170-4894-b42e-82cf578a53b8-root\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.111842 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.111755 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b71279ec-544a-4616-926a-e46f30b44e79-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zjcs7\" (UID: \"b71279ec-544a-4616-926a-e46f30b44e79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" Apr 21 02:43:02.111842 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.111795 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xzcxn\" (UID: \"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.111842 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.111829 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a19ff527-8170-4894-b42e-82cf578a53b8-node-exporter-tls\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.112128 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.111858 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a19ff527-8170-4894-b42e-82cf578a53b8-node-exporter-textfile\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.112128 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.111892 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xzcxn\" (UID: \"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.112128 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.111928 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a19ff527-8170-4894-b42e-82cf578a53b8-sys\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.112128 ip-10-0-131-170 kubenswrapper[2564]: 
I0421 02:43:02.111953 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a19ff527-8170-4894-b42e-82cf578a53b8-node-exporter-accelerators-collector-config\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.112128 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.111984 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b71279ec-544a-4616-926a-e46f30b44e79-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zjcs7\" (UID: \"b71279ec-544a-4616-926a-e46f30b44e79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" Apr 21 02:43:02.112128 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.112001 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a19ff527-8170-4894-b42e-82cf578a53b8-root\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.112128 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.112009 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wcmh\" (UniqueName: \"kubernetes.io/projected/b71279ec-544a-4616-926a-e46f30b44e79-kube-api-access-4wcmh\") pod \"openshift-state-metrics-9d44df66c-zjcs7\" (UID: \"b71279ec-544a-4616-926a-e46f30b44e79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" Apr 21 02:43:02.112128 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.112074 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xzcxn\" (UID: \"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.112128 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.112106 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhzf2\" (UniqueName: \"kubernetes.io/projected/bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8-kube-api-access-bhzf2\") pod \"kube-state-metrics-69db897b98-xzcxn\" (UID: \"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.112128 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.112131 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a19ff527-8170-4894-b42e-82cf578a53b8-node-exporter-wtmp\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.112727 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.112179 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a19ff527-8170-4894-b42e-82cf578a53b8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.112727 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.112300 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a19ff527-8170-4894-b42e-82cf578a53b8-sys\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.112727 ip-10-0-131-170 
kubenswrapper[2564]: I0421 02:43:02.112296 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a19ff527-8170-4894-b42e-82cf578a53b8-metrics-client-ca\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.112727 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.112377 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xzcxn\" (UID: \"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.112727 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:43:02.112406 2564 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 21 02:43:02.112727 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:43:02.112462 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b71279ec-544a-4616-926a-e46f30b44e79-openshift-state-metrics-tls podName:b71279ec-544a-4616-926a-e46f30b44e79 nodeName:}" failed. No retries permitted until 2026-04-21 02:43:02.612443497 +0000 UTC m=+115.379932167 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/b71279ec-544a-4616-926a-e46f30b44e79-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-zjcs7" (UID: "b71279ec-544a-4616-926a-e46f30b44e79") : secret "openshift-state-metrics-tls" not found Apr 21 02:43:02.112727 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.112693 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b71279ec-544a-4616-926a-e46f30b44e79-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zjcs7\" (UID: \"b71279ec-544a-4616-926a-e46f30b44e79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" Apr 21 02:43:02.113084 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.112878 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a19ff527-8170-4894-b42e-82cf578a53b8-node-exporter-accelerators-collector-config\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.113084 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.112956 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a19ff527-8170-4894-b42e-82cf578a53b8-node-exporter-textfile\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.113084 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.113018 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xzcxn\" 
(UID: \"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.113238 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.113195 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a19ff527-8170-4894-b42e-82cf578a53b8-node-exporter-wtmp\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.113322 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.113287 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xzcxn\" (UID: \"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.114801 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.114758 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b71279ec-544a-4616-926a-e46f30b44e79-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zjcs7\" (UID: \"b71279ec-544a-4616-926a-e46f30b44e79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" Apr 21 02:43:02.115330 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.115311 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a19ff527-8170-4894-b42e-82cf578a53b8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.115771 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.115747 2564 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xzcxn\" (UID: \"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.115960 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.115935 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xzcxn\" (UID: \"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.116326 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.116309 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a19ff527-8170-4894-b42e-82cf578a53b8-node-exporter-tls\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.120474 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.120451 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bthcv\" (UniqueName: \"kubernetes.io/projected/a19ff527-8170-4894-b42e-82cf578a53b8-kube-api-access-bthcv\") pod \"node-exporter-6v2w9\" (UID: \"a19ff527-8170-4894-b42e-82cf578a53b8\") " pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.120585 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.120471 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wcmh\" (UniqueName: \"kubernetes.io/projected/b71279ec-544a-4616-926a-e46f30b44e79-kube-api-access-4wcmh\") pod \"openshift-state-metrics-9d44df66c-zjcs7\" (UID: \"b71279ec-544a-4616-926a-e46f30b44e79\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" Apr 21 02:43:02.121016 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.120995 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhzf2\" (UniqueName: \"kubernetes.io/projected/bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8-kube-api-access-bhzf2\") pod \"kube-state-metrics-69db897b98-xzcxn\" (UID: \"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.210543 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.210518 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" Apr 21 02:43:02.218453 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.218427 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6v2w9" Apr 21 02:43:02.243656 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.243626 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-554b4c9b8b-xbtp7" event={"ID":"146fade0-ee27-4353-b36c-7e31717edebb","Type":"ContainerStarted","Data":"0714f7f522a26493de5d8dde2284dca5b87eb9a3f7f6330b6e351c8ffe6ef69d"} Apr 21 02:43:02.244997 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.244959 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6v2w9" event={"ID":"a19ff527-8170-4894-b42e-82cf578a53b8","Type":"ContainerStarted","Data":"ffe0f3a04e44a725403118642aa7f831565a43903433320e59ce7c7ca7b29cbe"} Apr 21 02:43:02.261126 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.261081 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-554b4c9b8b-xbtp7" podStartSLOduration=1.8262258139999998 podStartE2EDuration="5.261067198s" podCreationTimestamp="2026-04-21 02:42:57 +0000 UTC" firstStartedPulling="2026-04-21 02:42:58.397991111 +0000 UTC 
m=+111.165479784" lastFinishedPulling="2026-04-21 02:43:01.832832486 +0000 UTC m=+114.600321168" observedRunningTime="2026-04-21 02:43:02.259435768 +0000 UTC m=+115.026924471" watchObservedRunningTime="2026-04-21 02:43:02.261067198 +0000 UTC m=+115.028555884" Apr 21 02:43:02.350289 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.350126 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xzcxn"] Apr 21 02:43:02.352967 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:43:02.352937 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc9d2f69_ae25_4694_bb4a_a880e7e9d3f8.slice/crio-d3b2667222c06f62a477d17e09e7224d1950950f7bd070144b72fd6a0f7635e2 WatchSource:0}: Error finding container d3b2667222c06f62a477d17e09e7224d1950950f7bd070144b72fd6a0f7635e2: Status 404 returned error can't find the container with id d3b2667222c06f62a477d17e09e7224d1950950f7bd070144b72fd6a0f7635e2 Apr 21 02:43:02.617612 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.617576 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b71279ec-544a-4616-926a-e46f30b44e79-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zjcs7\" (UID: \"b71279ec-544a-4616-926a-e46f30b44e79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" Apr 21 02:43:02.620679 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.620657 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b71279ec-544a-4616-926a-e46f30b44e79-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zjcs7\" (UID: \"b71279ec-544a-4616-926a-e46f30b44e79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" Apr 21 02:43:02.801069 ip-10-0-131-170 kubenswrapper[2564]: 
I0421 02:43:02.801040 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" Apr 21 02:43:02.961060 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:02.960874 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7"] Apr 21 02:43:03.052313 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:43:03.052279 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb71279ec_544a_4616_926a_e46f30b44e79.slice/crio-e32643d3ef3c526bf4c221b413441e0dca129a172f273c3e3c8ebc15a44d0b31 WatchSource:0}: Error finding container e32643d3ef3c526bf4c221b413441e0dca129a172f273c3e3c8ebc15a44d0b31: Status 404 returned error can't find the container with id e32643d3ef3c526bf4c221b413441e0dca129a172f273c3e3c8ebc15a44d0b31 Apr 21 02:43:03.254950 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:03.253336 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6v2w9" event={"ID":"a19ff527-8170-4894-b42e-82cf578a53b8","Type":"ContainerStarted","Data":"9b110b776872531fec428c4c03f79e4ddcb4bf14de11ebca6300e506c13abdb4"} Apr 21 02:43:03.257116 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:03.257032 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" event={"ID":"b71279ec-544a-4616-926a-e46f30b44e79","Type":"ContainerStarted","Data":"8e5ed6f4232b1fd866eff720eea7ad5cd8a48a7fbd860e21a1d6a0a74b9fa19b"} Apr 21 02:43:03.257116 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:03.257068 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" event={"ID":"b71279ec-544a-4616-926a-e46f30b44e79","Type":"ContainerStarted","Data":"bbb0589cedcedd08b4d37ac4956a724695f90209bcc49d5c52827a53027e47cb"} Apr 21 02:43:03.257116 
ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:03.257082 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" event={"ID":"b71279ec-544a-4616-926a-e46f30b44e79","Type":"ContainerStarted","Data":"e32643d3ef3c526bf4c221b413441e0dca129a172f273c3e3c8ebc15a44d0b31"} Apr 21 02:43:03.259181 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:03.259155 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" event={"ID":"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8","Type":"ContainerStarted","Data":"d3b2667222c06f62a477d17e09e7224d1950950f7bd070144b72fd6a0f7635e2"} Apr 21 02:43:04.265158 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:04.264430 2564 generic.go:358] "Generic (PLEG): container finished" podID="a19ff527-8170-4894-b42e-82cf578a53b8" containerID="9b110b776872531fec428c4c03f79e4ddcb4bf14de11ebca6300e506c13abdb4" exitCode=0 Apr 21 02:43:04.265158 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:04.264768 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6v2w9" event={"ID":"a19ff527-8170-4894-b42e-82cf578a53b8","Type":"ContainerDied","Data":"9b110b776872531fec428c4c03f79e4ddcb4bf14de11ebca6300e506c13abdb4"} Apr 21 02:43:04.268763 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:04.268691 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" event={"ID":"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8","Type":"ContainerStarted","Data":"e2bc1f4419e7e8cca777cb5613151bd50b0a8592a210bdd18e417fea9b1e7f17"} Apr 21 02:43:04.268763 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:04.268727 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" event={"ID":"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8","Type":"ContainerStarted","Data":"5395d798d31ab0c5dbcbc622fdc1140e63b73152ecf3cc88bda4276674ddbbd6"} 
Apr 21 02:43:04.268763 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:04.268740 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" event={"ID":"bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8","Type":"ContainerStarted","Data":"4f11d02154415145ca6cf6cd4cebad49d30b9966bec71960d1627e5e79ab5cce"} Apr 21 02:43:04.301333 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:04.301262 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-xzcxn" podStartSLOduration=1.781690201 podStartE2EDuration="3.301246634s" podCreationTimestamp="2026-04-21 02:43:01 +0000 UTC" firstStartedPulling="2026-04-21 02:43:02.355073122 +0000 UTC m=+115.122561789" lastFinishedPulling="2026-04-21 02:43:03.874629543 +0000 UTC m=+116.642118222" observedRunningTime="2026-04-21 02:43:04.299774582 +0000 UTC m=+117.067263272" watchObservedRunningTime="2026-04-21 02:43:04.301246634 +0000 UTC m=+117.068735325" Apr 21 02:43:05.274256 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:05.274214 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6v2w9" event={"ID":"a19ff527-8170-4894-b42e-82cf578a53b8","Type":"ContainerStarted","Data":"3835c4a8689842a4d1d21ea9bb3c3d286e4d8c924d9ec835956f9a2a1d406f75"} Apr 21 02:43:05.274256 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:05.274260 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6v2w9" event={"ID":"a19ff527-8170-4894-b42e-82cf578a53b8","Type":"ContainerStarted","Data":"765e4fb406b8b9ddc871c3c1f9636a1adc96a396e58abcb954bfc3b163c37bae"} Apr 21 02:43:05.276314 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:05.276280 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" 
event={"ID":"b71279ec-544a-4616-926a-e46f30b44e79","Type":"ContainerStarted","Data":"22795710cdf45c2701a3c2d0d956bb181c8a9b03a205294c40e1569278d919a7"} Apr 21 02:43:05.292235 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:05.292183 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-6v2w9" podStartSLOduration=3.418201356 podStartE2EDuration="4.292166182s" podCreationTimestamp="2026-04-21 02:43:01 +0000 UTC" firstStartedPulling="2026-04-21 02:43:02.231213671 +0000 UTC m=+114.998702357" lastFinishedPulling="2026-04-21 02:43:03.105178504 +0000 UTC m=+115.872667183" observedRunningTime="2026-04-21 02:43:05.291391667 +0000 UTC m=+118.058880359" watchObservedRunningTime="2026-04-21 02:43:05.292166182 +0000 UTC m=+118.059654871" Apr 21 02:43:05.312873 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:05.312832 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zjcs7" podStartSLOduration=2.89767548 podStartE2EDuration="4.312817716s" podCreationTimestamp="2026-04-21 02:43:01 +0000 UTC" firstStartedPulling="2026-04-21 02:43:03.212081352 +0000 UTC m=+115.979570033" lastFinishedPulling="2026-04-21 02:43:04.627223585 +0000 UTC m=+117.394712269" observedRunningTime="2026-04-21 02:43:05.311765959 +0000 UTC m=+118.079254650" watchObservedRunningTime="2026-04-21 02:43:05.312817716 +0000 UTC m=+118.080306406" Apr 21 02:43:06.286871 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.286841 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-64cf96cf4f-pmz7q"] Apr 21 02:43:06.291568 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.291546 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.293723 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.293700 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 21 02:43:06.293808 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.293747 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-eitrqf2r90jog\"" Apr 21 02:43:06.294786 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.294764 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 21 02:43:06.294786 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.294778 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 02:43:06.294940 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.294802 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-4lmbp\"" Apr 21 02:43:06.294940 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.294895 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 21 02:43:06.299812 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.299637 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-64cf96cf4f-pmz7q"] Apr 21 02:43:06.460616 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.460578 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/511a0b61-11aa-4e50-8cae-9f8fd2978a00-client-ca-bundle\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " 
pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.460788 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.460634 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsgdb\" (UniqueName: \"kubernetes.io/projected/511a0b61-11aa-4e50-8cae-9f8fd2978a00-kube-api-access-wsgdb\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.460788 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.460656 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/511a0b61-11aa-4e50-8cae-9f8fd2978a00-secret-metrics-server-client-certs\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.460788 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.460725 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/511a0b61-11aa-4e50-8cae-9f8fd2978a00-audit-log\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.460788 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.460784 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/511a0b61-11aa-4e50-8cae-9f8fd2978a00-secret-metrics-server-tls\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.460992 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.460830 2564 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/511a0b61-11aa-4e50-8cae-9f8fd2978a00-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.460992 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.460895 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/511a0b61-11aa-4e50-8cae-9f8fd2978a00-metrics-server-audit-profiles\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.562013 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.561926 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wsgdb\" (UniqueName: \"kubernetes.io/projected/511a0b61-11aa-4e50-8cae-9f8fd2978a00-kube-api-access-wsgdb\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.562013 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.561978 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/511a0b61-11aa-4e50-8cae-9f8fd2978a00-secret-metrics-server-client-certs\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.562236 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.562019 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/511a0b61-11aa-4e50-8cae-9f8fd2978a00-audit-log\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.562236 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.562061 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/511a0b61-11aa-4e50-8cae-9f8fd2978a00-secret-metrics-server-tls\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.562236 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.562103 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/511a0b61-11aa-4e50-8cae-9f8fd2978a00-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.562236 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.562147 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/511a0b61-11aa-4e50-8cae-9f8fd2978a00-metrics-server-audit-profiles\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.562236 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.562194 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/511a0b61-11aa-4e50-8cae-9f8fd2978a00-client-ca-bundle\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " 
pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.562918 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.562895 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/511a0b61-11aa-4e50-8cae-9f8fd2978a00-audit-log\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.563258 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.563237 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/511a0b61-11aa-4e50-8cae-9f8fd2978a00-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.563642 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.563621 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/511a0b61-11aa-4e50-8cae-9f8fd2978a00-metrics-server-audit-profiles\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.565131 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.565098 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/511a0b61-11aa-4e50-8cae-9f8fd2978a00-client-ca-bundle\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.565223 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.565145 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/511a0b61-11aa-4e50-8cae-9f8fd2978a00-secret-metrics-server-client-certs\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.565602 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.565581 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/511a0b61-11aa-4e50-8cae-9f8fd2978a00-secret-metrics-server-tls\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.569243 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.569221 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsgdb\" (UniqueName: \"kubernetes.io/projected/511a0b61-11aa-4e50-8cae-9f8fd2978a00-kube-api-access-wsgdb\") pod \"metrics-server-64cf96cf4f-pmz7q\" (UID: \"511a0b61-11aa-4e50-8cae-9f8fd2978a00\") " pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.603476 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.603093 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" Apr 21 02:43:06.650905 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.650878 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-5qc6q"] Apr 21 02:43:06.655544 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.655519 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5qc6q" Apr 21 02:43:06.657700 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.657679 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 21 02:43:06.657820 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.657756 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-z2s8v\"" Apr 21 02:43:06.661730 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.661695 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-5qc6q"] Apr 21 02:43:06.763449 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.763421 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/518ac0b9-1cbc-45a2-a669-593b5e3aacf4-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-5qc6q\" (UID: \"518ac0b9-1cbc-45a2-a669-593b5e3aacf4\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5qc6q" Apr 21 02:43:06.864621 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.864541 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/518ac0b9-1cbc-45a2-a669-593b5e3aacf4-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-5qc6q\" (UID: \"518ac0b9-1cbc-45a2-a669-593b5e3aacf4\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5qc6q" Apr 21 02:43:06.867185 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.867155 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/518ac0b9-1cbc-45a2-a669-593b5e3aacf4-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-5qc6q\" (UID: \"518ac0b9-1cbc-45a2-a669-593b5e3aacf4\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5qc6q" Apr 21 02:43:06.943180 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.943149 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6ccf7b6bbc-jv6tn"] Apr 21 02:43:06.950518 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.950480 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:06.955391 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.955369 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ccf7b6bbc-jv6tn"] Apr 21 02:43:06.957908 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.957890 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 02:43:06.968168 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:06.968150 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5qc6q" Apr 21 02:43:07.066339 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.066286 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-console-config\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.066339 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.066343 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjtl5\" (UniqueName: \"kubernetes.io/projected/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-kube-api-access-gjtl5\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.066607 ip-10-0-131-170 kubenswrapper[2564]: 
I0421 02:43:07.066369 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-console-serving-cert\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.066607 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.066423 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-trusted-ca-bundle\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.066607 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.066455 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-oauth-serving-cert\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.066607 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.066484 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-console-oauth-config\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.066607 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.066532 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-service-ca\") pod \"console-6ccf7b6bbc-jv6tn\" 
(UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.167794 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.167715 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-trusted-ca-bundle\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.167794 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.167756 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-oauth-serving-cert\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.167794 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.167793 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-console-oauth-config\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.168069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.167824 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-service-ca\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.168069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.167889 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-console-config\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.168069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.167921 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjtl5\" (UniqueName: \"kubernetes.io/projected/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-kube-api-access-gjtl5\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.168069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.167953 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-console-serving-cert\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.168694 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.168659 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-oauth-serving-cert\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.168838 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.168811 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-trusted-ca-bundle\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.169285 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.169261 2564 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-console-config\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.169365 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.169281 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-service-ca\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.170764 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.170739 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-console-serving-cert\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.170764 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.170753 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-console-oauth-config\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.178718 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.178696 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjtl5\" (UniqueName: \"kubernetes.io/projected/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-kube-api-access-gjtl5\") pod \"console-6ccf7b6bbc-jv6tn\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:07.262157 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:07.262123 2564 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:08.265817 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:08.265776 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:43:08.266320 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:08.265931 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:43:08.272032 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:08.272010 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:43:08.291718 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:08.291692 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-554b4c9b8b-xbtp7" Apr 21 02:43:12.653174 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:12.653095 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ccf7b6bbc-jv6tn"] Apr 21 02:43:12.664662 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:12.664612 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-64cf96cf4f-pmz7q"] Apr 21 02:43:12.667862 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:43:12.667825 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod511a0b61_11aa_4e50_8cae_9f8fd2978a00.slice/crio-df6b3c86bc1576540a0a9e025dc1b8ac38c72846b57d0d25caf03ca9e8694f62 WatchSource:0}: Error finding container df6b3c86bc1576540a0a9e025dc1b8ac38c72846b57d0d25caf03ca9e8694f62: Status 404 returned error can't find the container with id df6b3c86bc1576540a0a9e025dc1b8ac38c72846b57d0d25caf03ca9e8694f62 Apr 21 02:43:12.682995 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:12.682962 2564 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-5qc6q"] Apr 21 02:43:12.687467 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:43:12.687443 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod518ac0b9_1cbc_45a2_a669_593b5e3aacf4.slice/crio-552d06db59691c6363722da2932baccadb03e5b6c0be9a0004f1518af468a544 WatchSource:0}: Error finding container 552d06db59691c6363722da2932baccadb03e5b6c0be9a0004f1518af468a544: Status 404 returned error can't find the container with id 552d06db59691c6363722da2932baccadb03e5b6c0be9a0004f1518af468a544 Apr 21 02:43:13.304389 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:13.304306 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5qc6q" event={"ID":"518ac0b9-1cbc-45a2-a669-593b5e3aacf4","Type":"ContainerStarted","Data":"552d06db59691c6363722da2932baccadb03e5b6c0be9a0004f1518af468a544"} Apr 21 02:43:13.306489 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:13.306198 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-8zx8l" event={"ID":"56684ff1-c656-4b66-8fa3-541d09278ff9","Type":"ContainerStarted","Data":"55c52e39af90f541bec3ab475a0c238479cf7c75833436c507a5fca23bfc78a3"} Apr 21 02:43:13.306881 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:13.306527 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-8zx8l" Apr 21 02:43:13.309632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:13.309599 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" event={"ID":"511a0b61-11aa-4e50-8cae-9f8fd2978a00","Type":"ContainerStarted","Data":"df6b3c86bc1576540a0a9e025dc1b8ac38c72846b57d0d25caf03ca9e8694f62"} Apr 21 02:43:13.311958 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:13.311675 2564 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-console/console-6ccf7b6bbc-jv6tn" event={"ID":"c7c92a29-29f5-4a7f-a419-4c10e1c738f7","Type":"ContainerStarted","Data":"cb416550135e4593593dfb26af79ad7294d146bcd6a94ef3ebc734fbcc14b728"} Apr 21 02:43:13.311958 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:13.311710 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ccf7b6bbc-jv6tn" event={"ID":"c7c92a29-29f5-4a7f-a419-4c10e1c738f7","Type":"ContainerStarted","Data":"bd1fd28d45c786a3987bf5d516a54f2dca3e374c33a1a464e7ca1cac1c884909"} Apr 21 02:43:13.319866 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:13.319840 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-8zx8l" Apr 21 02:43:13.327630 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:13.327464 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-8zx8l" podStartSLOduration=2.178651598 podStartE2EDuration="19.327448439s" podCreationTimestamp="2026-04-21 02:42:54 +0000 UTC" firstStartedPulling="2026-04-21 02:42:55.474255797 +0000 UTC m=+108.241744465" lastFinishedPulling="2026-04-21 02:43:12.623052616 +0000 UTC m=+125.390541306" observedRunningTime="2026-04-21 02:43:13.325326021 +0000 UTC m=+126.092814723" watchObservedRunningTime="2026-04-21 02:43:13.327448439 +0000 UTC m=+126.094937133" Apr 21 02:43:13.367729 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:13.367669 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6ccf7b6bbc-jv6tn" podStartSLOduration=7.367654165 podStartE2EDuration="7.367654165s" podCreationTimestamp="2026-04-21 02:43:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:43:13.347814725 +0000 UTC m=+126.115303429" watchObservedRunningTime="2026-04-21 02:43:13.367654165 +0000 UTC 
m=+126.135142866" Apr 21 02:43:16.323788 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:16.323747 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" event={"ID":"511a0b61-11aa-4e50-8cae-9f8fd2978a00","Type":"ContainerStarted","Data":"f4dc13df758afd2015f20172014599bb09715d65d28eb7835540cbb5eedc9209"} Apr 21 02:43:16.325361 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:16.325302 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5qc6q" event={"ID":"518ac0b9-1cbc-45a2-a669-593b5e3aacf4","Type":"ContainerStarted","Data":"893ee84d8987e52a7b986596212a014e0c1d4385d43d03dbbf73edb844423e09"} Apr 21 02:43:16.325817 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:16.325715 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5qc6q" Apr 21 02:43:16.331353 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:16.331330 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5qc6q" Apr 21 02:43:16.340657 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:16.340618 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q" podStartSLOduration=7.778436695 podStartE2EDuration="10.340607326s" podCreationTimestamp="2026-04-21 02:43:06 +0000 UTC" firstStartedPulling="2026-04-21 02:43:12.672158468 +0000 UTC m=+125.439647135" lastFinishedPulling="2026-04-21 02:43:15.234329096 +0000 UTC m=+128.001817766" observedRunningTime="2026-04-21 02:43:16.339068296 +0000 UTC m=+129.106556984" watchObservedRunningTime="2026-04-21 02:43:16.340607326 +0000 UTC m=+129.108096009" Apr 21 02:43:16.355923 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:16.355881 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-5qc6q" podStartSLOduration=7.80722238 podStartE2EDuration="10.355869746s" podCreationTimestamp="2026-04-21 02:43:06 +0000 UTC" firstStartedPulling="2026-04-21 02:43:12.688981978 +0000 UTC m=+125.456470645" lastFinishedPulling="2026-04-21 02:43:15.23762933 +0000 UTC m=+128.005118011" observedRunningTime="2026-04-21 02:43:16.354522139 +0000 UTC m=+129.122010828" watchObservedRunningTime="2026-04-21 02:43:16.355869746 +0000 UTC m=+129.123358435" Apr 21 02:43:17.263386 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:17.263355 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:17.263386 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:17.263395 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:17.272197 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:17.272170 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:17.332414 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:17.332384 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:43:17.375673 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:17.375646 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-554b4c9b8b-xbtp7"] Apr 21 02:43:17.567096 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:17.567010 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs\") pod \"network-metrics-daemon-77bqp\" (UID: \"da0d17ad-bc94-4499-bb04-b7e0df549a24\") " pod="openshift-multus/network-metrics-daemon-77bqp" Apr 21 02:43:17.569788 ip-10-0-131-170 kubenswrapper[2564]: I0421 
02:43:17.569766 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0d17ad-bc94-4499-bb04-b7e0df549a24-metrics-certs\") pod \"network-metrics-daemon-77bqp\" (UID: \"da0d17ad-bc94-4499-bb04-b7e0df549a24\") " pod="openshift-multus/network-metrics-daemon-77bqp"
Apr 21 02:43:17.760380 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:17.760352 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-j2fpz\""
Apr 21 02:43:17.768882 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:17.768855 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-77bqp"
Apr 21 02:43:17.904247 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:17.904215 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-77bqp"]
Apr 21 02:43:17.922254 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:43:17.922224 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda0d17ad_bc94_4499_bb04_b7e0df549a24.slice/crio-c67ef188d1bbbe1d71b71600015c2b83e3a36014c66216c73971b10f59006420 WatchSource:0}: Error finding container c67ef188d1bbbe1d71b71600015c2b83e3a36014c66216c73971b10f59006420: Status 404 returned error can't find the container with id c67ef188d1bbbe1d71b71600015c2b83e3a36014c66216c73971b10f59006420
Apr 21 02:43:18.331985 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:18.331947 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-77bqp" event={"ID":"da0d17ad-bc94-4499-bb04-b7e0df549a24","Type":"ContainerStarted","Data":"c67ef188d1bbbe1d71b71600015c2b83e3a36014c66216c73971b10f59006420"}
Apr 21 02:43:20.343029 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:20.342988 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-77bqp" event={"ID":"da0d17ad-bc94-4499-bb04-b7e0df549a24","Type":"ContainerStarted","Data":"627e074bd922d6dd549bf47ff36c93f9f43ab8685d981511e5de44da6a08fdc4"}
Apr 21 02:43:20.343029 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:20.343032 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-77bqp" event={"ID":"da0d17ad-bc94-4499-bb04-b7e0df549a24","Type":"ContainerStarted","Data":"b2fdd33f31e6df6cab229837b72ede8eae398efa55d3a18ae5399629d9501c29"}
Apr 21 02:43:20.359298 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:20.359250 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-77bqp" podStartSLOduration=131.683096325 podStartE2EDuration="2m13.359233895s" podCreationTimestamp="2026-04-21 02:41:07 +0000 UTC" firstStartedPulling="2026-04-21 02:43:17.924542982 +0000 UTC m=+130.692031663" lastFinishedPulling="2026-04-21 02:43:19.600680564 +0000 UTC m=+132.368169233" observedRunningTime="2026-04-21 02:43:20.356839293 +0000 UTC m=+133.124327983" watchObservedRunningTime="2026-04-21 02:43:20.359233895 +0000 UTC m=+133.126722585"
Apr 21 02:43:26.604481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:26.604447 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q"
Apr 21 02:43:26.604481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:26.604485 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q"
Apr 21 02:43:35.391102 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:35.391068 2564 generic.go:358] "Generic (PLEG): container finished" podID="75d5f030-7365-4e7d-92ef-15593dbe87f9" containerID="71bd3b189c4d3556a48c9bb699fea76c02e8cf127ce9f7f7444a179782cc058e" exitCode=0
Apr 21 02:43:35.391527 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:35.391138 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8rwzw" event={"ID":"75d5f030-7365-4e7d-92ef-15593dbe87f9","Type":"ContainerDied","Data":"71bd3b189c4d3556a48c9bb699fea76c02e8cf127ce9f7f7444a179782cc058e"}
Apr 21 02:43:35.391527 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:35.391524 2564 scope.go:117] "RemoveContainer" containerID="71bd3b189c4d3556a48c9bb699fea76c02e8cf127ce9f7f7444a179782cc058e"
Apr 21 02:43:35.392586 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:35.392556 2564 generic.go:358] "Generic (PLEG): container finished" podID="385cbf98-cb51-4378-92b4-0aa2cdebc70f" containerID="76b6ce6ed12368734a82e9bc222a02acd9ba212213635c375986a5d30b2af92b" exitCode=0
Apr 21 02:43:35.392691 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:35.392629 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz" event={"ID":"385cbf98-cb51-4378-92b4-0aa2cdebc70f","Type":"ContainerDied","Data":"76b6ce6ed12368734a82e9bc222a02acd9ba212213635c375986a5d30b2af92b"}
Apr 21 02:43:35.392916 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:35.392901 2564 scope.go:117] "RemoveContainer" containerID="76b6ce6ed12368734a82e9bc222a02acd9ba212213635c375986a5d30b2af92b"
Apr 21 02:43:36.396837 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:36.396800 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8rwzw" event={"ID":"75d5f030-7365-4e7d-92ef-15593dbe87f9","Type":"ContainerStarted","Data":"355cf3fbf5703abd871a482d1cdb4baffd0bac161053cae62d9da2cdb62c6376"}
Apr 21 02:43:36.398509 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:36.398467 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-j7pzz" event={"ID":"385cbf98-cb51-4378-92b4-0aa2cdebc70f","Type":"ContainerStarted","Data":"3865ced2420490ff81e627108558bca8b56b179e46ce9cd11614fa853a3c1e14"}
Apr 21 02:43:42.398131 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.398064 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-554b4c9b8b-xbtp7" podUID="146fade0-ee27-4353-b36c-7e31717edebb" containerName="console" containerID="cri-o://0714f7f522a26493de5d8dde2284dca5b87eb9a3f7f6330b6e351c8ffe6ef69d" gracePeriod=15
Apr 21 02:43:42.664936 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.664910 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-554b4c9b8b-xbtp7_146fade0-ee27-4353-b36c-7e31717edebb/console/0.log"
Apr 21 02:43:42.665037 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.664967 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-554b4c9b8b-xbtp7"
Apr 21 02:43:42.754910 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.754886 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/146fade0-ee27-4353-b36c-7e31717edebb-oauth-serving-cert\") pod \"146fade0-ee27-4353-b36c-7e31717edebb\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") "
Apr 21 02:43:42.755014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.754947 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/146fade0-ee27-4353-b36c-7e31717edebb-service-ca\") pod \"146fade0-ee27-4353-b36c-7e31717edebb\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") "
Apr 21 02:43:42.755014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.755001 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/146fade0-ee27-4353-b36c-7e31717edebb-console-config\") pod \"146fade0-ee27-4353-b36c-7e31717edebb\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") "
Apr 21 02:43:42.755085 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.755028 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/146fade0-ee27-4353-b36c-7e31717edebb-console-oauth-config\") pod \"146fade0-ee27-4353-b36c-7e31717edebb\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") "
Apr 21 02:43:42.755085 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.755052 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jj74\" (UniqueName: \"kubernetes.io/projected/146fade0-ee27-4353-b36c-7e31717edebb-kube-api-access-6jj74\") pod \"146fade0-ee27-4353-b36c-7e31717edebb\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") "
Apr 21 02:43:42.755162 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.755089 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/146fade0-ee27-4353-b36c-7e31717edebb-console-serving-cert\") pod \"146fade0-ee27-4353-b36c-7e31717edebb\" (UID: \"146fade0-ee27-4353-b36c-7e31717edebb\") "
Apr 21 02:43:42.755292 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.755270 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146fade0-ee27-4353-b36c-7e31717edebb-service-ca" (OuterVolumeSpecName: "service-ca") pod "146fade0-ee27-4353-b36c-7e31717edebb" (UID: "146fade0-ee27-4353-b36c-7e31717edebb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 02:43:42.755362 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.755290 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146fade0-ee27-4353-b36c-7e31717edebb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "146fade0-ee27-4353-b36c-7e31717edebb" (UID: "146fade0-ee27-4353-b36c-7e31717edebb"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 02:43:42.755420 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.755388 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146fade0-ee27-4353-b36c-7e31717edebb-console-config" (OuterVolumeSpecName: "console-config") pod "146fade0-ee27-4353-b36c-7e31717edebb" (UID: "146fade0-ee27-4353-b36c-7e31717edebb"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 02:43:42.757221 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.757182 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146fade0-ee27-4353-b36c-7e31717edebb-kube-api-access-6jj74" (OuterVolumeSpecName: "kube-api-access-6jj74") pod "146fade0-ee27-4353-b36c-7e31717edebb" (UID: "146fade0-ee27-4353-b36c-7e31717edebb"). InnerVolumeSpecName "kube-api-access-6jj74". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 02:43:42.757313 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.757220 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146fade0-ee27-4353-b36c-7e31717edebb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "146fade0-ee27-4353-b36c-7e31717edebb" (UID: "146fade0-ee27-4353-b36c-7e31717edebb"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 02:43:42.757313 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.757269 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146fade0-ee27-4353-b36c-7e31717edebb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "146fade0-ee27-4353-b36c-7e31717edebb" (UID: "146fade0-ee27-4353-b36c-7e31717edebb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 02:43:42.855887 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.855866 2564 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/146fade0-ee27-4353-b36c-7e31717edebb-console-serving-cert\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:43:42.855887 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.855887 2564 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/146fade0-ee27-4353-b36c-7e31717edebb-oauth-serving-cert\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:43:42.856004 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.855898 2564 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/146fade0-ee27-4353-b36c-7e31717edebb-service-ca\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:43:42.856004 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.855907 2564 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/146fade0-ee27-4353-b36c-7e31717edebb-console-config\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:43:42.856004 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.855914 2564 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/146fade0-ee27-4353-b36c-7e31717edebb-console-oauth-config\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:43:42.856004 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:42.855923 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6jj74\" (UniqueName: \"kubernetes.io/projected/146fade0-ee27-4353-b36c-7e31717edebb-kube-api-access-6jj74\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:43:43.419395 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:43.419367 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-554b4c9b8b-xbtp7_146fade0-ee27-4353-b36c-7e31717edebb/console/0.log"
Apr 21 02:43:43.419818 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:43.419405 2564 generic.go:358] "Generic (PLEG): container finished" podID="146fade0-ee27-4353-b36c-7e31717edebb" containerID="0714f7f522a26493de5d8dde2284dca5b87eb9a3f7f6330b6e351c8ffe6ef69d" exitCode=2
Apr 21 02:43:43.419818 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:43.419469 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-554b4c9b8b-xbtp7"
Apr 21 02:43:43.419818 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:43.419473 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-554b4c9b8b-xbtp7" event={"ID":"146fade0-ee27-4353-b36c-7e31717edebb","Type":"ContainerDied","Data":"0714f7f522a26493de5d8dde2284dca5b87eb9a3f7f6330b6e351c8ffe6ef69d"}
Apr 21 02:43:43.419818 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:43.419516 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-554b4c9b8b-xbtp7" event={"ID":"146fade0-ee27-4353-b36c-7e31717edebb","Type":"ContainerDied","Data":"edb4d80d957a4c24c187b683cede32d5cbc30a24817f5565889547c3fefe1d55"}
Apr 21 02:43:43.419818 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:43.419533 2564 scope.go:117] "RemoveContainer" containerID="0714f7f522a26493de5d8dde2284dca5b87eb9a3f7f6330b6e351c8ffe6ef69d"
Apr 21 02:43:43.428122 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:43.428104 2564 scope.go:117] "RemoveContainer" containerID="0714f7f522a26493de5d8dde2284dca5b87eb9a3f7f6330b6e351c8ffe6ef69d"
Apr 21 02:43:43.428358 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:43:43.428338 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0714f7f522a26493de5d8dde2284dca5b87eb9a3f7f6330b6e351c8ffe6ef69d\": container with ID starting with 0714f7f522a26493de5d8dde2284dca5b87eb9a3f7f6330b6e351c8ffe6ef69d not found: ID does not exist" containerID="0714f7f522a26493de5d8dde2284dca5b87eb9a3f7f6330b6e351c8ffe6ef69d"
Apr 21 02:43:43.428422 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:43.428366 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0714f7f522a26493de5d8dde2284dca5b87eb9a3f7f6330b6e351c8ffe6ef69d"} err="failed to get container status \"0714f7f522a26493de5d8dde2284dca5b87eb9a3f7f6330b6e351c8ffe6ef69d\": rpc error: code = NotFound desc = could not find container \"0714f7f522a26493de5d8dde2284dca5b87eb9a3f7f6330b6e351c8ffe6ef69d\": container with ID starting with 0714f7f522a26493de5d8dde2284dca5b87eb9a3f7f6330b6e351c8ffe6ef69d not found: ID does not exist"
Apr 21 02:43:43.439142 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:43.439119 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-554b4c9b8b-xbtp7"]
Apr 21 02:43:43.442710 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:43.442677 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-554b4c9b8b-xbtp7"]
Apr 21 02:43:43.844667 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:43.844617 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="146fade0-ee27-4353-b36c-7e31717edebb" path="/var/lib/kubelet/pods/146fade0-ee27-4353-b36c-7e31717edebb/volumes"
Apr 21 02:43:46.608844 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:46.608817 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q"
Apr 21 02:43:46.613092 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:43:46.613063 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-64cf96cf4f-pmz7q"
Apr 21 02:44:26.139068 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.139032 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-586bd48bcb-glvsc"]
Apr 21 02:44:26.140050 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.140024 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="146fade0-ee27-4353-b36c-7e31717edebb" containerName="console"
Apr 21 02:44:26.140193 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.140181 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="146fade0-ee27-4353-b36c-7e31717edebb" containerName="console"
Apr 21 02:44:26.140363 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.140352 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="146fade0-ee27-4353-b36c-7e31717edebb" containerName="console"
Apr 21 02:44:26.143797 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.143775 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-586bd48bcb-glvsc"]
Apr 21 02:44:26.144011 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.143992 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.146487 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.146463 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 21 02:44:26.146821 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.146804 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-dkvw8\""
Apr 21 02:44:26.146935 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.146820 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 21 02:44:26.146935 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.146895 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 21 02:44:26.148346 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.148328 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 21 02:44:26.148459 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.148355 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 21 02:44:26.155806 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.155783 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 21 02:44:26.261988 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.261956 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/78ae28be-3993-4b8b-ab62-b6f09e278983-secret-telemeter-client\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.262153 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.262001 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78ae28be-3993-4b8b-ab62-b6f09e278983-telemeter-trusted-ca-bundle\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.262153 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.262029 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78ae28be-3993-4b8b-ab62-b6f09e278983-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.262153 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.262056 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/78ae28be-3993-4b8b-ab62-b6f09e278983-federate-client-tls\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.262153 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.262083 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78ae28be-3993-4b8b-ab62-b6f09e278983-serving-certs-ca-bundle\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.262153 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.262106 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/78ae28be-3993-4b8b-ab62-b6f09e278983-telemeter-client-tls\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.262318 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.262215 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78ae28be-3993-4b8b-ab62-b6f09e278983-metrics-client-ca\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.262318 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.262255 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8qw2\" (UniqueName: \"kubernetes.io/projected/78ae28be-3993-4b8b-ab62-b6f09e278983-kube-api-access-m8qw2\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.363113 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.363084 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78ae28be-3993-4b8b-ab62-b6f09e278983-metrics-client-ca\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.363265 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.363128 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8qw2\" (UniqueName: \"kubernetes.io/projected/78ae28be-3993-4b8b-ab62-b6f09e278983-kube-api-access-m8qw2\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.363265 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.363168 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/78ae28be-3993-4b8b-ab62-b6f09e278983-secret-telemeter-client\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.363265 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.363206 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78ae28be-3993-4b8b-ab62-b6f09e278983-telemeter-trusted-ca-bundle\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.363265 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.363235 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78ae28be-3993-4b8b-ab62-b6f09e278983-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.363265 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.363263 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/78ae28be-3993-4b8b-ab62-b6f09e278983-federate-client-tls\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.363573 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.363296 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78ae28be-3993-4b8b-ab62-b6f09e278983-serving-certs-ca-bundle\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.363573 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.363361 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/78ae28be-3993-4b8b-ab62-b6f09e278983-telemeter-client-tls\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.364035 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.364011 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78ae28be-3993-4b8b-ab62-b6f09e278983-serving-certs-ca-bundle\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.364205 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.364138 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78ae28be-3993-4b8b-ab62-b6f09e278983-metrics-client-ca\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.364403 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.364384 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78ae28be-3993-4b8b-ab62-b6f09e278983-telemeter-trusted-ca-bundle\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.365985 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.365956 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/78ae28be-3993-4b8b-ab62-b6f09e278983-federate-client-tls\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.365985 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.365970 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78ae28be-3993-4b8b-ab62-b6f09e278983-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.366150 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.366134 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/78ae28be-3993-4b8b-ab62-b6f09e278983-telemeter-client-tls\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.366402 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.366382 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/78ae28be-3993-4b8b-ab62-b6f09e278983-secret-telemeter-client\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.371053 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.371030 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8qw2\" (UniqueName: \"kubernetes.io/projected/78ae28be-3993-4b8b-ab62-b6f09e278983-kube-api-access-m8qw2\") pod \"telemeter-client-586bd48bcb-glvsc\" (UID: \"78ae28be-3993-4b8b-ab62-b6f09e278983\") " pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.458015 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.457933 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc"
Apr 21 02:44:26.589993 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:26.589961 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-586bd48bcb-glvsc"]
Apr 21 02:44:26.594221 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:44:26.594196 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78ae28be_3993_4b8b_ab62_b6f09e278983.slice/crio-495af35dd36dab83fbe593eeb0ba1c7b2a08c77e62a59fb11a7cf45506d650eb WatchSource:0}: Error finding container 495af35dd36dab83fbe593eeb0ba1c7b2a08c77e62a59fb11a7cf45506d650eb: Status 404 returned error can't find the container with id 495af35dd36dab83fbe593eeb0ba1c7b2a08c77e62a59fb11a7cf45506d650eb
Apr 21 02:44:27.560094 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:27.560056 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc" event={"ID":"78ae28be-3993-4b8b-ab62-b6f09e278983","Type":"ContainerStarted","Data":"495af35dd36dab83fbe593eeb0ba1c7b2a08c77e62a59fb11a7cf45506d650eb"}
Apr 21 02:44:28.564293 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:28.564256 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc" event={"ID":"78ae28be-3993-4b8b-ab62-b6f09e278983","Type":"ContainerStarted","Data":"d455910849626eda27e1c0f47fc81600534176667ea3a0fd2b82b5a0390b35fa"}
Apr 21 02:44:29.569164 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:29.569128 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc" event={"ID":"78ae28be-3993-4b8b-ab62-b6f09e278983","Type":"ContainerStarted","Data":"b8967af449e849fd400249ba56e1b0e62c319a11ac9b4f9315706566fe236bc5"}
Apr 21 02:44:29.569537 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:29.569167 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc" event={"ID":"78ae28be-3993-4b8b-ab62-b6f09e278983","Type":"ContainerStarted","Data":"2aee48587fed28df5ed2985a44cb6cd3cc4386a857dd171e227bd1c35bc9f3e2"}
Apr 21 02:44:29.590363 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:29.590318 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-586bd48bcb-glvsc" podStartSLOduration=1.032956948 podStartE2EDuration="3.590306468s" podCreationTimestamp="2026-04-21 02:44:26 +0000 UTC" firstStartedPulling="2026-04-21 02:44:26.595874886 +0000 UTC m=+199.363363553" lastFinishedPulling="2026-04-21 02:44:29.153224386 +0000 UTC m=+201.920713073" observedRunningTime="2026-04-21 02:44:29.589177283 +0000 UTC m=+202.356665987" watchObservedRunningTime="2026-04-21 02:44:29.590306468 +0000 UTC m=+202.357795156"
Apr 21 02:44:30.415430 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.415391 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56d76df976-zd82c"]
Apr 21 02:44:30.419384 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.419364 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56d76df976-zd82c"
Apr 21 02:44:30.429315 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.429290 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56d76df976-zd82c"]
Apr 21 02:44:30.500430 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.500405 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-console-config\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c"
Apr 21 02:44:30.500562 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.500433 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-oauth-serving-cert\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c"
Apr 21 02:44:30.500562 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.500456 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a0e0820-a088-4035-a18d-587bd27316fe-console-serving-cert\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c"
Apr 21 02:44:30.500649 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.500557 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pw74\" (UniqueName: \"kubernetes.io/projected/8a0e0820-a088-4035-a18d-587bd27316fe-kube-api-access-2pw74\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c"
Apr 21 02:44:30.500649 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.500591 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-trusted-ca-bundle\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c"
Apr 21 02:44:30.500649 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.500631 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a0e0820-a088-4035-a18d-587bd27316fe-console-oauth-config\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c"
Apr 21 02:44:30.500759 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.500649 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-service-ca\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c"
Apr 21 02:44:30.601408 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.601384 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-trusted-ca-bundle\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c"
Apr 21 02:44:30.601789 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.601417 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a0e0820-a088-4035-a18d-587bd27316fe-console-oauth-config\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c"
Apr 21 02:44:30.601789 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.601436 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-service-ca\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c"
Apr 21 02:44:30.601789 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.601466 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-console-config\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c"
Apr 21 02:44:30.601789 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.601481 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-oauth-serving-cert\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c"
Apr 21 02:44:30.601789 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.601521 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a0e0820-a088-4035-a18d-587bd27316fe-console-serving-cert\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c"
Apr 21
02:44:30.601789 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.601571 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pw74\" (UniqueName: \"kubernetes.io/projected/8a0e0820-a088-4035-a18d-587bd27316fe-kube-api-access-2pw74\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c" Apr 21 02:44:30.602307 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.602275 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-console-config\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c" Apr 21 02:44:30.602392 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.602319 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-oauth-serving-cert\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c" Apr 21 02:44:30.602392 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.602342 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-service-ca\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c" Apr 21 02:44:30.602561 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.602538 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-trusted-ca-bundle\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " 
pod="openshift-console/console-56d76df976-zd82c" Apr 21 02:44:30.603986 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.603960 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a0e0820-a088-4035-a18d-587bd27316fe-console-oauth-config\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c" Apr 21 02:44:30.604083 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.604063 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a0e0820-a088-4035-a18d-587bd27316fe-console-serving-cert\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c" Apr 21 02:44:30.609044 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.609026 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pw74\" (UniqueName: \"kubernetes.io/projected/8a0e0820-a088-4035-a18d-587bd27316fe-kube-api-access-2pw74\") pod \"console-56d76df976-zd82c\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " pod="openshift-console/console-56d76df976-zd82c" Apr 21 02:44:30.728925 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.728867 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56d76df976-zd82c" Apr 21 02:44:30.845540 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:30.845514 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56d76df976-zd82c"] Apr 21 02:44:30.847840 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:44:30.847808 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a0e0820_a088_4035_a18d_587bd27316fe.slice/crio-1ff56f757b669fb1009d12344c9d01a2c68711ae53c93eeb35eed87cea32616f WatchSource:0}: Error finding container 1ff56f757b669fb1009d12344c9d01a2c68711ae53c93eeb35eed87cea32616f: Status 404 returned error can't find the container with id 1ff56f757b669fb1009d12344c9d01a2c68711ae53c93eeb35eed87cea32616f Apr 21 02:44:31.576924 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:31.576886 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d76df976-zd82c" event={"ID":"8a0e0820-a088-4035-a18d-587bd27316fe","Type":"ContainerStarted","Data":"73ba9163505b682cfe5f0637e83a948aca113f0381cbb66eaea7d32477ada4c2"} Apr 21 02:44:31.576924 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:31.576926 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d76df976-zd82c" event={"ID":"8a0e0820-a088-4035-a18d-587bd27316fe","Type":"ContainerStarted","Data":"1ff56f757b669fb1009d12344c9d01a2c68711ae53c93eeb35eed87cea32616f"} Apr 21 02:44:31.594054 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:31.594003 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56d76df976-zd82c" podStartSLOduration=1.593989375 podStartE2EDuration="1.593989375s" podCreationTimestamp="2026-04-21 02:44:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:44:31.593175916 +0000 UTC 
m=+204.360664606" watchObservedRunningTime="2026-04-21 02:44:31.593989375 +0000 UTC m=+204.361478064" Apr 21 02:44:40.729973 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:40.729931 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56d76df976-zd82c" Apr 21 02:44:40.729973 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:40.729983 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56d76df976-zd82c" Apr 21 02:44:40.735261 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:40.735227 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56d76df976-zd82c" Apr 21 02:44:41.609027 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:41.609000 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56d76df976-zd82c" Apr 21 02:44:41.669415 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:44:41.669384 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6ccf7b6bbc-jv6tn"] Apr 21 02:45:06.688876 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:06.688818 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6ccf7b6bbc-jv6tn" podUID="c7c92a29-29f5-4a7f-a419-4c10e1c738f7" containerName="console" containerID="cri-o://cb416550135e4593593dfb26af79ad7294d146bcd6a94ef3ebc734fbcc14b728" gracePeriod=15 Apr 21 02:45:06.925777 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:06.925755 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6ccf7b6bbc-jv6tn_c7c92a29-29f5-4a7f-a419-4c10e1c738f7/console/0.log" Apr 21 02:45:06.925886 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:06.925814 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:45:06.951926 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:06.951851 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-console-config\") pod \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " Apr 21 02:45:06.951926 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:06.951902 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-trusted-ca-bundle\") pod \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " Apr 21 02:45:06.952126 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:06.951944 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjtl5\" (UniqueName: \"kubernetes.io/projected/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-kube-api-access-gjtl5\") pod \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " Apr 21 02:45:06.952126 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:06.951987 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-console-oauth-config\") pod \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " Apr 21 02:45:06.952126 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:06.952035 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-console-serving-cert\") pod \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " Apr 21 02:45:06.952126 
ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:06.952068 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-service-ca\") pod \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " Apr 21 02:45:06.952126 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:06.952100 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-oauth-serving-cert\") pod \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\" (UID: \"c7c92a29-29f5-4a7f-a419-4c10e1c738f7\") " Apr 21 02:45:06.952377 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:06.952259 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-console-config" (OuterVolumeSpecName: "console-config") pod "c7c92a29-29f5-4a7f-a419-4c10e1c738f7" (UID: "c7c92a29-29f5-4a7f-a419-4c10e1c738f7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:45:06.952377 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:06.952322 2564 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-console-config\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:45:06.952660 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:06.952627 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c7c92a29-29f5-4a7f-a419-4c10e1c738f7" (UID: "c7c92a29-29f5-4a7f-a419-4c10e1c738f7"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:45:06.952754 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:06.952704 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c7c92a29-29f5-4a7f-a419-4c10e1c738f7" (UID: "c7c92a29-29f5-4a7f-a419-4c10e1c738f7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:45:06.952754 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:06.952719 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-service-ca" (OuterVolumeSpecName: "service-ca") pod "c7c92a29-29f5-4a7f-a419-4c10e1c738f7" (UID: "c7c92a29-29f5-4a7f-a419-4c10e1c738f7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:45:06.954400 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:06.954336 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c7c92a29-29f5-4a7f-a419-4c10e1c738f7" (UID: "c7c92a29-29f5-4a7f-a419-4c10e1c738f7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:45:06.954611 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:06.954483 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c7c92a29-29f5-4a7f-a419-4c10e1c738f7" (UID: "c7c92a29-29f5-4a7f-a419-4c10e1c738f7"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:45:06.954677 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:06.954591 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-kube-api-access-gjtl5" (OuterVolumeSpecName: "kube-api-access-gjtl5") pod "c7c92a29-29f5-4a7f-a419-4c10e1c738f7" (UID: "c7c92a29-29f5-4a7f-a419-4c10e1c738f7"). InnerVolumeSpecName "kube-api-access-gjtl5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:45:07.053393 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:07.053369 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-trusted-ca-bundle\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:45:07.053475 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:07.053396 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gjtl5\" (UniqueName: \"kubernetes.io/projected/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-kube-api-access-gjtl5\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:45:07.053475 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:07.053408 2564 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-console-oauth-config\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:45:07.053475 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:07.053417 2564 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-console-serving-cert\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:45:07.053475 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:07.053426 2564 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-service-ca\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:45:07.053475 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:07.053434 2564 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7c92a29-29f5-4a7f-a419-4c10e1c738f7-oauth-serving-cert\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:45:07.683984 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:07.683963 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6ccf7b6bbc-jv6tn_c7c92a29-29f5-4a7f-a419-4c10e1c738f7/console/0.log" Apr 21 02:45:07.684126 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:07.683999 2564 generic.go:358] "Generic (PLEG): container finished" podID="c7c92a29-29f5-4a7f-a419-4c10e1c738f7" containerID="cb416550135e4593593dfb26af79ad7294d146bcd6a94ef3ebc734fbcc14b728" exitCode=2 Apr 21 02:45:07.684126 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:07.684049 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ccf7b6bbc-jv6tn" event={"ID":"c7c92a29-29f5-4a7f-a419-4c10e1c738f7","Type":"ContainerDied","Data":"cb416550135e4593593dfb26af79ad7294d146bcd6a94ef3ebc734fbcc14b728"} Apr 21 02:45:07.684126 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:07.684061 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6ccf7b6bbc-jv6tn" Apr 21 02:45:07.684126 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:07.684079 2564 scope.go:117] "RemoveContainer" containerID="cb416550135e4593593dfb26af79ad7294d146bcd6a94ef3ebc734fbcc14b728" Apr 21 02:45:07.684274 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:07.684069 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ccf7b6bbc-jv6tn" event={"ID":"c7c92a29-29f5-4a7f-a419-4c10e1c738f7","Type":"ContainerDied","Data":"bd1fd28d45c786a3987bf5d516a54f2dca3e374c33a1a464e7ca1cac1c884909"} Apr 21 02:45:07.693070 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:07.692873 2564 scope.go:117] "RemoveContainer" containerID="cb416550135e4593593dfb26af79ad7294d146bcd6a94ef3ebc734fbcc14b728" Apr 21 02:45:07.693271 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:45:07.693113 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb416550135e4593593dfb26af79ad7294d146bcd6a94ef3ebc734fbcc14b728\": container with ID starting with cb416550135e4593593dfb26af79ad7294d146bcd6a94ef3ebc734fbcc14b728 not found: ID does not exist" containerID="cb416550135e4593593dfb26af79ad7294d146bcd6a94ef3ebc734fbcc14b728" Apr 21 02:45:07.693271 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:07.693135 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb416550135e4593593dfb26af79ad7294d146bcd6a94ef3ebc734fbcc14b728"} err="failed to get container status \"cb416550135e4593593dfb26af79ad7294d146bcd6a94ef3ebc734fbcc14b728\": rpc error: code = NotFound desc = could not find container \"cb416550135e4593593dfb26af79ad7294d146bcd6a94ef3ebc734fbcc14b728\": container with ID starting with cb416550135e4593593dfb26af79ad7294d146bcd6a94ef3ebc734fbcc14b728 not found: ID does not exist" Apr 21 02:45:07.707361 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:07.707342 2564 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6ccf7b6bbc-jv6tn"] Apr 21 02:45:07.715274 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:07.715254 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6ccf7b6bbc-jv6tn"] Apr 21 02:45:07.844332 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:07.844308 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7c92a29-29f5-4a7f-a419-4c10e1c738f7" path="/var/lib/kubelet/pods/c7c92a29-29f5-4a7f-a419-4c10e1c738f7/volumes" Apr 21 02:45:39.441583 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.441489 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68954b648f-pdbw4"] Apr 21 02:45:39.442042 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.441790 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7c92a29-29f5-4a7f-a419-4c10e1c738f7" containerName="console" Apr 21 02:45:39.442042 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.441801 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c92a29-29f5-4a7f-a419-4c10e1c738f7" containerName="console" Apr 21 02:45:39.442042 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.441861 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7c92a29-29f5-4a7f-a419-4c10e1c738f7" containerName="console" Apr 21 02:45:39.444729 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.444710 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.468211 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.468188 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68954b648f-pdbw4"] Apr 21 02:45:39.475938 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.475914 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-trusted-ca-bundle\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.476065 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.476005 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/464ddfa0-b507-4644-b148-9f35f5f15c98-console-serving-cert\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.476065 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.476038 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-oauth-serving-cert\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.476185 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.476073 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-console-config\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 
02:45:39.476185 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.476108 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/464ddfa0-b507-4644-b148-9f35f5f15c98-console-oauth-config\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.476185 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.476123 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhdm9\" (UniqueName: \"kubernetes.io/projected/464ddfa0-b507-4644-b148-9f35f5f15c98-kube-api-access-hhdm9\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.476185 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.476144 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-service-ca\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.576787 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.576763 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/464ddfa0-b507-4644-b148-9f35f5f15c98-console-serving-cert\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.576892 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.576797 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-oauth-serving-cert\") pod 
\"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.576892 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.576827 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-console-config\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.576962 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.576932 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/464ddfa0-b507-4644-b148-9f35f5f15c98-console-oauth-config\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.576996 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.576972 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hhdm9\" (UniqueName: \"kubernetes.io/projected/464ddfa0-b507-4644-b148-9f35f5f15c98-kube-api-access-hhdm9\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.577030 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.577001 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-service-ca\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.577077 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.577031 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-trusted-ca-bundle\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.577600 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.577578 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-oauth-serving-cert\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.577600 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.577589 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-console-config\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.577810 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.577785 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-trusted-ca-bundle\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.577932 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.577870 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-service-ca\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.579273 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.579253 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/464ddfa0-b507-4644-b148-9f35f5f15c98-console-oauth-config\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.579430 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.579412 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/464ddfa0-b507-4644-b148-9f35f5f15c98-console-serving-cert\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.587741 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.587723 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhdm9\" (UniqueName: \"kubernetes.io/projected/464ddfa0-b507-4644-b148-9f35f5f15c98-kube-api-access-hhdm9\") pod \"console-68954b648f-pdbw4\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") " pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.753560 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.753542 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:39.872034 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:39.872013 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68954b648f-pdbw4"] Apr 21 02:45:39.874464 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:45:39.874437 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod464ddfa0_b507_4644_b148_9f35f5f15c98.slice/crio-563840ff3514d232a3a2fdbd250c1e53ae1d70d596255e9f18f9a1f36ff045e1 WatchSource:0}: Error finding container 563840ff3514d232a3a2fdbd250c1e53ae1d70d596255e9f18f9a1f36ff045e1: Status 404 returned error can't find the container with id 563840ff3514d232a3a2fdbd250c1e53ae1d70d596255e9f18f9a1f36ff045e1 Apr 21 02:45:40.784169 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:40.784137 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68954b648f-pdbw4" event={"ID":"464ddfa0-b507-4644-b148-9f35f5f15c98","Type":"ContainerStarted","Data":"8838e11312882952e4361906dc50d4b41c17ca01c39e74a2d5947c188c5d4e2c"} Apr 21 02:45:40.784169 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:40.784174 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68954b648f-pdbw4" event={"ID":"464ddfa0-b507-4644-b148-9f35f5f15c98","Type":"ContainerStarted","Data":"563840ff3514d232a3a2fdbd250c1e53ae1d70d596255e9f18f9a1f36ff045e1"} Apr 21 02:45:40.810798 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:40.810757 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68954b648f-pdbw4" podStartSLOduration=1.810744128 podStartE2EDuration="1.810744128s" podCreationTimestamp="2026-04-21 02:45:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:45:40.808983113 +0000 UTC 
m=+273.576471803" watchObservedRunningTime="2026-04-21 02:45:40.810744128 +0000 UTC m=+273.578232817" Apr 21 02:45:49.754193 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:49.754140 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:49.754193 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:49.754201 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:49.759509 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:49.759465 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:49.817402 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:49.817375 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68954b648f-pdbw4" Apr 21 02:45:49.872451 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:45:49.872424 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56d76df976-zd82c"] Apr 21 02:46:03.280076 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:03.280041 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-qf6p4"] Apr 21 02:46:03.283182 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:03.283166 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qf6p4" Apr 21 02:46:03.286071 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:03.286052 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 02:46:03.294388 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:03.294366 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qf6p4"] Apr 21 02:46:03.342995 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:03.342972 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6826d476-dc74-4776-a925-ed13337326a0-original-pull-secret\") pod \"global-pull-secret-syncer-qf6p4\" (UID: \"6826d476-dc74-4776-a925-ed13337326a0\") " pod="kube-system/global-pull-secret-syncer-qf6p4" Apr 21 02:46:03.343108 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:03.343008 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6826d476-dc74-4776-a925-ed13337326a0-dbus\") pod \"global-pull-secret-syncer-qf6p4\" (UID: \"6826d476-dc74-4776-a925-ed13337326a0\") " pod="kube-system/global-pull-secret-syncer-qf6p4" Apr 21 02:46:03.343108 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:03.343032 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6826d476-dc74-4776-a925-ed13337326a0-kubelet-config\") pod \"global-pull-secret-syncer-qf6p4\" (UID: \"6826d476-dc74-4776-a925-ed13337326a0\") " pod="kube-system/global-pull-secret-syncer-qf6p4" Apr 21 02:46:03.444298 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:03.444266 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/6826d476-dc74-4776-a925-ed13337326a0-original-pull-secret\") pod \"global-pull-secret-syncer-qf6p4\" (UID: \"6826d476-dc74-4776-a925-ed13337326a0\") " pod="kube-system/global-pull-secret-syncer-qf6p4" Apr 21 02:46:03.444411 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:03.444313 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6826d476-dc74-4776-a925-ed13337326a0-dbus\") pod \"global-pull-secret-syncer-qf6p4\" (UID: \"6826d476-dc74-4776-a925-ed13337326a0\") " pod="kube-system/global-pull-secret-syncer-qf6p4" Apr 21 02:46:03.444411 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:03.444346 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6826d476-dc74-4776-a925-ed13337326a0-kubelet-config\") pod \"global-pull-secret-syncer-qf6p4\" (UID: \"6826d476-dc74-4776-a925-ed13337326a0\") " pod="kube-system/global-pull-secret-syncer-qf6p4" Apr 21 02:46:03.444529 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:03.444430 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6826d476-dc74-4776-a925-ed13337326a0-kubelet-config\") pod \"global-pull-secret-syncer-qf6p4\" (UID: \"6826d476-dc74-4776-a925-ed13337326a0\") " pod="kube-system/global-pull-secret-syncer-qf6p4" Apr 21 02:46:03.444579 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:03.444526 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6826d476-dc74-4776-a925-ed13337326a0-dbus\") pod \"global-pull-secret-syncer-qf6p4\" (UID: \"6826d476-dc74-4776-a925-ed13337326a0\") " pod="kube-system/global-pull-secret-syncer-qf6p4" Apr 21 02:46:03.446524 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:03.446493 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6826d476-dc74-4776-a925-ed13337326a0-original-pull-secret\") pod \"global-pull-secret-syncer-qf6p4\" (UID: \"6826d476-dc74-4776-a925-ed13337326a0\") " pod="kube-system/global-pull-secret-syncer-qf6p4" Apr 21 02:46:03.592892 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:03.592829 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qf6p4" Apr 21 02:46:03.713956 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:03.713924 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qf6p4"] Apr 21 02:46:03.716511 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:46:03.716473 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6826d476_dc74_4776_a925_ed13337326a0.slice/crio-46c0f9a276cc3ac7f2b93511e25dbf41c66aecb392586f6b9c55c586ce27c1af WatchSource:0}: Error finding container 46c0f9a276cc3ac7f2b93511e25dbf41c66aecb392586f6b9c55c586ce27c1af: Status 404 returned error can't find the container with id 46c0f9a276cc3ac7f2b93511e25dbf41c66aecb392586f6b9c55c586ce27c1af Apr 21 02:46:03.855263 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:03.855197 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qf6p4" event={"ID":"6826d476-dc74-4776-a925-ed13337326a0","Type":"ContainerStarted","Data":"46c0f9a276cc3ac7f2b93511e25dbf41c66aecb392586f6b9c55c586ce27c1af"} Apr 21 02:46:07.853377 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:07.853251 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/ovn-acl-logging/0.log" Apr 21 02:46:07.863219 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:07.853450 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/ovn-acl-logging/0.log" Apr 21 02:46:08.875398 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:08.875355 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qf6p4" event={"ID":"6826d476-dc74-4776-a925-ed13337326a0","Type":"ContainerStarted","Data":"6dd06095466e51e3ebb83f2e96b56cd197343de62e55d6db73fc50a0443bc0c5"} Apr 21 02:46:08.891682 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:08.891630 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-qf6p4" podStartSLOduration=1.714370307 podStartE2EDuration="5.891616057s" podCreationTimestamp="2026-04-21 02:46:03 +0000 UTC" firstStartedPulling="2026-04-21 02:46:03.718091081 +0000 UTC m=+296.485579748" lastFinishedPulling="2026-04-21 02:46:07.895336822 +0000 UTC m=+300.662825498" observedRunningTime="2026-04-21 02:46:08.890442199 +0000 UTC m=+301.657930885" watchObservedRunningTime="2026-04-21 02:46:08.891616057 +0000 UTC m=+301.659104745" Apr 21 02:46:15.834914 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:15.834856 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-56d76df976-zd82c" podUID="8a0e0820-a088-4035-a18d-587bd27316fe" containerName="console" containerID="cri-o://73ba9163505b682cfe5f0637e83a948aca113f0381cbb66eaea7d32477ada4c2" gracePeriod=15 Apr 21 02:46:16.068941 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.068920 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56d76df976-zd82c_8a0e0820-a088-4035-a18d-587bd27316fe/console/0.log" Apr 21 02:46:16.069043 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.068980 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56d76df976-zd82c" Apr 21 02:46:16.237720 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.237654 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-oauth-serving-cert\") pod \"8a0e0820-a088-4035-a18d-587bd27316fe\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " Apr 21 02:46:16.237720 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.237696 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pw74\" (UniqueName: \"kubernetes.io/projected/8a0e0820-a088-4035-a18d-587bd27316fe-kube-api-access-2pw74\") pod \"8a0e0820-a088-4035-a18d-587bd27316fe\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " Apr 21 02:46:16.237720 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.237713 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-trusted-ca-bundle\") pod \"8a0e0820-a088-4035-a18d-587bd27316fe\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " Apr 21 02:46:16.238011 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.237747 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a0e0820-a088-4035-a18d-587bd27316fe-console-serving-cert\") pod \"8a0e0820-a088-4035-a18d-587bd27316fe\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " Apr 21 02:46:16.238011 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.237763 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-service-ca\") pod \"8a0e0820-a088-4035-a18d-587bd27316fe\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " Apr 21 02:46:16.238011 
ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.237796 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-console-config\") pod \"8a0e0820-a088-4035-a18d-587bd27316fe\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " Apr 21 02:46:16.238011 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.237847 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a0e0820-a088-4035-a18d-587bd27316fe-console-oauth-config\") pod \"8a0e0820-a088-4035-a18d-587bd27316fe\" (UID: \"8a0e0820-a088-4035-a18d-587bd27316fe\") " Apr 21 02:46:16.238267 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.238234 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-console-config" (OuterVolumeSpecName: "console-config") pod "8a0e0820-a088-4035-a18d-587bd27316fe" (UID: "8a0e0820-a088-4035-a18d-587bd27316fe"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:46:16.238334 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.238243 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-service-ca" (OuterVolumeSpecName: "service-ca") pod "8a0e0820-a088-4035-a18d-587bd27316fe" (UID: "8a0e0820-a088-4035-a18d-587bd27316fe"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:46:16.238334 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.238293 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8a0e0820-a088-4035-a18d-587bd27316fe" (UID: "8a0e0820-a088-4035-a18d-587bd27316fe"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:46:16.238429 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.238409 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8a0e0820-a088-4035-a18d-587bd27316fe" (UID: "8a0e0820-a088-4035-a18d-587bd27316fe"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:46:16.240038 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.240005 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0e0820-a088-4035-a18d-587bd27316fe-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8a0e0820-a088-4035-a18d-587bd27316fe" (UID: "8a0e0820-a088-4035-a18d-587bd27316fe"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:46:16.240202 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.240181 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a0e0820-a088-4035-a18d-587bd27316fe-kube-api-access-2pw74" (OuterVolumeSpecName: "kube-api-access-2pw74") pod "8a0e0820-a088-4035-a18d-587bd27316fe" (UID: "8a0e0820-a088-4035-a18d-587bd27316fe"). InnerVolumeSpecName "kube-api-access-2pw74". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:46:16.240543 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.240518 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0e0820-a088-4035-a18d-587bd27316fe-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8a0e0820-a088-4035-a18d-587bd27316fe" (UID: "8a0e0820-a088-4035-a18d-587bd27316fe"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:46:16.338353 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.338327 2564 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a0e0820-a088-4035-a18d-587bd27316fe-console-oauth-config\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:46:16.338353 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.338351 2564 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-oauth-serving-cert\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:46:16.338519 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.338365 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2pw74\" (UniqueName: \"kubernetes.io/projected/8a0e0820-a088-4035-a18d-587bd27316fe-kube-api-access-2pw74\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:46:16.338519 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.338380 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-trusted-ca-bundle\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:46:16.338519 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.338393 2564 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8a0e0820-a088-4035-a18d-587bd27316fe-console-serving-cert\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:46:16.338519 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.338407 2564 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-service-ca\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:46:16.338519 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.338422 2564 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8a0e0820-a088-4035-a18d-587bd27316fe-console-config\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:46:16.726601 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.726576 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw"] Apr 21 02:46:16.726886 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.726875 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a0e0820-a088-4035-a18d-587bd27316fe" containerName="console" Apr 21 02:46:16.726947 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.726888 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0e0820-a088-4035-a18d-587bd27316fe" containerName="console" Apr 21 02:46:16.726984 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.726965 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a0e0820-a088-4035-a18d-587bd27316fe" containerName="console" Apr 21 02:46:16.731171 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.731154 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw" Apr 21 02:46:16.733592 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.733572 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 02:46:16.734665 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.734638 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 02:46:16.734778 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.734664 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ssj29\"" Apr 21 02:46:16.737139 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.737097 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw"] Apr 21 02:46:16.841581 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.841549 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cca8af8-1856-49d9-92df-58a419a99696-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw\" (UID: \"6cca8af8-1856-49d9-92df-58a419a99696\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw" Apr 21 02:46:16.841581 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.841590 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb2bw\" (UniqueName: \"kubernetes.io/projected/6cca8af8-1856-49d9-92df-58a419a99696-kube-api-access-tb2bw\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw\" (UID: \"6cca8af8-1856-49d9-92df-58a419a99696\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw" Apr 21 02:46:16.841905 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.841614 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cca8af8-1856-49d9-92df-58a419a99696-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw\" (UID: \"6cca8af8-1856-49d9-92df-58a419a99696\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw" Apr 21 02:46:16.899200 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.899182 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56d76df976-zd82c_8a0e0820-a088-4035-a18d-587bd27316fe/console/0.log" Apr 21 02:46:16.899309 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.899218 2564 generic.go:358] "Generic (PLEG): container finished" podID="8a0e0820-a088-4035-a18d-587bd27316fe" containerID="73ba9163505b682cfe5f0637e83a948aca113f0381cbb66eaea7d32477ada4c2" exitCode=2 Apr 21 02:46:16.899309 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.899250 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d76df976-zd82c" event={"ID":"8a0e0820-a088-4035-a18d-587bd27316fe","Type":"ContainerDied","Data":"73ba9163505b682cfe5f0637e83a948aca113f0381cbb66eaea7d32477ada4c2"} Apr 21 02:46:16.899309 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.899273 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56d76df976-zd82c" Apr 21 02:46:16.899309 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.899291 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d76df976-zd82c" event={"ID":"8a0e0820-a088-4035-a18d-587bd27316fe","Type":"ContainerDied","Data":"1ff56f757b669fb1009d12344c9d01a2c68711ae53c93eeb35eed87cea32616f"} Apr 21 02:46:16.899528 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.899311 2564 scope.go:117] "RemoveContainer" containerID="73ba9163505b682cfe5f0637e83a948aca113f0381cbb66eaea7d32477ada4c2" Apr 21 02:46:16.907667 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.907651 2564 scope.go:117] "RemoveContainer" containerID="73ba9163505b682cfe5f0637e83a948aca113f0381cbb66eaea7d32477ada4c2" Apr 21 02:46:16.907894 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:46:16.907878 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73ba9163505b682cfe5f0637e83a948aca113f0381cbb66eaea7d32477ada4c2\": container with ID starting with 73ba9163505b682cfe5f0637e83a948aca113f0381cbb66eaea7d32477ada4c2 not found: ID does not exist" containerID="73ba9163505b682cfe5f0637e83a948aca113f0381cbb66eaea7d32477ada4c2" Apr 21 02:46:16.907965 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.907899 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73ba9163505b682cfe5f0637e83a948aca113f0381cbb66eaea7d32477ada4c2"} err="failed to get container status \"73ba9163505b682cfe5f0637e83a948aca113f0381cbb66eaea7d32477ada4c2\": rpc error: code = NotFound desc = could not find container \"73ba9163505b682cfe5f0637e83a948aca113f0381cbb66eaea7d32477ada4c2\": container with ID starting with 73ba9163505b682cfe5f0637e83a948aca113f0381cbb66eaea7d32477ada4c2 not found: ID does not exist" Apr 21 02:46:16.920691 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.920666 2564 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56d76df976-zd82c"] Apr 21 02:46:16.925056 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.924785 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-56d76df976-zd82c"] Apr 21 02:46:16.942899 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.942876 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cca8af8-1856-49d9-92df-58a419a99696-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw\" (UID: \"6cca8af8-1856-49d9-92df-58a419a99696\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw" Apr 21 02:46:16.942960 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.942902 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tb2bw\" (UniqueName: \"kubernetes.io/projected/6cca8af8-1856-49d9-92df-58a419a99696-kube-api-access-tb2bw\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw\" (UID: \"6cca8af8-1856-49d9-92df-58a419a99696\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw" Apr 21 02:46:16.942960 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.942939 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cca8af8-1856-49d9-92df-58a419a99696-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw\" (UID: \"6cca8af8-1856-49d9-92df-58a419a99696\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw" Apr 21 02:46:16.943341 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.943315 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cca8af8-1856-49d9-92df-58a419a99696-util\") pod 
\"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw\" (UID: \"6cca8af8-1856-49d9-92df-58a419a99696\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw" Apr 21 02:46:16.943405 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.943362 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cca8af8-1856-49d9-92df-58a419a99696-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw\" (UID: \"6cca8af8-1856-49d9-92df-58a419a99696\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw" Apr 21 02:46:16.950578 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:16.950555 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb2bw\" (UniqueName: \"kubernetes.io/projected/6cca8af8-1856-49d9-92df-58a419a99696-kube-api-access-tb2bw\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw\" (UID: \"6cca8af8-1856-49d9-92df-58a419a99696\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw" Apr 21 02:46:17.041457 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:17.041434 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw" Apr 21 02:46:17.158229 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:17.158206 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw"] Apr 21 02:46:17.160814 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:46:17.160787 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cca8af8_1856_49d9_92df_58a419a99696.slice/crio-7489d02bd71fb2b252873b8e016fe4e2c5061fd86a6893af053669a3b4d5502f WatchSource:0}: Error finding container 7489d02bd71fb2b252873b8e016fe4e2c5061fd86a6893af053669a3b4d5502f: Status 404 returned error can't find the container with id 7489d02bd71fb2b252873b8e016fe4e2c5061fd86a6893af053669a3b4d5502f Apr 21 02:46:17.162655 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:17.162641 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 02:46:17.845843 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:17.845776 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a0e0820-a088-4035-a18d-587bd27316fe" path="/var/lib/kubelet/pods/8a0e0820-a088-4035-a18d-587bd27316fe/volumes" Apr 21 02:46:17.907312 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:17.907272 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw" event={"ID":"6cca8af8-1856-49d9-92df-58a419a99696","Type":"ContainerStarted","Data":"7489d02bd71fb2b252873b8e016fe4e2c5061fd86a6893af053669a3b4d5502f"} Apr 21 02:46:22.924625 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:22.924587 2564 generic.go:358] "Generic (PLEG): container finished" podID="6cca8af8-1856-49d9-92df-58a419a99696" 
containerID="8d012ef1e86516d304f8be04ab18591ddcca25659b1793baaed04b49ab615d2e" exitCode=0 Apr 21 02:46:22.925023 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:22.924678 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw" event={"ID":"6cca8af8-1856-49d9-92df-58a419a99696","Type":"ContainerDied","Data":"8d012ef1e86516d304f8be04ab18591ddcca25659b1793baaed04b49ab615d2e"} Apr 21 02:46:25.934304 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:25.934268 2564 generic.go:358] "Generic (PLEG): container finished" podID="6cca8af8-1856-49d9-92df-58a419a99696" containerID="8286c89150c00c55ecaeac4b250c783ce64f51f9b84249e078c438864153f67f" exitCode=0 Apr 21 02:46:25.934758 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:25.934335 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw" event={"ID":"6cca8af8-1856-49d9-92df-58a419a99696","Type":"ContainerDied","Data":"8286c89150c00c55ecaeac4b250c783ce64f51f9b84249e078c438864153f67f"} Apr 21 02:46:32.955993 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:32.955959 2564 generic.go:358] "Generic (PLEG): container finished" podID="6cca8af8-1856-49d9-92df-58a419a99696" containerID="c7187649276bfa568f07f4d2fcd3fee6b0442dc79331f441c8ace2595427bf07" exitCode=0 Apr 21 02:46:32.956440 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:32.956014 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw" event={"ID":"6cca8af8-1856-49d9-92df-58a419a99696","Type":"ContainerDied","Data":"c7187649276bfa568f07f4d2fcd3fee6b0442dc79331f441c8ace2595427bf07"} Apr 21 02:46:34.081321 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:34.081300 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw" Apr 21 02:46:34.176594 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:34.176562 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cca8af8-1856-49d9-92df-58a419a99696-util\") pod \"6cca8af8-1856-49d9-92df-58a419a99696\" (UID: \"6cca8af8-1856-49d9-92df-58a419a99696\") " Apr 21 02:46:34.176746 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:34.176607 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cca8af8-1856-49d9-92df-58a419a99696-bundle\") pod \"6cca8af8-1856-49d9-92df-58a419a99696\" (UID: \"6cca8af8-1856-49d9-92df-58a419a99696\") " Apr 21 02:46:34.176746 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:34.176634 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb2bw\" (UniqueName: \"kubernetes.io/projected/6cca8af8-1856-49d9-92df-58a419a99696-kube-api-access-tb2bw\") pod \"6cca8af8-1856-49d9-92df-58a419a99696\" (UID: \"6cca8af8-1856-49d9-92df-58a419a99696\") " Apr 21 02:46:34.177303 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:34.177277 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cca8af8-1856-49d9-92df-58a419a99696-bundle" (OuterVolumeSpecName: "bundle") pod "6cca8af8-1856-49d9-92df-58a419a99696" (UID: "6cca8af8-1856-49d9-92df-58a419a99696"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:46:34.178824 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:34.178799 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cca8af8-1856-49d9-92df-58a419a99696-kube-api-access-tb2bw" (OuterVolumeSpecName: "kube-api-access-tb2bw") pod "6cca8af8-1856-49d9-92df-58a419a99696" (UID: "6cca8af8-1856-49d9-92df-58a419a99696"). InnerVolumeSpecName "kube-api-access-tb2bw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:46:34.180487 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:34.180466 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cca8af8-1856-49d9-92df-58a419a99696-util" (OuterVolumeSpecName: "util") pod "6cca8af8-1856-49d9-92df-58a419a99696" (UID: "6cca8af8-1856-49d9-92df-58a419a99696"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:46:34.277715 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:34.277693 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cca8af8-1856-49d9-92df-58a419a99696-util\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:46:34.277715 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:34.277714 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cca8af8-1856-49d9-92df-58a419a99696-bundle\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:46:34.277854 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:34.277724 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tb2bw\" (UniqueName: \"kubernetes.io/projected/6cca8af8-1856-49d9-92df-58a419a99696-kube-api-access-tb2bw\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:46:34.961948 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:34.961928 2564 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw" Apr 21 02:46:34.962099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:34.961919 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2k9xw" event={"ID":"6cca8af8-1856-49d9-92df-58a419a99696","Type":"ContainerDied","Data":"7489d02bd71fb2b252873b8e016fe4e2c5061fd86a6893af053669a3b4d5502f"} Apr 21 02:46:34.962099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:34.962039 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7489d02bd71fb2b252873b8e016fe4e2c5061fd86a6893af053669a3b4d5502f" Apr 21 02:46:39.445030 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.444995 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mk25k"] Apr 21 02:46:39.445484 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.445282 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cca8af8-1856-49d9-92df-58a419a99696" containerName="util" Apr 21 02:46:39.445484 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.445293 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cca8af8-1856-49d9-92df-58a419a99696" containerName="util" Apr 21 02:46:39.445484 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.445317 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cca8af8-1856-49d9-92df-58a419a99696" containerName="extract" Apr 21 02:46:39.445484 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.445322 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cca8af8-1856-49d9-92df-58a419a99696" containerName="extract" Apr 21 02:46:39.445484 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.445333 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="6cca8af8-1856-49d9-92df-58a419a99696" containerName="pull" Apr 21 02:46:39.445484 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.445338 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cca8af8-1856-49d9-92df-58a419a99696" containerName="pull" Apr 21 02:46:39.445484 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.445383 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6cca8af8-1856-49d9-92df-58a419a99696" containerName="extract" Apr 21 02:46:39.492601 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.492572 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mk25k"] Apr 21 02:46:39.492722 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.492672 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mk25k" Apr 21 02:46:39.495191 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.495163 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 21 02:46:39.495314 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.495297 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-b4kjg\"" Apr 21 02:46:39.495366 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.495354 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 21 02:46:39.514735 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.514713 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4g7q\" (UniqueName: \"kubernetes.io/projected/e8563a8a-69bb-4ad5-a2e6-97e0c2ba5a26-kube-api-access-p4g7q\") pod 
\"cert-manager-operator-controller-manager-54b9655956-mk25k\" (UID: \"e8563a8a-69bb-4ad5-a2e6-97e0c2ba5a26\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mk25k" Apr 21 02:46:39.514830 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.514751 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e8563a8a-69bb-4ad5-a2e6-97e0c2ba5a26-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-mk25k\" (UID: \"e8563a8a-69bb-4ad5-a2e6-97e0c2ba5a26\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mk25k" Apr 21 02:46:39.615624 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.615600 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4g7q\" (UniqueName: \"kubernetes.io/projected/e8563a8a-69bb-4ad5-a2e6-97e0c2ba5a26-kube-api-access-p4g7q\") pod \"cert-manager-operator-controller-manager-54b9655956-mk25k\" (UID: \"e8563a8a-69bb-4ad5-a2e6-97e0c2ba5a26\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mk25k" Apr 21 02:46:39.615715 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.615642 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e8563a8a-69bb-4ad5-a2e6-97e0c2ba5a26-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-mk25k\" (UID: \"e8563a8a-69bb-4ad5-a2e6-97e0c2ba5a26\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mk25k" Apr 21 02:46:39.615950 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.615935 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e8563a8a-69bb-4ad5-a2e6-97e0c2ba5a26-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-mk25k\" (UID: \"e8563a8a-69bb-4ad5-a2e6-97e0c2ba5a26\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mk25k" Apr 21 02:46:39.623845 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.623826 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4g7q\" (UniqueName: \"kubernetes.io/projected/e8563a8a-69bb-4ad5-a2e6-97e0c2ba5a26-kube-api-access-p4g7q\") pod \"cert-manager-operator-controller-manager-54b9655956-mk25k\" (UID: \"e8563a8a-69bb-4ad5-a2e6-97e0c2ba5a26\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mk25k" Apr 21 02:46:39.802115 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.802092 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mk25k" Apr 21 02:46:39.924532 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.924487 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mk25k"] Apr 21 02:46:39.927921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:46:39.927891 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8563a8a_69bb_4ad5_a2e6_97e0c2ba5a26.slice/crio-552d2f488bc4b86357f87ef9a5bc578e2c501945028b205ff9f775e7b70684ff WatchSource:0}: Error finding container 552d2f488bc4b86357f87ef9a5bc578e2c501945028b205ff9f775e7b70684ff: Status 404 returned error can't find the container with id 552d2f488bc4b86357f87ef9a5bc578e2c501945028b205ff9f775e7b70684ff Apr 21 02:46:39.983694 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:39.983667 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mk25k" event={"ID":"e8563a8a-69bb-4ad5-a2e6-97e0c2ba5a26","Type":"ContainerStarted","Data":"552d2f488bc4b86357f87ef9a5bc578e2c501945028b205ff9f775e7b70684ff"} Apr 21 02:46:41.992521 
ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:41.992464 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mk25k" event={"ID":"e8563a8a-69bb-4ad5-a2e6-97e0c2ba5a26","Type":"ContainerStarted","Data":"886717bc2c3b320a5816b4d3b5dbef0684bc144f302ab292facf309588ffeb09"} Apr 21 02:46:42.026325 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:42.026268 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mk25k" podStartSLOduration=1.531335716 podStartE2EDuration="3.026252991s" podCreationTimestamp="2026-04-21 02:46:39 +0000 UTC" firstStartedPulling="2026-04-21 02:46:39.930400603 +0000 UTC m=+332.697889270" lastFinishedPulling="2026-04-21 02:46:41.425317877 +0000 UTC m=+334.192806545" observedRunningTime="2026-04-21 02:46:42.023065652 +0000 UTC m=+334.790554340" watchObservedRunningTime="2026-04-21 02:46:42.026252991 +0000 UTC m=+334.793741680" Apr 21 02:46:43.720129 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:43.720094 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8"] Apr 21 02:46:43.723779 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:43.723760 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" Apr 21 02:46:43.726085 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:43.726063 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 02:46:43.726085 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:43.726079 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ssj29\"" Apr 21 02:46:43.726861 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:43.726841 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 02:46:43.731154 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:43.730777 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8"] Apr 21 02:46:43.745079 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:43.745048 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spwdg\" (UniqueName: \"kubernetes.io/projected/635b3ac9-dca5-40bd-992f-ce6eabc0b656-kube-api-access-spwdg\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8\" (UID: \"635b3ac9-dca5-40bd-992f-ce6eabc0b656\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" Apr 21 02:46:43.745198 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:43.745112 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/635b3ac9-dca5-40bd-992f-ce6eabc0b656-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8\" (UID: \"635b3ac9-dca5-40bd-992f-ce6eabc0b656\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" Apr 21 02:46:43.745198 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:43.745149 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/635b3ac9-dca5-40bd-992f-ce6eabc0b656-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8\" (UID: \"635b3ac9-dca5-40bd-992f-ce6eabc0b656\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" Apr 21 02:46:43.846263 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:43.846233 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spwdg\" (UniqueName: \"kubernetes.io/projected/635b3ac9-dca5-40bd-992f-ce6eabc0b656-kube-api-access-spwdg\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8\" (UID: \"635b3ac9-dca5-40bd-992f-ce6eabc0b656\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" Apr 21 02:46:43.846420 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:43.846275 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/635b3ac9-dca5-40bd-992f-ce6eabc0b656-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8\" (UID: \"635b3ac9-dca5-40bd-992f-ce6eabc0b656\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" Apr 21 02:46:43.846420 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:43.846308 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/635b3ac9-dca5-40bd-992f-ce6eabc0b656-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8\" (UID: \"635b3ac9-dca5-40bd-992f-ce6eabc0b656\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" Apr 21 02:46:43.846738 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:43.846633 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/635b3ac9-dca5-40bd-992f-ce6eabc0b656-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8\" (UID: \"635b3ac9-dca5-40bd-992f-ce6eabc0b656\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" Apr 21 02:46:43.846738 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:43.846679 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/635b3ac9-dca5-40bd-992f-ce6eabc0b656-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8\" (UID: \"635b3ac9-dca5-40bd-992f-ce6eabc0b656\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" Apr 21 02:46:43.853835 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:43.853815 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spwdg\" (UniqueName: \"kubernetes.io/projected/635b3ac9-dca5-40bd-992f-ce6eabc0b656-kube-api-access-spwdg\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8\" (UID: \"635b3ac9-dca5-40bd-992f-ce6eabc0b656\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" Apr 21 02:46:44.034398 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:44.034375 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" Apr 21 02:46:44.151440 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:44.151365 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8"] Apr 21 02:46:44.153451 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:46:44.153424 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod635b3ac9_dca5_40bd_992f_ce6eabc0b656.slice/crio-8c5a72b5f83e72ba4221ab3dd6c8bd40463735f0504c6124f7bfe8761f9174c0 WatchSource:0}: Error finding container 8c5a72b5f83e72ba4221ab3dd6c8bd40463735f0504c6124f7bfe8761f9174c0: Status 404 returned error can't find the container with id 8c5a72b5f83e72ba4221ab3dd6c8bd40463735f0504c6124f7bfe8761f9174c0 Apr 21 02:46:44.805131 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:44.805102 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-js9lk"] Apr 21 02:46:44.807985 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:44.807966 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-js9lk" Apr 21 02:46:44.810384 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:44.810366 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 21 02:46:44.810485 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:44.810405 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 21 02:46:44.811037 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:44.811020 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-ph684\"" Apr 21 02:46:44.814884 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:44.814862 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-js9lk"] Apr 21 02:46:44.854975 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:44.854944 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d59tl\" (UniqueName: \"kubernetes.io/projected/b34640aa-4c52-406f-a484-3c8fa331d4ef-kube-api-access-d59tl\") pod \"cert-manager-webhook-587ccfb98-js9lk\" (UID: \"b34640aa-4c52-406f-a484-3c8fa331d4ef\") " pod="cert-manager/cert-manager-webhook-587ccfb98-js9lk" Apr 21 02:46:44.855064 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:44.855042 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b34640aa-4c52-406f-a484-3c8fa331d4ef-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-js9lk\" (UID: \"b34640aa-4c52-406f-a484-3c8fa331d4ef\") " pod="cert-manager/cert-manager-webhook-587ccfb98-js9lk" Apr 21 02:46:44.956342 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:44.956318 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d59tl\" (UniqueName: \"kubernetes.io/projected/b34640aa-4c52-406f-a484-3c8fa331d4ef-kube-api-access-d59tl\") pod \"cert-manager-webhook-587ccfb98-js9lk\" (UID: \"b34640aa-4c52-406f-a484-3c8fa331d4ef\") " pod="cert-manager/cert-manager-webhook-587ccfb98-js9lk" Apr 21 02:46:44.956427 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:44.956362 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b34640aa-4c52-406f-a484-3c8fa331d4ef-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-js9lk\" (UID: \"b34640aa-4c52-406f-a484-3c8fa331d4ef\") " pod="cert-manager/cert-manager-webhook-587ccfb98-js9lk" Apr 21 02:46:44.964461 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:44.964436 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b34640aa-4c52-406f-a484-3c8fa331d4ef-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-js9lk\" (UID: \"b34640aa-4c52-406f-a484-3c8fa331d4ef\") " pod="cert-manager/cert-manager-webhook-587ccfb98-js9lk" Apr 21 02:46:44.964583 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:44.964568 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d59tl\" (UniqueName: \"kubernetes.io/projected/b34640aa-4c52-406f-a484-3c8fa331d4ef-kube-api-access-d59tl\") pod \"cert-manager-webhook-587ccfb98-js9lk\" (UID: \"b34640aa-4c52-406f-a484-3c8fa331d4ef\") " pod="cert-manager/cert-manager-webhook-587ccfb98-js9lk" Apr 21 02:46:45.005052 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:45.005029 2564 generic.go:358] "Generic (PLEG): container finished" podID="635b3ac9-dca5-40bd-992f-ce6eabc0b656" containerID="adb58b084bc630d639f5d9b112bb8f375bbde0848a3a7d7b4e32ac9dcf2b2395" exitCode=0 Apr 21 02:46:45.005151 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:45.005064 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" event={"ID":"635b3ac9-dca5-40bd-992f-ce6eabc0b656","Type":"ContainerDied","Data":"adb58b084bc630d639f5d9b112bb8f375bbde0848a3a7d7b4e32ac9dcf2b2395"} Apr 21 02:46:45.005151 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:45.005089 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" event={"ID":"635b3ac9-dca5-40bd-992f-ce6eabc0b656","Type":"ContainerStarted","Data":"8c5a72b5f83e72ba4221ab3dd6c8bd40463735f0504c6124f7bfe8761f9174c0"} Apr 21 02:46:45.135104 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:45.135040 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-js9lk" Apr 21 02:46:45.252740 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:45.252719 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-js9lk"] Apr 21 02:46:45.255167 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:46:45.255139 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb34640aa_4c52_406f_a484_3c8fa331d4ef.slice/crio-4062c3cef7a5fb4652e8720d53ce790c02cbafa4b8261463342440324d521d69 WatchSource:0}: Error finding container 4062c3cef7a5fb4652e8720d53ce790c02cbafa4b8261463342440324d521d69: Status 404 returned error can't find the container with id 4062c3cef7a5fb4652e8720d53ce790c02cbafa4b8261463342440324d521d69 Apr 21 02:46:46.010632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:46.010583 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-js9lk" event={"ID":"b34640aa-4c52-406f-a484-3c8fa331d4ef","Type":"ContainerStarted","Data":"4062c3cef7a5fb4652e8720d53ce790c02cbafa4b8261463342440324d521d69"} Apr 21 02:46:47.639252 ip-10-0-131-170 kubenswrapper[2564]: 
I0421 02:46:47.639223 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-f9vrs"] Apr 21 02:46:47.642819 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:47.642796 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-f9vrs" Apr 21 02:46:47.645629 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:47.645597 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-z79wv\"" Apr 21 02:46:47.653514 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:47.653472 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-f9vrs"] Apr 21 02:46:47.677744 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:47.677718 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd4adff6-05f1-43de-8794-2c8abffd8714-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-f9vrs\" (UID: \"bd4adff6-05f1-43de-8794-2c8abffd8714\") " pod="cert-manager/cert-manager-cainjector-68b757865b-f9vrs" Apr 21 02:46:47.677864 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:47.677842 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qdd7\" (UniqueName: \"kubernetes.io/projected/bd4adff6-05f1-43de-8794-2c8abffd8714-kube-api-access-2qdd7\") pod \"cert-manager-cainjector-68b757865b-f9vrs\" (UID: \"bd4adff6-05f1-43de-8794-2c8abffd8714\") " pod="cert-manager/cert-manager-cainjector-68b757865b-f9vrs" Apr 21 02:46:47.779270 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:47.779236 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qdd7\" (UniqueName: \"kubernetes.io/projected/bd4adff6-05f1-43de-8794-2c8abffd8714-kube-api-access-2qdd7\") pod 
\"cert-manager-cainjector-68b757865b-f9vrs\" (UID: \"bd4adff6-05f1-43de-8794-2c8abffd8714\") " pod="cert-manager/cert-manager-cainjector-68b757865b-f9vrs" Apr 21 02:46:47.779425 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:47.779302 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd4adff6-05f1-43de-8794-2c8abffd8714-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-f9vrs\" (UID: \"bd4adff6-05f1-43de-8794-2c8abffd8714\") " pod="cert-manager/cert-manager-cainjector-68b757865b-f9vrs" Apr 21 02:46:47.787428 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:47.787402 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd4adff6-05f1-43de-8794-2c8abffd8714-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-f9vrs\" (UID: \"bd4adff6-05f1-43de-8794-2c8abffd8714\") " pod="cert-manager/cert-manager-cainjector-68b757865b-f9vrs" Apr 21 02:46:47.787688 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:47.787666 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qdd7\" (UniqueName: \"kubernetes.io/projected/bd4adff6-05f1-43de-8794-2c8abffd8714-kube-api-access-2qdd7\") pod \"cert-manager-cainjector-68b757865b-f9vrs\" (UID: \"bd4adff6-05f1-43de-8794-2c8abffd8714\") " pod="cert-manager/cert-manager-cainjector-68b757865b-f9vrs" Apr 21 02:46:47.953905 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:47.953879 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-f9vrs" Apr 21 02:46:48.036864 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:48.036009 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" event={"ID":"635b3ac9-dca5-40bd-992f-ce6eabc0b656","Type":"ContainerStarted","Data":"e3ba398f694a74c49b5f7bf6e7ae5275151d91a2c2501c2b5fc43da1ebd23a5a"} Apr 21 02:46:48.038471 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:48.038433 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-js9lk" event={"ID":"b34640aa-4c52-406f-a484-3c8fa331d4ef","Type":"ContainerStarted","Data":"0e476377b1675d3f80af27dce0f5551e3b2da9810836e87aa37f6d01de11ed70"} Apr 21 02:46:48.038956 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:48.038936 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-js9lk" Apr 21 02:46:48.067090 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:48.066870 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-js9lk" podStartSLOduration=1.397147996 podStartE2EDuration="4.06685362s" podCreationTimestamp="2026-04-21 02:46:44 +0000 UTC" firstStartedPulling="2026-04-21 02:46:45.256914521 +0000 UTC m=+338.024403188" lastFinishedPulling="2026-04-21 02:46:47.926620131 +0000 UTC m=+340.694108812" observedRunningTime="2026-04-21 02:46:48.066195083 +0000 UTC m=+340.833683775" watchObservedRunningTime="2026-04-21 02:46:48.06685362 +0000 UTC m=+340.834342310" Apr 21 02:46:48.116984 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:48.116959 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-f9vrs"] Apr 21 02:46:48.131280 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:46:48.131253 2564 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd4adff6_05f1_43de_8794_2c8abffd8714.slice/crio-62a06a5e0e5810d6513e4dae26500a1ed650513151d17dc5316b34f7613b7ee0 WatchSource:0}: Error finding container 62a06a5e0e5810d6513e4dae26500a1ed650513151d17dc5316b34f7613b7ee0: Status 404 returned error can't find the container with id 62a06a5e0e5810d6513e4dae26500a1ed650513151d17dc5316b34f7613b7ee0 Apr 21 02:46:49.042552 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:49.042513 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-f9vrs" event={"ID":"bd4adff6-05f1-43de-8794-2c8abffd8714","Type":"ContainerStarted","Data":"961fc7b875c5f887d6dc74f6c0ab5fc237bde03cd461a7727f6bb5836eea17d1"} Apr 21 02:46:49.042552 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:49.042554 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-f9vrs" event={"ID":"bd4adff6-05f1-43de-8794-2c8abffd8714","Type":"ContainerStarted","Data":"62a06a5e0e5810d6513e4dae26500a1ed650513151d17dc5316b34f7613b7ee0"} Apr 21 02:46:49.044066 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:49.044006 2564 generic.go:358] "Generic (PLEG): container finished" podID="635b3ac9-dca5-40bd-992f-ce6eabc0b656" containerID="e3ba398f694a74c49b5f7bf6e7ae5275151d91a2c2501c2b5fc43da1ebd23a5a" exitCode=0 Apr 21 02:46:49.044149 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:49.044090 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" event={"ID":"635b3ac9-dca5-40bd-992f-ce6eabc0b656","Type":"ContainerDied","Data":"e3ba398f694a74c49b5f7bf6e7ae5275151d91a2c2501c2b5fc43da1ebd23a5a"} Apr 21 02:46:49.075908 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:49.075870 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-cainjector-68b757865b-f9vrs" podStartSLOduration=2.075853876 podStartE2EDuration="2.075853876s" podCreationTimestamp="2026-04-21 02:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:46:49.075125783 +0000 UTC m=+341.842614470" watchObservedRunningTime="2026-04-21 02:46:49.075853876 +0000 UTC m=+341.843342566" Apr 21 02:46:50.048856 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:50.048817 2564 generic.go:358] "Generic (PLEG): container finished" podID="635b3ac9-dca5-40bd-992f-ce6eabc0b656" containerID="86f6469b5a3f363892037da36c162ad04f82af19a44994efbddc302890241e49" exitCode=0 Apr 21 02:46:50.049278 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:50.048897 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" event={"ID":"635b3ac9-dca5-40bd-992f-ce6eabc0b656","Type":"ContainerDied","Data":"86f6469b5a3f363892037da36c162ad04f82af19a44994efbddc302890241e49"} Apr 21 02:46:51.174722 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:51.174686 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" Apr 21 02:46:51.209054 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:51.209031 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/635b3ac9-dca5-40bd-992f-ce6eabc0b656-util\") pod \"635b3ac9-dca5-40bd-992f-ce6eabc0b656\" (UID: \"635b3ac9-dca5-40bd-992f-ce6eabc0b656\") " Apr 21 02:46:51.209180 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:51.209106 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/635b3ac9-dca5-40bd-992f-ce6eabc0b656-bundle\") pod \"635b3ac9-dca5-40bd-992f-ce6eabc0b656\" (UID: \"635b3ac9-dca5-40bd-992f-ce6eabc0b656\") " Apr 21 02:46:51.209180 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:51.209133 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spwdg\" (UniqueName: \"kubernetes.io/projected/635b3ac9-dca5-40bd-992f-ce6eabc0b656-kube-api-access-spwdg\") pod \"635b3ac9-dca5-40bd-992f-ce6eabc0b656\" (UID: \"635b3ac9-dca5-40bd-992f-ce6eabc0b656\") " Apr 21 02:46:51.209511 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:51.209467 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/635b3ac9-dca5-40bd-992f-ce6eabc0b656-bundle" (OuterVolumeSpecName: "bundle") pod "635b3ac9-dca5-40bd-992f-ce6eabc0b656" (UID: "635b3ac9-dca5-40bd-992f-ce6eabc0b656"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:46:51.211232 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:51.211204 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/635b3ac9-dca5-40bd-992f-ce6eabc0b656-kube-api-access-spwdg" (OuterVolumeSpecName: "kube-api-access-spwdg") pod "635b3ac9-dca5-40bd-992f-ce6eabc0b656" (UID: "635b3ac9-dca5-40bd-992f-ce6eabc0b656"). InnerVolumeSpecName "kube-api-access-spwdg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:46:51.214011 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:51.213982 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/635b3ac9-dca5-40bd-992f-ce6eabc0b656-util" (OuterVolumeSpecName: "util") pod "635b3ac9-dca5-40bd-992f-ce6eabc0b656" (UID: "635b3ac9-dca5-40bd-992f-ce6eabc0b656"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:46:51.309791 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:51.309740 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/635b3ac9-dca5-40bd-992f-ce6eabc0b656-bundle\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:46:51.309791 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:51.309761 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-spwdg\" (UniqueName: \"kubernetes.io/projected/635b3ac9-dca5-40bd-992f-ce6eabc0b656-kube-api-access-spwdg\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:46:51.309791 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:51.309772 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/635b3ac9-dca5-40bd-992f-ce6eabc0b656-util\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:46:52.058233 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:52.058163 2564 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" event={"ID":"635b3ac9-dca5-40bd-992f-ce6eabc0b656","Type":"ContainerDied","Data":"8c5a72b5f83e72ba4221ab3dd6c8bd40463735f0504c6124f7bfe8761f9174c0"} Apr 21 02:46:52.058233 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:52.058197 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c5a72b5f83e72ba4221ab3dd6c8bd40463735f0504c6124f7bfe8761f9174c0" Apr 21 02:46:52.058233 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:52.058206 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5lpk8" Apr 21 02:46:55.051766 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:46:55.051737 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-js9lk" Apr 21 02:47:00.914631 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:00.914597 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-86vw7"] Apr 21 02:47:00.915082 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:00.914908 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="635b3ac9-dca5-40bd-992f-ce6eabc0b656" containerName="pull" Apr 21 02:47:00.915082 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:00.914918 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="635b3ac9-dca5-40bd-992f-ce6eabc0b656" containerName="pull" Apr 21 02:47:00.915082 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:00.914934 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="635b3ac9-dca5-40bd-992f-ce6eabc0b656" containerName="extract" Apr 21 02:47:00.915082 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:00.914941 2564 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="635b3ac9-dca5-40bd-992f-ce6eabc0b656" containerName="extract" Apr 21 02:47:00.915082 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:00.914951 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="635b3ac9-dca5-40bd-992f-ce6eabc0b656" containerName="util" Apr 21 02:47:00.915082 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:00.914956 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="635b3ac9-dca5-40bd-992f-ce6eabc0b656" containerName="util" Apr 21 02:47:00.915082 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:00.915004 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="635b3ac9-dca5-40bd-992f-ce6eabc0b656" containerName="extract" Apr 21 02:47:00.936075 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:00.936049 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-86vw7"] Apr 21 02:47:00.936218 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:00.936143 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-86vw7" Apr 21 02:47:00.938888 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:00.938870 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 02:47:00.939764 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:00.939743 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 02:47:00.939764 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:00.939754 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-448mf\"" Apr 21 02:47:00.980358 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:00.980332 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7cwj\" (UniqueName: \"kubernetes.io/projected/d957ce2d-1bc6-4ca8-b1bd-fbb0d792a669-kube-api-access-q7cwj\") pod \"openshift-lws-operator-bfc7f696d-86vw7\" (UID: \"d957ce2d-1bc6-4ca8-b1bd-fbb0d792a669\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-86vw7" Apr 21 02:47:00.980450 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:00.980375 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d957ce2d-1bc6-4ca8-b1bd-fbb0d792a669-tmp\") pod \"openshift-lws-operator-bfc7f696d-86vw7\" (UID: \"d957ce2d-1bc6-4ca8-b1bd-fbb0d792a669\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-86vw7" Apr 21 02:47:01.081325 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:01.081288 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d957ce2d-1bc6-4ca8-b1bd-fbb0d792a669-tmp\") pod \"openshift-lws-operator-bfc7f696d-86vw7\" (UID: 
\"d957ce2d-1bc6-4ca8-b1bd-fbb0d792a669\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-86vw7" Apr 21 02:47:01.081568 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:01.081371 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7cwj\" (UniqueName: \"kubernetes.io/projected/d957ce2d-1bc6-4ca8-b1bd-fbb0d792a669-kube-api-access-q7cwj\") pod \"openshift-lws-operator-bfc7f696d-86vw7\" (UID: \"d957ce2d-1bc6-4ca8-b1bd-fbb0d792a669\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-86vw7" Apr 21 02:47:01.081707 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:01.081687 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d957ce2d-1bc6-4ca8-b1bd-fbb0d792a669-tmp\") pod \"openshift-lws-operator-bfc7f696d-86vw7\" (UID: \"d957ce2d-1bc6-4ca8-b1bd-fbb0d792a669\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-86vw7" Apr 21 02:47:01.090370 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:01.090341 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7cwj\" (UniqueName: \"kubernetes.io/projected/d957ce2d-1bc6-4ca8-b1bd-fbb0d792a669-kube-api-access-q7cwj\") pod \"openshift-lws-operator-bfc7f696d-86vw7\" (UID: \"d957ce2d-1bc6-4ca8-b1bd-fbb0d792a669\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-86vw7" Apr 21 02:47:01.245601 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:01.245522 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-86vw7" Apr 21 02:47:01.363811 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:01.363788 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-86vw7"] Apr 21 02:47:01.365913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:47:01.365888 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd957ce2d_1bc6_4ca8_b1bd_fbb0d792a669.slice/crio-fc0a6630555f85456be9fc6774864e6437b073ee400c3616ef324f2fad8279c1 WatchSource:0}: Error finding container fc0a6630555f85456be9fc6774864e6437b073ee400c3616ef324f2fad8279c1: Status 404 returned error can't find the container with id fc0a6630555f85456be9fc6774864e6437b073ee400c3616ef324f2fad8279c1 Apr 21 02:47:02.093959 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:02.093901 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-86vw7" event={"ID":"d957ce2d-1bc6-4ca8-b1bd-fbb0d792a669","Type":"ContainerStarted","Data":"fc0a6630555f85456be9fc6774864e6437b073ee400c3616ef324f2fad8279c1"} Apr 21 02:47:04.101948 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:04.101910 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-86vw7" event={"ID":"d957ce2d-1bc6-4ca8-b1bd-fbb0d792a669","Type":"ContainerStarted","Data":"efe6b03a194b4046672d5ee6e0eb497b62b1afbde220a44e79f083df6544c119"} Apr 21 02:47:04.117058 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:04.117006 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-86vw7" podStartSLOduration=1.9908847939999998 podStartE2EDuration="4.116990068s" podCreationTimestamp="2026-04-21 02:47:00 +0000 UTC" firstStartedPulling="2026-04-21 02:47:01.36738504 +0000 UTC 
m=+354.134873708" lastFinishedPulling="2026-04-21 02:47:03.493490315 +0000 UTC m=+356.260978982" observedRunningTime="2026-04-21 02:47:04.116595775 +0000 UTC m=+356.884084465" watchObservedRunningTime="2026-04-21 02:47:04.116990068 +0000 UTC m=+356.884478758" Apr 21 02:47:08.139231 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:08.139202 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc"] Apr 21 02:47:08.144362 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:08.144345 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc" Apr 21 02:47:08.146747 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:08.146727 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 02:47:08.147867 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:08.147688 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 02:47:08.147867 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:08.147731 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ssj29\"" Apr 21 02:47:08.149875 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:08.149850 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc"] Apr 21 02:47:08.239612 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:08.239586 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e3ef58c-5480-4e1e-bc57-5a27066e9963-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc\" (UID: 
\"4e3ef58c-5480-4e1e-bc57-5a27066e9963\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc" Apr 21 02:47:08.239743 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:08.239616 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdcq9\" (UniqueName: \"kubernetes.io/projected/4e3ef58c-5480-4e1e-bc57-5a27066e9963-kube-api-access-pdcq9\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc\" (UID: \"4e3ef58c-5480-4e1e-bc57-5a27066e9963\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc" Apr 21 02:47:08.239743 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:08.239665 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e3ef58c-5480-4e1e-bc57-5a27066e9963-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc\" (UID: \"4e3ef58c-5480-4e1e-bc57-5a27066e9963\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc" Apr 21 02:47:08.340272 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:08.340243 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e3ef58c-5480-4e1e-bc57-5a27066e9963-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc\" (UID: \"4e3ef58c-5480-4e1e-bc57-5a27066e9963\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc" Apr 21 02:47:08.340375 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:08.340300 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e3ef58c-5480-4e1e-bc57-5a27066e9963-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc\" (UID: 
\"4e3ef58c-5480-4e1e-bc57-5a27066e9963\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc" Apr 21 02:47:08.340375 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:08.340336 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdcq9\" (UniqueName: \"kubernetes.io/projected/4e3ef58c-5480-4e1e-bc57-5a27066e9963-kube-api-access-pdcq9\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc\" (UID: \"4e3ef58c-5480-4e1e-bc57-5a27066e9963\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc" Apr 21 02:47:08.340747 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:08.340727 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e3ef58c-5480-4e1e-bc57-5a27066e9963-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc\" (UID: \"4e3ef58c-5480-4e1e-bc57-5a27066e9963\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc" Apr 21 02:47:08.340832 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:08.340774 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e3ef58c-5480-4e1e-bc57-5a27066e9963-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc\" (UID: \"4e3ef58c-5480-4e1e-bc57-5a27066e9963\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc" Apr 21 02:47:08.349203 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:08.349173 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdcq9\" (UniqueName: \"kubernetes.io/projected/4e3ef58c-5480-4e1e-bc57-5a27066e9963-kube-api-access-pdcq9\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc\" (UID: \"4e3ef58c-5480-4e1e-bc57-5a27066e9963\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc" Apr 21 02:47:08.454244 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:08.454184 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc" Apr 21 02:47:08.581169 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:08.581141 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc"] Apr 21 02:47:08.582940 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:47:08.582915 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e3ef58c_5480_4e1e_bc57_5a27066e9963.slice/crio-f57d8c1a2433a996f39448130db5adfb78ba8418267638120cf9d53237096b07 WatchSource:0}: Error finding container f57d8c1a2433a996f39448130db5adfb78ba8418267638120cf9d53237096b07: Status 404 returned error can't find the container with id f57d8c1a2433a996f39448130db5adfb78ba8418267638120cf9d53237096b07 Apr 21 02:47:09.119269 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:09.119240 2564 generic.go:358] "Generic (PLEG): container finished" podID="4e3ef58c-5480-4e1e-bc57-5a27066e9963" containerID="90f33686c2e76aa1cc816299bacb132e2be7e38113b959a1206162adb8c92374" exitCode=0 Apr 21 02:47:09.119401 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:09.119297 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc" event={"ID":"4e3ef58c-5480-4e1e-bc57-5a27066e9963","Type":"ContainerDied","Data":"90f33686c2e76aa1cc816299bacb132e2be7e38113b959a1206162adb8c92374"} Apr 21 02:47:09.119401 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:09.119324 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc" event={"ID":"4e3ef58c-5480-4e1e-bc57-5a27066e9963","Type":"ContainerStarted","Data":"f57d8c1a2433a996f39448130db5adfb78ba8418267638120cf9d53237096b07"} Apr 21 02:47:10.123622 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:10.123588 2564 generic.go:358] "Generic (PLEG): container finished" podID="4e3ef58c-5480-4e1e-bc57-5a27066e9963" containerID="7aaa9e670eba1533d7f59abe421f39726a806c55c2dafaba2040cade32c1d0a8" exitCode=0 Apr 21 02:47:10.123966 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:10.123665 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc" event={"ID":"4e3ef58c-5480-4e1e-bc57-5a27066e9963","Type":"ContainerDied","Data":"7aaa9e670eba1533d7f59abe421f39726a806c55c2dafaba2040cade32c1d0a8"} Apr 21 02:47:11.129869 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:11.129837 2564 generic.go:358] "Generic (PLEG): container finished" podID="4e3ef58c-5480-4e1e-bc57-5a27066e9963" containerID="edb07c88606d14d39a3311584e7e55d5adebaaa854f61d99cdca57cd4a25e123" exitCode=0 Apr 21 02:47:11.130313 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:11.129914 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc" event={"ID":"4e3ef58c-5480-4e1e-bc57-5a27066e9963","Type":"ContainerDied","Data":"edb07c88606d14d39a3311584e7e55d5adebaaa854f61d99cdca57cd4a25e123"} Apr 21 02:47:12.248966 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:12.248946 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc" Apr 21 02:47:12.367548 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:12.367521 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e3ef58c-5480-4e1e-bc57-5a27066e9963-bundle\") pod \"4e3ef58c-5480-4e1e-bc57-5a27066e9963\" (UID: \"4e3ef58c-5480-4e1e-bc57-5a27066e9963\") " Apr 21 02:47:12.367548 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:12.367551 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e3ef58c-5480-4e1e-bc57-5a27066e9963-util\") pod \"4e3ef58c-5480-4e1e-bc57-5a27066e9963\" (UID: \"4e3ef58c-5480-4e1e-bc57-5a27066e9963\") " Apr 21 02:47:12.367705 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:12.367621 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdcq9\" (UniqueName: \"kubernetes.io/projected/4e3ef58c-5480-4e1e-bc57-5a27066e9963-kube-api-access-pdcq9\") pod \"4e3ef58c-5480-4e1e-bc57-5a27066e9963\" (UID: \"4e3ef58c-5480-4e1e-bc57-5a27066e9963\") " Apr 21 02:47:12.368408 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:12.368366 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e3ef58c-5480-4e1e-bc57-5a27066e9963-bundle" (OuterVolumeSpecName: "bundle") pod "4e3ef58c-5480-4e1e-bc57-5a27066e9963" (UID: "4e3ef58c-5480-4e1e-bc57-5a27066e9963"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:47:12.369670 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:12.369644 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e3ef58c-5480-4e1e-bc57-5a27066e9963-kube-api-access-pdcq9" (OuterVolumeSpecName: "kube-api-access-pdcq9") pod "4e3ef58c-5480-4e1e-bc57-5a27066e9963" (UID: "4e3ef58c-5480-4e1e-bc57-5a27066e9963"). InnerVolumeSpecName "kube-api-access-pdcq9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:47:12.373392 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:12.373367 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e3ef58c-5480-4e1e-bc57-5a27066e9963-util" (OuterVolumeSpecName: "util") pod "4e3ef58c-5480-4e1e-bc57-5a27066e9963" (UID: "4e3ef58c-5480-4e1e-bc57-5a27066e9963"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:47:12.468143 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:12.468088 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pdcq9\" (UniqueName: \"kubernetes.io/projected/4e3ef58c-5480-4e1e-bc57-5a27066e9963-kube-api-access-pdcq9\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:47:12.468143 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:12.468111 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e3ef58c-5480-4e1e-bc57-5a27066e9963-bundle\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:47:12.468143 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:12.468120 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e3ef58c-5480-4e1e-bc57-5a27066e9963-util\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:47:13.140092 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:13.140066 2564 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc" Apr 21 02:47:13.140253 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:13.140058 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59zgqc" event={"ID":"4e3ef58c-5480-4e1e-bc57-5a27066e9963","Type":"ContainerDied","Data":"f57d8c1a2433a996f39448130db5adfb78ba8418267638120cf9d53237096b07"} Apr 21 02:47:13.140253 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:13.140177 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f57d8c1a2433a996f39448130db5adfb78ba8418267638120cf9d53237096b07" Apr 21 02:47:17.742700 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:17.742663 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg"] Apr 21 02:47:17.743103 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:17.742981 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e3ef58c-5480-4e1e-bc57-5a27066e9963" containerName="extract" Apr 21 02:47:17.743103 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:17.742991 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e3ef58c-5480-4e1e-bc57-5a27066e9963" containerName="extract" Apr 21 02:47:17.743103 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:17.743000 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e3ef58c-5480-4e1e-bc57-5a27066e9963" containerName="util" Apr 21 02:47:17.743103 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:17.743005 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e3ef58c-5480-4e1e-bc57-5a27066e9963" containerName="util" Apr 21 02:47:17.743103 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:17.743012 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="4e3ef58c-5480-4e1e-bc57-5a27066e9963" containerName="pull"
Apr 21 02:47:17.743103 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:17.743017 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e3ef58c-5480-4e1e-bc57-5a27066e9963" containerName="pull"
Apr 21 02:47:17.743103 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:17.743079 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e3ef58c-5480-4e1e-bc57-5a27066e9963" containerName="extract"
Apr 21 02:47:17.748939 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:17.748914 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg"
Apr 21 02:47:17.751414 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:17.751389 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 02:47:17.751586 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:17.751563 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 02:47:17.751688 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:17.751556 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ssj29\""
Apr 21 02:47:17.757038 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:17.756640 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg"]
Apr 21 02:47:17.907145 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:17.907108 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0c3c831-4c03-4343-9204-dedb1220c06c-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg\" (UID: \"e0c3c831-4c03-4343-9204-dedb1220c06c\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg"
Apr 21 02:47:17.907290 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:17.907171 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw98j\" (UniqueName: \"kubernetes.io/projected/e0c3c831-4c03-4343-9204-dedb1220c06c-kube-api-access-jw98j\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg\" (UID: \"e0c3c831-4c03-4343-9204-dedb1220c06c\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg"
Apr 21 02:47:17.907290 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:17.907204 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0c3c831-4c03-4343-9204-dedb1220c06c-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg\" (UID: \"e0c3c831-4c03-4343-9204-dedb1220c06c\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg"
Apr 21 02:47:18.008486 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:18.008464 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0c3c831-4c03-4343-9204-dedb1220c06c-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg\" (UID: \"e0c3c831-4c03-4343-9204-dedb1220c06c\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg"
Apr 21 02:47:18.008616 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:18.008516 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jw98j\" (UniqueName: \"kubernetes.io/projected/e0c3c831-4c03-4343-9204-dedb1220c06c-kube-api-access-jw98j\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg\" (UID: \"e0c3c831-4c03-4343-9204-dedb1220c06c\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg"
Apr 21 02:47:18.008616 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:18.008538 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0c3c831-4c03-4343-9204-dedb1220c06c-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg\" (UID: \"e0c3c831-4c03-4343-9204-dedb1220c06c\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg"
Apr 21 02:47:18.008806 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:18.008791 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0c3c831-4c03-4343-9204-dedb1220c06c-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg\" (UID: \"e0c3c831-4c03-4343-9204-dedb1220c06c\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg"
Apr 21 02:47:18.008872 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:18.008856 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0c3c831-4c03-4343-9204-dedb1220c06c-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg\" (UID: \"e0c3c831-4c03-4343-9204-dedb1220c06c\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg"
Apr 21 02:47:18.016924 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:18.016902 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw98j\" (UniqueName: \"kubernetes.io/projected/e0c3c831-4c03-4343-9204-dedb1220c06c-kube-api-access-jw98j\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg\" (UID: \"e0c3c831-4c03-4343-9204-dedb1220c06c\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg"
Apr 21 02:47:18.059863 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:18.059843 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg"
Apr 21 02:47:18.182329 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:18.182291 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg"]
Apr 21 02:47:18.185942 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:47:18.185908 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0c3c831_4c03_4343_9204_dedb1220c06c.slice/crio-b0b127d05181a20f03e19e4d950ce06b2ca786e4b9ddbdc6819848f34d0bdfff WatchSource:0}: Error finding container b0b127d05181a20f03e19e4d950ce06b2ca786e4b9ddbdc6819848f34d0bdfff: Status 404 returned error can't find the container with id b0b127d05181a20f03e19e4d950ce06b2ca786e4b9ddbdc6819848f34d0bdfff
Apr 21 02:47:19.006605 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.006551 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"]
Apr 21 02:47:19.012417 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.012390 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"
Apr 21 02:47:19.017598 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.017445 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 21 02:47:19.017768 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.017724 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 21 02:47:19.017768 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.017724 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-5js6v\""
Apr 21 02:47:19.017913 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.017892 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 21 02:47:19.028085 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.028060 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"]
Apr 21 02:47:19.119074 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.119040 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/69f41614-cb4c-43c1-86b6-431d8ffb9de8-manager-config\") pod \"lws-controller-manager-64dc57f969-r8kj9\" (UID: \"69f41614-cb4c-43c1-86b6-431d8ffb9de8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"
Apr 21 02:47:19.119074 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.119077 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/69f41614-cb4c-43c1-86b6-431d8ffb9de8-cert\") pod \"lws-controller-manager-64dc57f969-r8kj9\" (UID: \"69f41614-cb4c-43c1-86b6-431d8ffb9de8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"
Apr 21 02:47:19.119327 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.119099 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/69f41614-cb4c-43c1-86b6-431d8ffb9de8-metrics-cert\") pod \"lws-controller-manager-64dc57f969-r8kj9\" (UID: \"69f41614-cb4c-43c1-86b6-431d8ffb9de8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"
Apr 21 02:47:19.119327 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.119171 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgm65\" (UniqueName: \"kubernetes.io/projected/69f41614-cb4c-43c1-86b6-431d8ffb9de8-kube-api-access-fgm65\") pod \"lws-controller-manager-64dc57f969-r8kj9\" (UID: \"69f41614-cb4c-43c1-86b6-431d8ffb9de8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"
Apr 21 02:47:19.164713 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.164681 2564 generic.go:358] "Generic (PLEG): container finished" podID="e0c3c831-4c03-4343-9204-dedb1220c06c" containerID="a16b3355ebbce8cb45b9424a977aac3754670cbd0c4cca37b8d3c1c7e2659f7e" exitCode=0
Apr 21 02:47:19.164857 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.164759 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg" event={"ID":"e0c3c831-4c03-4343-9204-dedb1220c06c","Type":"ContainerDied","Data":"a16b3355ebbce8cb45b9424a977aac3754670cbd0c4cca37b8d3c1c7e2659f7e"}
Apr 21 02:47:19.164857 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.164791 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg" event={"ID":"e0c3c831-4c03-4343-9204-dedb1220c06c","Type":"ContainerStarted","Data":"b0b127d05181a20f03e19e4d950ce06b2ca786e4b9ddbdc6819848f34d0bdfff"}
Apr 21 02:47:19.220326 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.220300 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/69f41614-cb4c-43c1-86b6-431d8ffb9de8-manager-config\") pod \"lws-controller-manager-64dc57f969-r8kj9\" (UID: \"69f41614-cb4c-43c1-86b6-431d8ffb9de8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"
Apr 21 02:47:19.220447 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.220334 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/69f41614-cb4c-43c1-86b6-431d8ffb9de8-cert\") pod \"lws-controller-manager-64dc57f969-r8kj9\" (UID: \"69f41614-cb4c-43c1-86b6-431d8ffb9de8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"
Apr 21 02:47:19.220447 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.220354 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/69f41614-cb4c-43c1-86b6-431d8ffb9de8-metrics-cert\") pod \"lws-controller-manager-64dc57f969-r8kj9\" (UID: \"69f41614-cb4c-43c1-86b6-431d8ffb9de8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"
Apr 21 02:47:19.220578 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.220457 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgm65\" (UniqueName: \"kubernetes.io/projected/69f41614-cb4c-43c1-86b6-431d8ffb9de8-kube-api-access-fgm65\") pod \"lws-controller-manager-64dc57f969-r8kj9\" (UID: \"69f41614-cb4c-43c1-86b6-431d8ffb9de8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"
Apr 21 02:47:19.221018 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.220991 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/69f41614-cb4c-43c1-86b6-431d8ffb9de8-manager-config\") pod \"lws-controller-manager-64dc57f969-r8kj9\" (UID: \"69f41614-cb4c-43c1-86b6-431d8ffb9de8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"
Apr 21 02:47:19.223019 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.222996 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/69f41614-cb4c-43c1-86b6-431d8ffb9de8-cert\") pod \"lws-controller-manager-64dc57f969-r8kj9\" (UID: \"69f41614-cb4c-43c1-86b6-431d8ffb9de8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"
Apr 21 02:47:19.223395 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.223375 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/69f41614-cb4c-43c1-86b6-431d8ffb9de8-metrics-cert\") pod \"lws-controller-manager-64dc57f969-r8kj9\" (UID: \"69f41614-cb4c-43c1-86b6-431d8ffb9de8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"
Apr 21 02:47:19.236398 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.236370 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgm65\" (UniqueName: \"kubernetes.io/projected/69f41614-cb4c-43c1-86b6-431d8ffb9de8-kube-api-access-fgm65\") pod \"lws-controller-manager-64dc57f969-r8kj9\" (UID: \"69f41614-cb4c-43c1-86b6-431d8ffb9de8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"
Apr 21 02:47:19.338338 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.338274 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"
Apr 21 02:47:19.475962 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.475934 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"]
Apr 21 02:47:19.477387 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:47:19.477358 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69f41614_cb4c_43c1_86b6_431d8ffb9de8.slice/crio-0264720825f11659a7bd0c700296c7217eb0c9421035fbee1d85d1ff4ec00020 WatchSource:0}: Error finding container 0264720825f11659a7bd0c700296c7217eb0c9421035fbee1d85d1ff4ec00020: Status 404 returned error can't find the container with id 0264720825f11659a7bd0c700296c7217eb0c9421035fbee1d85d1ff4ec00020
Apr 21 02:47:19.931661 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.931632 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc"]
Apr 21 02:47:19.937212 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.937190 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc"
Apr 21 02:47:19.940269 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.940247 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 21 02:47:19.940416 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.940350 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-pmqhb\""
Apr 21 02:47:19.940940 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.940916 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 21 02:47:19.941087 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.941066 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 21 02:47:19.941277 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.941264 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 21 02:47:19.951631 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:19.951608 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc"]
Apr 21 02:47:20.027489 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:20.027455 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmzwt\" (UniqueName: \"kubernetes.io/projected/d3385a01-1bf2-49ca-b232-f651e0f598f1-kube-api-access-fmzwt\") pod \"opendatahub-operator-controller-manager-5f4d6bff-5ppcc\" (UID: \"d3385a01-1bf2-49ca-b232-f651e0f598f1\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc"
Apr 21 02:47:20.027903 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:20.027532 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d3385a01-1bf2-49ca-b232-f651e0f598f1-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f4d6bff-5ppcc\" (UID: \"d3385a01-1bf2-49ca-b232-f651e0f598f1\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc"
Apr 21 02:47:20.027903 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:20.027650 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d3385a01-1bf2-49ca-b232-f651e0f598f1-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f4d6bff-5ppcc\" (UID: \"d3385a01-1bf2-49ca-b232-f651e0f598f1\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc"
Apr 21 02:47:20.128469 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:20.128388 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmzwt\" (UniqueName: \"kubernetes.io/projected/d3385a01-1bf2-49ca-b232-f651e0f598f1-kube-api-access-fmzwt\") pod \"opendatahub-operator-controller-manager-5f4d6bff-5ppcc\" (UID: \"d3385a01-1bf2-49ca-b232-f651e0f598f1\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc"
Apr 21 02:47:20.128469 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:20.128434 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d3385a01-1bf2-49ca-b232-f651e0f598f1-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f4d6bff-5ppcc\" (UID: \"d3385a01-1bf2-49ca-b232-f651e0f598f1\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc"
Apr 21 02:47:20.128703 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:20.128472 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d3385a01-1bf2-49ca-b232-f651e0f598f1-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f4d6bff-5ppcc\" (UID: \"d3385a01-1bf2-49ca-b232-f651e0f598f1\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc"
Apr 21 02:47:20.131571 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:20.131543 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d3385a01-1bf2-49ca-b232-f651e0f598f1-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f4d6bff-5ppcc\" (UID: \"d3385a01-1bf2-49ca-b232-f651e0f598f1\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc"
Apr 21 02:47:20.131671 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:20.131584 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d3385a01-1bf2-49ca-b232-f651e0f598f1-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f4d6bff-5ppcc\" (UID: \"d3385a01-1bf2-49ca-b232-f651e0f598f1\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc"
Apr 21 02:47:20.139481 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:20.139459 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmzwt\" (UniqueName: \"kubernetes.io/projected/d3385a01-1bf2-49ca-b232-f651e0f598f1-kube-api-access-fmzwt\") pod \"opendatahub-operator-controller-manager-5f4d6bff-5ppcc\" (UID: \"d3385a01-1bf2-49ca-b232-f651e0f598f1\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc"
Apr 21 02:47:20.169913 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:20.169880 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9" event={"ID":"69f41614-cb4c-43c1-86b6-431d8ffb9de8","Type":"ContainerStarted","Data":"0264720825f11659a7bd0c700296c7217eb0c9421035fbee1d85d1ff4ec00020"}
Apr 21 02:47:20.171910 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:20.171882 2564 generic.go:358] "Generic (PLEG): container finished" podID="e0c3c831-4c03-4343-9204-dedb1220c06c" containerID="979a71f402d04374f53b8a98e372df529a8d95355ca9e6f90c75509bef409cc5" exitCode=0
Apr 21 02:47:20.172034 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:20.171917 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg" event={"ID":"e0c3c831-4c03-4343-9204-dedb1220c06c","Type":"ContainerDied","Data":"979a71f402d04374f53b8a98e372df529a8d95355ca9e6f90c75509bef409cc5"}
Apr 21 02:47:20.290340 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:20.290309 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc"
Apr 21 02:47:20.439700 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:20.439671 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc"]
Apr 21 02:47:20.442189 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:47:20.442155 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3385a01_1bf2_49ca_b232_f651e0f598f1.slice/crio-a7ba783fab5821ee6717c6471525b984620b1ed1a95ef3a42897d1a3390b4e5b WatchSource:0}: Error finding container a7ba783fab5821ee6717c6471525b984620b1ed1a95ef3a42897d1a3390b4e5b: Status 404 returned error can't find the container with id a7ba783fab5821ee6717c6471525b984620b1ed1a95ef3a42897d1a3390b4e5b
Apr 21 02:47:21.181299 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:21.181108 2564 generic.go:358] "Generic (PLEG): container finished" podID="e0c3c831-4c03-4343-9204-dedb1220c06c" containerID="167854ff3b8a714e83f12d7b801403524129b6372021ae5b3fb9c32c04955fb9" exitCode=0
Apr 21 02:47:21.181299 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:21.181246 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg" event={"ID":"e0c3c831-4c03-4343-9204-dedb1220c06c","Type":"ContainerDied","Data":"167854ff3b8a714e83f12d7b801403524129b6372021ae5b3fb9c32c04955fb9"}
Apr 21 02:47:21.182707 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:21.182578 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc" event={"ID":"d3385a01-1bf2-49ca-b232-f651e0f598f1","Type":"ContainerStarted","Data":"a7ba783fab5821ee6717c6471525b984620b1ed1a95ef3a42897d1a3390b4e5b"}
Apr 21 02:47:21.185393 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:21.185368 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9" event={"ID":"69f41614-cb4c-43c1-86b6-431d8ffb9de8","Type":"ContainerStarted","Data":"dc4758aa6d73bcef14686a1f8af9e4024780a747cd1f9f9e9a5b52054feb281e"}
Apr 21 02:47:21.185947 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:21.185920 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"
Apr 21 02:47:21.236022 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:21.234940 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9" podStartSLOduration=1.631664141 podStartE2EDuration="3.234922591s" podCreationTimestamp="2026-04-21 02:47:18 +0000 UTC" firstStartedPulling="2026-04-21 02:47:19.479251485 +0000 UTC m=+372.246740152" lastFinishedPulling="2026-04-21 02:47:21.082509921 +0000 UTC m=+373.849998602" observedRunningTime="2026-04-21 02:47:21.233035538 +0000 UTC m=+374.000524228" watchObservedRunningTime="2026-04-21 02:47:21.234922591 +0000 UTC m=+374.002411279"
Apr 21 02:47:23.146300 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:23.146272 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg"
Apr 21 02:47:23.194755 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:23.194736 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg"
Apr 21 02:47:23.194873 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:23.194792 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9z89hg" event={"ID":"e0c3c831-4c03-4343-9204-dedb1220c06c","Type":"ContainerDied","Data":"b0b127d05181a20f03e19e4d950ce06b2ca786e4b9ddbdc6819848f34d0bdfff"}
Apr 21 02:47:23.194873 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:23.194818 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0b127d05181a20f03e19e4d950ce06b2ca786e4b9ddbdc6819848f34d0bdfff"
Apr 21 02:47:23.262276 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:23.262247 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0c3c831-4c03-4343-9204-dedb1220c06c-bundle\") pod \"e0c3c831-4c03-4343-9204-dedb1220c06c\" (UID: \"e0c3c831-4c03-4343-9204-dedb1220c06c\") "
Apr 21 02:47:23.262382 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:23.262356 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw98j\" (UniqueName: \"kubernetes.io/projected/e0c3c831-4c03-4343-9204-dedb1220c06c-kube-api-access-jw98j\") pod \"e0c3c831-4c03-4343-9204-dedb1220c06c\" (UID: \"e0c3c831-4c03-4343-9204-dedb1220c06c\") "
Apr 21 02:47:23.262445 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:23.262394 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0c3c831-4c03-4343-9204-dedb1220c06c-util\") pod \"e0c3c831-4c03-4343-9204-dedb1220c06c\" (UID: \"e0c3c831-4c03-4343-9204-dedb1220c06c\") "
Apr 21 02:47:23.263111 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:23.263081 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0c3c831-4c03-4343-9204-dedb1220c06c-bundle" (OuterVolumeSpecName: "bundle") pod "e0c3c831-4c03-4343-9204-dedb1220c06c" (UID: "e0c3c831-4c03-4343-9204-dedb1220c06c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 02:47:23.264964 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:23.264900 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0c3c831-4c03-4343-9204-dedb1220c06c-kube-api-access-jw98j" (OuterVolumeSpecName: "kube-api-access-jw98j") pod "e0c3c831-4c03-4343-9204-dedb1220c06c" (UID: "e0c3c831-4c03-4343-9204-dedb1220c06c"). InnerVolumeSpecName "kube-api-access-jw98j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 02:47:23.269190 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:23.269167 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0c3c831-4c03-4343-9204-dedb1220c06c-util" (OuterVolumeSpecName: "util") pod "e0c3c831-4c03-4343-9204-dedb1220c06c" (UID: "e0c3c831-4c03-4343-9204-dedb1220c06c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 02:47:23.363545 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:23.363515 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jw98j\" (UniqueName: \"kubernetes.io/projected/e0c3c831-4c03-4343-9204-dedb1220c06c-kube-api-access-jw98j\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:47:23.363545 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:23.363543 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0c3c831-4c03-4343-9204-dedb1220c06c-util\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:47:23.363545 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:23.363552 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0c3c831-4c03-4343-9204-dedb1220c06c-bundle\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:47:24.199200 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:24.199160 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc" event={"ID":"d3385a01-1bf2-49ca-b232-f651e0f598f1","Type":"ContainerStarted","Data":"92f5b09f370499ecfb688802e2902b3c7f7ef37fbfcd1bbbd2e4743bcf1be926"}
Apr 21 02:47:24.199564 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:24.199393 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc"
Apr 21 02:47:24.217002 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:24.216952 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc" podStartSLOduration=2.473400409 podStartE2EDuration="5.216936968s" podCreationTimestamp="2026-04-21 02:47:19 +0000 UTC" firstStartedPulling="2026-04-21 02:47:20.444073835 +0000 UTC m=+373.211562507" lastFinishedPulling="2026-04-21 02:47:23.187610398 +0000 UTC m=+375.955099066" observedRunningTime="2026-04-21 02:47:24.216698962 +0000 UTC m=+376.984187652" watchObservedRunningTime="2026-04-21 02:47:24.216936968 +0000 UTC m=+376.984425658"
Apr 21 02:47:33.197641 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:33.197610 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-64dc57f969-r8kj9"
Apr 21 02:47:35.204224 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:35.204194 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-5ppcc"
Apr 21 02:47:47.497059 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.497020 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp"]
Apr 21 02:47:47.497593 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.497574 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0c3c831-4c03-4343-9204-dedb1220c06c" containerName="pull"
Apr 21 02:47:47.497677 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.497595 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c3c831-4c03-4343-9204-dedb1220c06c" containerName="pull"
Apr 21 02:47:47.497677 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.497607 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0c3c831-4c03-4343-9204-dedb1220c06c" containerName="extract"
Apr 21 02:47:47.497677 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.497615 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c3c831-4c03-4343-9204-dedb1220c06c" containerName="extract"
Apr 21 02:47:47.497677 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.497631 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0c3c831-4c03-4343-9204-dedb1220c06c" containerName="util"
Apr 21 02:47:47.497677 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.497639 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c3c831-4c03-4343-9204-dedb1220c06c" containerName="util"
Apr 21 02:47:47.497925 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.497736 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0c3c831-4c03-4343-9204-dedb1220c06c" containerName="extract"
Apr 21 02:47:47.502517 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.502475 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp"
Apr 21 02:47:47.504859 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.504840 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 02:47:47.504965 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.504882 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ssj29\""
Apr 21 02:47:47.505785 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.505764 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 02:47:47.508070 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.508048 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp"]
Apr 21 02:47:47.534349 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.534328 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2356717-d1a5-4490-a69a-695ba1b5c405-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp\" (UID: \"a2356717-d1a5-4490-a69a-695ba1b5c405\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp"
Apr 21 02:47:47.534446 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.534370 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2356717-d1a5-4490-a69a-695ba1b5c405-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp\" (UID: \"a2356717-d1a5-4490-a69a-695ba1b5c405\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp"
Apr 21 02:47:47.534446 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.534395 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dbhj\" (UniqueName: \"kubernetes.io/projected/a2356717-d1a5-4490-a69a-695ba1b5c405-kube-api-access-4dbhj\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp\" (UID: \"a2356717-d1a5-4490-a69a-695ba1b5c405\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp"
Apr 21 02:47:47.635090 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.635066 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2356717-d1a5-4490-a69a-695ba1b5c405-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp\" (UID: \"a2356717-d1a5-4490-a69a-695ba1b5c405\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp"
Apr 21 02:47:47.635174 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.635104 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dbhj\" (UniqueName: \"kubernetes.io/projected/a2356717-d1a5-4490-a69a-695ba1b5c405-kube-api-access-4dbhj\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp\" (UID: \"a2356717-d1a5-4490-a69a-695ba1b5c405\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp"
Apr 21 02:47:47.635174 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.635158 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2356717-d1a5-4490-a69a-695ba1b5c405-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp\" (UID: \"a2356717-d1a5-4490-a69a-695ba1b5c405\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp"
Apr 21 02:47:47.635412 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.635395 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2356717-d1a5-4490-a69a-695ba1b5c405-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp\" (UID: \"a2356717-d1a5-4490-a69a-695ba1b5c405\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp"
Apr 21 02:47:47.635513 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.635479 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2356717-d1a5-4490-a69a-695ba1b5c405-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp\" (UID: \"a2356717-d1a5-4490-a69a-695ba1b5c405\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp"
Apr 21 02:47:47.643819 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.643802 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dbhj\" (UniqueName: \"kubernetes.io/projected/a2356717-d1a5-4490-a69a-695ba1b5c405-kube-api-access-4dbhj\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp\" (UID: \"a2356717-d1a5-4490-a69a-695ba1b5c405\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp"
Apr 21 02:47:47.813216 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.813197 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp"
Apr 21 02:47:47.936465 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:47.936432 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp"]
Apr 21 02:47:47.939915 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:47:47.939881 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2356717_d1a5_4490_a69a_695ba1b5c405.slice/crio-2c055cb2ced9a871395fd3ed74e61fc36fa4d79c70a3b6252bd8b230695915dc WatchSource:0}: Error finding container 2c055cb2ced9a871395fd3ed74e61fc36fa4d79c70a3b6252bd8b230695915dc: Status 404 returned error can't find the container with id 2c055cb2ced9a871395fd3ed74e61fc36fa4d79c70a3b6252bd8b230695915dc
Apr 21 02:47:48.279784 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:48.279748 2564 generic.go:358] "Generic (PLEG): container finished" podID="a2356717-d1a5-4490-a69a-695ba1b5c405" containerID="18b88cf03f69a0d0b33303c5b12978ecd4ea3d58e6c5f872f4387d6be3d45ca2" exitCode=0
Apr 21 02:47:48.279944 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:48.279785 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp" event={"ID":"a2356717-d1a5-4490-a69a-695ba1b5c405","Type":"ContainerDied","Data":"18b88cf03f69a0d0b33303c5b12978ecd4ea3d58e6c5f872f4387d6be3d45ca2"}
Apr 21 02:47:48.279944 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:48.279828 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp"
event={"ID":"a2356717-d1a5-4490-a69a-695ba1b5c405","Type":"ContainerStarted","Data":"2c055cb2ced9a871395fd3ed74e61fc36fa4d79c70a3b6252bd8b230695915dc"} Apr 21 02:47:49.284767 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:49.284701 2564 generic.go:358] "Generic (PLEG): container finished" podID="a2356717-d1a5-4490-a69a-695ba1b5c405" containerID="1bfb6791369e8f3fa62f1fa712b9996b8e089194384dc57f29af26cf875974c1" exitCode=0 Apr 21 02:47:49.284767 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:49.284737 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp" event={"ID":"a2356717-d1a5-4490-a69a-695ba1b5c405","Type":"ContainerDied","Data":"1bfb6791369e8f3fa62f1fa712b9996b8e089194384dc57f29af26cf875974c1"} Apr 21 02:47:50.289374 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:50.289342 2564 generic.go:358] "Generic (PLEG): container finished" podID="a2356717-d1a5-4490-a69a-695ba1b5c405" containerID="bdc2725adbdd675c7de49bb4f2960af00999f0d0884fa788ac6d3d6e8a20c56c" exitCode=0 Apr 21 02:47:50.289755 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:50.289379 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp" event={"ID":"a2356717-d1a5-4490-a69a-695ba1b5c405","Type":"ContainerDied","Data":"bdc2725adbdd675c7de49bb4f2960af00999f0d0884fa788ac6d3d6e8a20c56c"} Apr 21 02:47:51.413219 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:51.413198 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp" Apr 21 02:47:51.463152 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:51.463124 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dbhj\" (UniqueName: \"kubernetes.io/projected/a2356717-d1a5-4490-a69a-695ba1b5c405-kube-api-access-4dbhj\") pod \"a2356717-d1a5-4490-a69a-695ba1b5c405\" (UID: \"a2356717-d1a5-4490-a69a-695ba1b5c405\") " Apr 21 02:47:51.463274 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:51.463163 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2356717-d1a5-4490-a69a-695ba1b5c405-util\") pod \"a2356717-d1a5-4490-a69a-695ba1b5c405\" (UID: \"a2356717-d1a5-4490-a69a-695ba1b5c405\") " Apr 21 02:47:51.463274 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:51.463229 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2356717-d1a5-4490-a69a-695ba1b5c405-bundle\") pod \"a2356717-d1a5-4490-a69a-695ba1b5c405\" (UID: \"a2356717-d1a5-4490-a69a-695ba1b5c405\") " Apr 21 02:47:51.464134 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:51.464105 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2356717-d1a5-4490-a69a-695ba1b5c405-bundle" (OuterVolumeSpecName: "bundle") pod "a2356717-d1a5-4490-a69a-695ba1b5c405" (UID: "a2356717-d1a5-4490-a69a-695ba1b5c405"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:47:51.465112 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:51.465091 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2356717-d1a5-4490-a69a-695ba1b5c405-kube-api-access-4dbhj" (OuterVolumeSpecName: "kube-api-access-4dbhj") pod "a2356717-d1a5-4490-a69a-695ba1b5c405" (UID: "a2356717-d1a5-4490-a69a-695ba1b5c405"). InnerVolumeSpecName "kube-api-access-4dbhj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:47:51.471381 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:51.471356 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2356717-d1a5-4490-a69a-695ba1b5c405-util" (OuterVolumeSpecName: "util") pod "a2356717-d1a5-4490-a69a-695ba1b5c405" (UID: "a2356717-d1a5-4490-a69a-695ba1b5c405"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:47:51.564606 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:51.564550 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2356717-d1a5-4490-a69a-695ba1b5c405-bundle\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:47:51.564606 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:51.564570 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4dbhj\" (UniqueName: \"kubernetes.io/projected/a2356717-d1a5-4490-a69a-695ba1b5c405-kube-api-access-4dbhj\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:47:51.564606 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:51.564581 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2356717-d1a5-4490-a69a-695ba1b5c405-util\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:47:52.297859 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:52.297823 2564 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp" event={"ID":"a2356717-d1a5-4490-a69a-695ba1b5c405","Type":"ContainerDied","Data":"2c055cb2ced9a871395fd3ed74e61fc36fa4d79c70a3b6252bd8b230695915dc"} Apr 21 02:47:52.297859 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:52.297856 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c055cb2ced9a871395fd3ed74e61fc36fa4d79c70a3b6252bd8b230695915dc" Apr 21 02:47:52.298096 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:52.297876 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835r7sfp" Apr 21 02:47:56.645198 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.645158 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4"] Apr 21 02:47:56.645945 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.645913 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2356717-d1a5-4490-a69a-695ba1b5c405" containerName="extract" Apr 21 02:47:56.645945 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.645943 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2356717-d1a5-4490-a69a-695ba1b5c405" containerName="extract" Apr 21 02:47:56.646097 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.645956 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2356717-d1a5-4490-a69a-695ba1b5c405" containerName="util" Apr 21 02:47:56.646097 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.645965 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2356717-d1a5-4490-a69a-695ba1b5c405" containerName="util" Apr 21 02:47:56.646097 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.645993 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="a2356717-d1a5-4490-a69a-695ba1b5c405" containerName="pull" Apr 21 02:47:56.646097 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.646002 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2356717-d1a5-4490-a69a-695ba1b5c405" containerName="pull" Apr 21 02:47:56.646282 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.646158 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2356717-d1a5-4490-a69a-695ba1b5c405" containerName="extract" Apr 21 02:47:56.654412 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.654391 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4" Apr 21 02:47:56.657022 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.656999 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 02:47:56.657898 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.657881 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 02:47:56.657986 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.657884 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ssj29\"" Apr 21 02:47:56.663548 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.663526 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4"] Apr 21 02:47:56.701582 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.701558 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjdp5\" (UniqueName: \"kubernetes.io/projected/1acf7fde-83cf-4969-b786-71dd69951858-kube-api-access-jjdp5\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4\" 
(UID: \"1acf7fde-83cf-4969-b786-71dd69951858\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4" Apr 21 02:47:56.701692 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.701595 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1acf7fde-83cf-4969-b786-71dd69951858-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4\" (UID: \"1acf7fde-83cf-4969-b786-71dd69951858\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4" Apr 21 02:47:56.701692 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.701632 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1acf7fde-83cf-4969-b786-71dd69951858-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4\" (UID: \"1acf7fde-83cf-4969-b786-71dd69951858\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4" Apr 21 02:47:56.802877 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.802851 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjdp5\" (UniqueName: \"kubernetes.io/projected/1acf7fde-83cf-4969-b786-71dd69951858-kube-api-access-jjdp5\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4\" (UID: \"1acf7fde-83cf-4969-b786-71dd69951858\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4" Apr 21 02:47:56.803012 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.802886 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1acf7fde-83cf-4969-b786-71dd69951858-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4\" (UID: 
\"1acf7fde-83cf-4969-b786-71dd69951858\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4" Apr 21 02:47:56.803012 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.802914 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1acf7fde-83cf-4969-b786-71dd69951858-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4\" (UID: \"1acf7fde-83cf-4969-b786-71dd69951858\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4" Apr 21 02:47:56.803230 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.803206 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1acf7fde-83cf-4969-b786-71dd69951858-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4\" (UID: \"1acf7fde-83cf-4969-b786-71dd69951858\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4" Apr 21 02:47:56.803291 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.803243 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1acf7fde-83cf-4969-b786-71dd69951858-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4\" (UID: \"1acf7fde-83cf-4969-b786-71dd69951858\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4" Apr 21 02:47:56.816736 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.816714 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjdp5\" (UniqueName: \"kubernetes.io/projected/1acf7fde-83cf-4969-b786-71dd69951858-kube-api-access-jjdp5\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4\" (UID: \"1acf7fde-83cf-4969-b786-71dd69951858\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4" Apr 21 02:47:56.964067 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:56.963994 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4" Apr 21 02:47:57.308297 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:57.308269 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4"] Apr 21 02:47:57.309557 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:47:57.309528 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1acf7fde_83cf_4969_b786_71dd69951858.slice/crio-d28106cbc53d263175cc0f9534ae0f8f1f2ef7d712ea557dd24f19574ff801c9 WatchSource:0}: Error finding container d28106cbc53d263175cc0f9534ae0f8f1f2ef7d712ea557dd24f19574ff801c9: Status 404 returned error can't find the container with id d28106cbc53d263175cc0f9534ae0f8f1f2ef7d712ea557dd24f19574ff801c9 Apr 21 02:47:57.314905 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:57.314878 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4" event={"ID":"1acf7fde-83cf-4969-b786-71dd69951858","Type":"ContainerStarted","Data":"d28106cbc53d263175cc0f9534ae0f8f1f2ef7d712ea557dd24f19574ff801c9"} Apr 21 02:47:58.319691 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:58.319657 2564 generic.go:358] "Generic (PLEG): container finished" podID="1acf7fde-83cf-4969-b786-71dd69951858" containerID="b4d8d1297ba01e49183f99d3751ae8d4f2cc95bab688be9c4e0d1cbd4395052b" exitCode=0 Apr 21 02:47:58.320047 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:58.319723 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4" event={"ID":"1acf7fde-83cf-4969-b786-71dd69951858","Type":"ContainerDied","Data":"b4d8d1297ba01e49183f99d3751ae8d4f2cc95bab688be9c4e0d1cbd4395052b"} Apr 21 02:47:59.324774 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:59.324742 2564 generic.go:358] "Generic (PLEG): container finished" podID="1acf7fde-83cf-4969-b786-71dd69951858" containerID="7ca5f8bcb45c89a59cf1d93cad2c4d41302df4ee2fa7321e77351f3586eeaee3" exitCode=0 Apr 21 02:47:59.325114 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:47:59.324821 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4" event={"ID":"1acf7fde-83cf-4969-b786-71dd69951858","Type":"ContainerDied","Data":"7ca5f8bcb45c89a59cf1d93cad2c4d41302df4ee2fa7321e77351f3586eeaee3"} Apr 21 02:48:00.329802 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:00.329768 2564 generic.go:358] "Generic (PLEG): container finished" podID="1acf7fde-83cf-4969-b786-71dd69951858" containerID="9fc72877e00a9fd512783ec9b23812d315771fe332453edae34a2b97617665aa" exitCode=0 Apr 21 02:48:00.330163 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:00.329806 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4" event={"ID":"1acf7fde-83cf-4969-b786-71dd69951858","Type":"ContainerDied","Data":"9fc72877e00a9fd512783ec9b23812d315771fe332453edae34a2b97617665aa"} Apr 21 02:48:01.456132 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:01.456110 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4" Apr 21 02:48:01.540730 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:01.540703 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1acf7fde-83cf-4969-b786-71dd69951858-util\") pod \"1acf7fde-83cf-4969-b786-71dd69951858\" (UID: \"1acf7fde-83cf-4969-b786-71dd69951858\") " Apr 21 02:48:01.540850 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:01.540748 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1acf7fde-83cf-4969-b786-71dd69951858-bundle\") pod \"1acf7fde-83cf-4969-b786-71dd69951858\" (UID: \"1acf7fde-83cf-4969-b786-71dd69951858\") " Apr 21 02:48:01.540850 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:01.540800 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjdp5\" (UniqueName: \"kubernetes.io/projected/1acf7fde-83cf-4969-b786-71dd69951858-kube-api-access-jjdp5\") pod \"1acf7fde-83cf-4969-b786-71dd69951858\" (UID: \"1acf7fde-83cf-4969-b786-71dd69951858\") " Apr 21 02:48:01.541609 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:01.541586 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1acf7fde-83cf-4969-b786-71dd69951858-bundle" (OuterVolumeSpecName: "bundle") pod "1acf7fde-83cf-4969-b786-71dd69951858" (UID: "1acf7fde-83cf-4969-b786-71dd69951858"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:48:01.542866 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:01.542837 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1acf7fde-83cf-4969-b786-71dd69951858-kube-api-access-jjdp5" (OuterVolumeSpecName: "kube-api-access-jjdp5") pod "1acf7fde-83cf-4969-b786-71dd69951858" (UID: "1acf7fde-83cf-4969-b786-71dd69951858"). InnerVolumeSpecName "kube-api-access-jjdp5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:48:01.546422 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:01.546386 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1acf7fde-83cf-4969-b786-71dd69951858-util" (OuterVolumeSpecName: "util") pod "1acf7fde-83cf-4969-b786-71dd69951858" (UID: "1acf7fde-83cf-4969-b786-71dd69951858"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:48:01.642099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:01.642042 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1acf7fde-83cf-4969-b786-71dd69951858-util\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:48:01.642099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:01.642069 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1acf7fde-83cf-4969-b786-71dd69951858-bundle\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:48:01.642099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:01.642079 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jjdp5\" (UniqueName: \"kubernetes.io/projected/1acf7fde-83cf-4969-b786-71dd69951858-kube-api-access-jjdp5\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:48:02.338643 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:02.338615 2564 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4" Apr 21 02:48:02.338643 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:02.338626 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2m8xq4" event={"ID":"1acf7fde-83cf-4969-b786-71dd69951858","Type":"ContainerDied","Data":"d28106cbc53d263175cc0f9534ae0f8f1f2ef7d712ea557dd24f19574ff801c9"} Apr 21 02:48:02.338823 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:02.338652 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d28106cbc53d263175cc0f9534ae0f8f1f2ef7d712ea557dd24f19574ff801c9" Apr 21 02:48:26.996910 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:26.996818 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"] Apr 21 02:48:26.997360 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:26.997317 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1acf7fde-83cf-4969-b786-71dd69951858" containerName="util" Apr 21 02:48:26.997360 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:26.997337 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="1acf7fde-83cf-4969-b786-71dd69951858" containerName="util" Apr 21 02:48:26.997483 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:26.997365 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1acf7fde-83cf-4969-b786-71dd69951858" containerName="pull" Apr 21 02:48:26.997483 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:26.997374 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="1acf7fde-83cf-4969-b786-71dd69951858" containerName="pull" Apr 21 02:48:26.997483 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:26.997386 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="1acf7fde-83cf-4969-b786-71dd69951858" containerName="extract" Apr 21 02:48:26.997483 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:26.997395 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="1acf7fde-83cf-4969-b786-71dd69951858" containerName="extract" Apr 21 02:48:26.997723 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:26.997520 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="1acf7fde-83cf-4969-b786-71dd69951858" containerName="extract" Apr 21 02:48:27.000792 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.000767 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh" Apr 21 02:48:27.003376 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.003355 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 02:48:27.003480 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.003409 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 02:48:27.003480 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.003355 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 21 02:48:27.003480 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.003355 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-smvh2\"" Apr 21 02:48:27.011752 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.011731 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"] Apr 21 02:48:27.128903 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.128872 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/57b32747-a408-4d12-b996-fc025af345f9-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.129030 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.128912 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/57b32747-a408-4d12-b996-fc025af345f9-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.129030 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.128977 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/57b32747-a408-4d12-b996-fc025af345f9-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.129139 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.129035 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/57b32747-a408-4d12-b996-fc025af345f9-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.129139 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.129061 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/57b32747-a408-4d12-b996-fc025af345f9-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.129139 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.129124 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtcdf\" (UniqueName: \"kubernetes.io/projected/57b32747-a408-4d12-b996-fc025af345f9-kube-api-access-rtcdf\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.129256 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.129183 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/57b32747-a408-4d12-b996-fc025af345f9-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.129256 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.129208 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/57b32747-a408-4d12-b996-fc025af345f9-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.129256 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.129242 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/57b32747-a408-4d12-b996-fc025af345f9-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.230034 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.230010 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/57b32747-a408-4d12-b996-fc025af345f9-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.230162 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.230062 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtcdf\" (UniqueName: \"kubernetes.io/projected/57b32747-a408-4d12-b996-fc025af345f9-kube-api-access-rtcdf\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.230162 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.230095 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/57b32747-a408-4d12-b996-fc025af345f9-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.230162 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.230118 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/57b32747-a408-4d12-b996-fc025af345f9-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.230162 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.230146 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/57b32747-a408-4d12-b996-fc025af345f9-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.230344 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.230169 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/57b32747-a408-4d12-b996-fc025af345f9-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.230344 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.230195 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/57b32747-a408-4d12-b996-fc025af345f9-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.230344 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.230239 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/57b32747-a408-4d12-b996-fc025af345f9-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.230344 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.230264 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/57b32747-a408-4d12-b996-fc025af345f9-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.230582 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.230427 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/57b32747-a408-4d12-b996-fc025af345f9-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.230582 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.230458 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/57b32747-a408-4d12-b996-fc025af345f9-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.230686 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.230630 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/57b32747-a408-4d12-b996-fc025af345f9-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.230856 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.230837 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/57b32747-a408-4d12-b996-fc025af345f9-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.230924 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.230899 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/57b32747-a408-4d12-b996-fc025af345f9-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.232543 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.232491 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/57b32747-a408-4d12-b996-fc025af345f9-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.232670 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.232653 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/57b32747-a408-4d12-b996-fc025af345f9-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.238042 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.238020 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/57b32747-a408-4d12-b996-fc025af345f9-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.238042 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.238037 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtcdf\" (UniqueName: \"kubernetes.io/projected/57b32747-a408-4d12-b996-fc025af345f9-kube-api-access-rtcdf\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fgw7wh\" (UID: \"57b32747-a408-4d12-b996-fc025af345f9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.312296 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.312269 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:27.438223 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:27.437673 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"]
Apr 21 02:48:27.440533 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:48:27.440482 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b32747_a408_4d12_b996_fc025af345f9.slice/crio-bc221a469b8f3ff3d61c5756be20dda4988157b42e57d513f03929f9d5eb67c7 WatchSource:0}: Error finding container bc221a469b8f3ff3d61c5756be20dda4988157b42e57d513f03929f9d5eb67c7: Status 404 returned error can't find the container with id bc221a469b8f3ff3d61c5756be20dda4988157b42e57d513f03929f9d5eb67c7
Apr 21 02:48:28.438290 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:28.438239 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh" event={"ID":"57b32747-a408-4d12-b996-fc025af345f9","Type":"ContainerStarted","Data":"bc221a469b8f3ff3d61c5756be20dda4988157b42e57d513f03929f9d5eb67c7"}
Apr 21 02:48:29.767233 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:29.767202 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 21 02:48:29.767462 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:29.767271 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 21 02:48:29.767462 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:29.767296 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 21 02:48:30.447030 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:30.446997 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh" event={"ID":"57b32747-a408-4d12-b996-fc025af345f9","Type":"ContainerStarted","Data":"6a6208597cc07526c8aaf00a8ed93d04bd7f2c66fce1fb8f2e692d5e79b14853"}
Apr 21 02:48:30.468171 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:30.468126 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh" podStartSLOduration=2.143208278 podStartE2EDuration="4.468112413s" podCreationTimestamp="2026-04-21 02:48:26 +0000 UTC" firstStartedPulling="2026-04-21 02:48:27.44208626 +0000 UTC m=+440.209574926" lastFinishedPulling="2026-04-21 02:48:29.766990394 +0000 UTC m=+442.534479061" observedRunningTime="2026-04-21 02:48:30.46545968 +0000 UTC m=+443.232948370" watchObservedRunningTime="2026-04-21 02:48:30.468112413 +0000 UTC m=+443.235601101"
Apr 21 02:48:31.313253 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:31.313226 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:31.317405 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:31.317383 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:31.456793 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:31.456768 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:31.457556 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:31.457537 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fgw7wh"
Apr 21 02:48:38.275619 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:38.275585 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ggwbz"]
Apr 21 02:48:38.282094 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:38.282068 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-ggwbz"
Apr 21 02:48:38.284749 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:38.284721 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 21 02:48:38.285902 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:38.285879 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 21 02:48:38.286089 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:38.285879 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-bjq7n\""
Apr 21 02:48:38.287643 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:38.287622 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ggwbz"]
Apr 21 02:48:38.414436 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:38.414407 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l66fq\" (UniqueName: \"kubernetes.io/projected/7cfd54ec-73ad-44aa-a139-e506dad25c85-kube-api-access-l66fq\") pod \"kuadrant-operator-catalog-ggwbz\" (UID: \"7cfd54ec-73ad-44aa-a139-e506dad25c85\") " pod="kuadrant-system/kuadrant-operator-catalog-ggwbz"
Apr 21 02:48:38.515114 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:38.515091 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l66fq\" (UniqueName: \"kubernetes.io/projected/7cfd54ec-73ad-44aa-a139-e506dad25c85-kube-api-access-l66fq\") pod \"kuadrant-operator-catalog-ggwbz\" (UID: \"7cfd54ec-73ad-44aa-a139-e506dad25c85\") " pod="kuadrant-system/kuadrant-operator-catalog-ggwbz"
Apr 21 02:48:38.522636 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:38.522616 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l66fq\" (UniqueName: \"kubernetes.io/projected/7cfd54ec-73ad-44aa-a139-e506dad25c85-kube-api-access-l66fq\") pod \"kuadrant-operator-catalog-ggwbz\" (UID: \"7cfd54ec-73ad-44aa-a139-e506dad25c85\") " pod="kuadrant-system/kuadrant-operator-catalog-ggwbz"
Apr 21 02:48:38.594675 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:38.594610 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-ggwbz"
Apr 21 02:48:38.649808 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:38.649777 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ggwbz"]
Apr 21 02:48:38.729786 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:38.729764 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ggwbz"]
Apr 21 02:48:38.731640 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:48:38.731615 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cfd54ec_73ad_44aa_a139_e506dad25c85.slice/crio-9707c8420ec232771c94328be3dc2c5271f9b7eb88362dc4c8d0e70c5ee59a78 WatchSource:0}: Error finding container 9707c8420ec232771c94328be3dc2c5271f9b7eb88362dc4c8d0e70c5ee59a78: Status 404 returned error can't find the container with id 9707c8420ec232771c94328be3dc2c5271f9b7eb88362dc4c8d0e70c5ee59a78
Apr 21 02:48:38.854955 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:38.854894 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-29vgg"]
Apr 21 02:48:38.859594 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:38.859576 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-29vgg"
Apr 21 02:48:38.864270 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:38.864251 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-29vgg"]
Apr 21 02:48:39.019476 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:39.019450 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5phx9\" (UniqueName: \"kubernetes.io/projected/f4891c25-20cb-4575-b7ba-5f7b41c2f140-kube-api-access-5phx9\") pod \"kuadrant-operator-catalog-29vgg\" (UID: \"f4891c25-20cb-4575-b7ba-5f7b41c2f140\") " pod="kuadrant-system/kuadrant-operator-catalog-29vgg"
Apr 21 02:48:39.120551 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:39.120459 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5phx9\" (UniqueName: \"kubernetes.io/projected/f4891c25-20cb-4575-b7ba-5f7b41c2f140-kube-api-access-5phx9\") pod \"kuadrant-operator-catalog-29vgg\" (UID: \"f4891c25-20cb-4575-b7ba-5f7b41c2f140\") " pod="kuadrant-system/kuadrant-operator-catalog-29vgg"
Apr 21 02:48:39.128014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:39.127997 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5phx9\" (UniqueName: \"kubernetes.io/projected/f4891c25-20cb-4575-b7ba-5f7b41c2f140-kube-api-access-5phx9\") pod \"kuadrant-operator-catalog-29vgg\" (UID: \"f4891c25-20cb-4575-b7ba-5f7b41c2f140\") " pod="kuadrant-system/kuadrant-operator-catalog-29vgg"
Apr 21 02:48:39.170743 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:39.170721 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-29vgg"
Apr 21 02:48:39.354662 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:39.354636 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-29vgg"]
Apr 21 02:48:39.361723 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:48:39.361692 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4891c25_20cb_4575_b7ba_5f7b41c2f140.slice/crio-d350b9d1f7b6b4ccbd18bea2edc9d10adf2cab1c81b0ec4f641a3cbb080d39b4 WatchSource:0}: Error finding container d350b9d1f7b6b4ccbd18bea2edc9d10adf2cab1c81b0ec4f641a3cbb080d39b4: Status 404 returned error can't find the container with id d350b9d1f7b6b4ccbd18bea2edc9d10adf2cab1c81b0ec4f641a3cbb080d39b4
Apr 21 02:48:39.484362 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:39.484287 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-29vgg" event={"ID":"f4891c25-20cb-4575-b7ba-5f7b41c2f140","Type":"ContainerStarted","Data":"d350b9d1f7b6b4ccbd18bea2edc9d10adf2cab1c81b0ec4f641a3cbb080d39b4"}
Apr 21 02:48:39.485442 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:39.485418 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-ggwbz" event={"ID":"7cfd54ec-73ad-44aa-a139-e506dad25c85","Type":"ContainerStarted","Data":"9707c8420ec232771c94328be3dc2c5271f9b7eb88362dc4c8d0e70c5ee59a78"}
Apr 21 02:48:41.495459 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:41.495415 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-29vgg" event={"ID":"f4891c25-20cb-4575-b7ba-5f7b41c2f140","Type":"ContainerStarted","Data":"b8c542addbb69f7ca298c00f2e24faf3dff3310aa9b3cabf7d87ab44f1d55c6e"}
Apr 21 02:48:41.496888 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:41.496863 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-ggwbz" event={"ID":"7cfd54ec-73ad-44aa-a139-e506dad25c85","Type":"ContainerStarted","Data":"313a1a33270f2c6c98f968dd75e27cdc2ff69a9ae704fb7a0c79f60d09882c8b"}
Apr 21 02:48:41.497000 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:41.496935 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-ggwbz" podUID="7cfd54ec-73ad-44aa-a139-e506dad25c85" containerName="registry-server" containerID="cri-o://313a1a33270f2c6c98f968dd75e27cdc2ff69a9ae704fb7a0c79f60d09882c8b" gracePeriod=2
Apr 21 02:48:41.510795 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:41.510719 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-29vgg" podStartSLOduration=1.981043499 podStartE2EDuration="3.510703951s" podCreationTimestamp="2026-04-21 02:48:38 +0000 UTC" firstStartedPulling="2026-04-21 02:48:39.363252394 +0000 UTC m=+452.130741078" lastFinishedPulling="2026-04-21 02:48:40.892912859 +0000 UTC m=+453.660401530" observedRunningTime="2026-04-21 02:48:41.509422101 +0000 UTC m=+454.276910789" watchObservedRunningTime="2026-04-21 02:48:41.510703951 +0000 UTC m=+454.278192641"
Apr 21 02:48:41.522535 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:41.522470 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-ggwbz" podStartSLOduration=1.365688032 podStartE2EDuration="3.522458912s" podCreationTimestamp="2026-04-21 02:48:38 +0000 UTC" firstStartedPulling="2026-04-21 02:48:38.732975108 +0000 UTC m=+451.500463775" lastFinishedPulling="2026-04-21 02:48:40.889745982 +0000 UTC m=+453.657234655" observedRunningTime="2026-04-21 02:48:41.522064844 +0000 UTC m=+454.289553534" watchObservedRunningTime="2026-04-21 02:48:41.522458912 +0000 UTC m=+454.289947601"
Apr 21 02:48:41.733627 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:41.733605 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-ggwbz"
Apr 21 02:48:41.844433 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:41.844408 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l66fq\" (UniqueName: \"kubernetes.io/projected/7cfd54ec-73ad-44aa-a139-e506dad25c85-kube-api-access-l66fq\") pod \"7cfd54ec-73ad-44aa-a139-e506dad25c85\" (UID: \"7cfd54ec-73ad-44aa-a139-e506dad25c85\") "
Apr 21 02:48:41.846654 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:41.846627 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cfd54ec-73ad-44aa-a139-e506dad25c85-kube-api-access-l66fq" (OuterVolumeSpecName: "kube-api-access-l66fq") pod "7cfd54ec-73ad-44aa-a139-e506dad25c85" (UID: "7cfd54ec-73ad-44aa-a139-e506dad25c85"). InnerVolumeSpecName "kube-api-access-l66fq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 02:48:41.948031 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:41.948001 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l66fq\" (UniqueName: \"kubernetes.io/projected/7cfd54ec-73ad-44aa-a139-e506dad25c85-kube-api-access-l66fq\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:48:42.501801 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:42.501764 2564 generic.go:358] "Generic (PLEG): container finished" podID="7cfd54ec-73ad-44aa-a139-e506dad25c85" containerID="313a1a33270f2c6c98f968dd75e27cdc2ff69a9ae704fb7a0c79f60d09882c8b" exitCode=0
Apr 21 02:48:42.502241 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:42.501841 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-ggwbz"
Apr 21 02:48:42.502241 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:42.501853 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-ggwbz" event={"ID":"7cfd54ec-73ad-44aa-a139-e506dad25c85","Type":"ContainerDied","Data":"313a1a33270f2c6c98f968dd75e27cdc2ff69a9ae704fb7a0c79f60d09882c8b"}
Apr 21 02:48:42.502241 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:42.501902 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-ggwbz" event={"ID":"7cfd54ec-73ad-44aa-a139-e506dad25c85","Type":"ContainerDied","Data":"9707c8420ec232771c94328be3dc2c5271f9b7eb88362dc4c8d0e70c5ee59a78"}
Apr 21 02:48:42.502241 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:42.501925 2564 scope.go:117] "RemoveContainer" containerID="313a1a33270f2c6c98f968dd75e27cdc2ff69a9ae704fb7a0c79f60d09882c8b"
Apr 21 02:48:42.510780 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:42.510764 2564 scope.go:117] "RemoveContainer" containerID="313a1a33270f2c6c98f968dd75e27cdc2ff69a9ae704fb7a0c79f60d09882c8b"
Apr 21 02:48:42.510999 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:48:42.510985 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"313a1a33270f2c6c98f968dd75e27cdc2ff69a9ae704fb7a0c79f60d09882c8b\": container with ID starting with 313a1a33270f2c6c98f968dd75e27cdc2ff69a9ae704fb7a0c79f60d09882c8b not found: ID does not exist" containerID="313a1a33270f2c6c98f968dd75e27cdc2ff69a9ae704fb7a0c79f60d09882c8b"
Apr 21 02:48:42.511047 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:42.511006 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"313a1a33270f2c6c98f968dd75e27cdc2ff69a9ae704fb7a0c79f60d09882c8b"} err="failed to get container status \"313a1a33270f2c6c98f968dd75e27cdc2ff69a9ae704fb7a0c79f60d09882c8b\": rpc error: code = NotFound desc = could not find container \"313a1a33270f2c6c98f968dd75e27cdc2ff69a9ae704fb7a0c79f60d09882c8b\": container with ID starting with 313a1a33270f2c6c98f968dd75e27cdc2ff69a9ae704fb7a0c79f60d09882c8b not found: ID does not exist"
Apr 21 02:48:42.522037 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:42.522018 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ggwbz"]
Apr 21 02:48:42.526232 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:42.526212 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ggwbz"]
Apr 21 02:48:43.845035 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:43.845000 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cfd54ec-73ad-44aa-a139-e506dad25c85" path="/var/lib/kubelet/pods/7cfd54ec-73ad-44aa-a139-e506dad25c85/volumes"
Apr 21 02:48:49.171655 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:49.171620 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-29vgg"
Apr 21 02:48:49.171655 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:49.171662 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-29vgg"
Apr 21 02:48:49.192754 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:49.192732 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-29vgg"
Apr 21 02:48:49.546995 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:49.546972 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-29vgg"
Apr 21 02:48:53.890647 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:53.890601 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d"]
Apr 21 02:48:53.891380 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:53.891358 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7cfd54ec-73ad-44aa-a139-e506dad25c85" containerName="registry-server"
Apr 21 02:48:53.891493 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:53.891383 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cfd54ec-73ad-44aa-a139-e506dad25c85" containerName="registry-server"
Apr 21 02:48:53.891605 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:53.891590 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="7cfd54ec-73ad-44aa-a139-e506dad25c85" containerName="registry-server"
Apr 21 02:48:53.896320 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:53.896289 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d"
Apr 21 02:48:53.898671 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:53.898644 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d"]
Apr 21 02:48:53.898896 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:53.898876 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-9hwh5\""
Apr 21 02:48:54.032201 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.032171 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjsk8\" (UniqueName: \"kubernetes.io/projected/6b9875cd-7a08-475a-aee0-931f0e0008f4-kube-api-access-wjsk8\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d\" (UID: \"6b9875cd-7a08-475a-aee0-931f0e0008f4\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d"
Apr 21 02:48:54.032335 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.032228 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b9875cd-7a08-475a-aee0-931f0e0008f4-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d\" (UID: \"6b9875cd-7a08-475a-aee0-931f0e0008f4\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d"
Apr 21 02:48:54.032335 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.032259 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b9875cd-7a08-475a-aee0-931f0e0008f4-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d\" (UID: \"6b9875cd-7a08-475a-aee0-931f0e0008f4\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d"
Apr 21 02:48:54.133622 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.133591 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjsk8\" (UniqueName: \"kubernetes.io/projected/6b9875cd-7a08-475a-aee0-931f0e0008f4-kube-api-access-wjsk8\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d\" (UID: \"6b9875cd-7a08-475a-aee0-931f0e0008f4\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d"
Apr 21 02:48:54.133743 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.133641 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b9875cd-7a08-475a-aee0-931f0e0008f4-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d\" (UID: \"6b9875cd-7a08-475a-aee0-931f0e0008f4\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d"
Apr 21 02:48:54.133743 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.133677 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b9875cd-7a08-475a-aee0-931f0e0008f4-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d\" (UID: \"6b9875cd-7a08-475a-aee0-931f0e0008f4\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d"
Apr 21 02:48:54.134058 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.134033 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b9875cd-7a08-475a-aee0-931f0e0008f4-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d\" (UID: \"6b9875cd-7a08-475a-aee0-931f0e0008f4\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d"
Apr 21 02:48:54.134097 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.134064 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b9875cd-7a08-475a-aee0-931f0e0008f4-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d\" (UID: \"6b9875cd-7a08-475a-aee0-931f0e0008f4\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d"
Apr 21 02:48:54.142104 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.142039 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjsk8\" (UniqueName: \"kubernetes.io/projected/6b9875cd-7a08-475a-aee0-931f0e0008f4-kube-api-access-wjsk8\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d\" (UID: \"6b9875cd-7a08-475a-aee0-931f0e0008f4\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d"
Apr 21 02:48:54.207676 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.207645 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d"
Apr 21 02:48:54.328954 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.328903 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d"]
Apr 21 02:48:54.331913 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:48:54.331880 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b9875cd_7a08_475a_aee0_931f0e0008f4.slice/crio-827f4445a05e41b0341e56f9104fcd9d924f790376c2d6daf75b0c2c82189ba4 WatchSource:0}: Error finding container 827f4445a05e41b0341e56f9104fcd9d924f790376c2d6daf75b0c2c82189ba4: Status 404 returned error can't find the container with id 827f4445a05e41b0341e56f9104fcd9d924f790376c2d6daf75b0c2c82189ba4
Apr 21 02:48:54.497205 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.497178 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw"]
Apr 21 02:48:54.500861 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.500842 2564 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw" Apr 21 02:48:54.510402 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.510382 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw"] Apr 21 02:48:54.543929 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.543904 2564 generic.go:358] "Generic (PLEG): container finished" podID="6b9875cd-7a08-475a-aee0-931f0e0008f4" containerID="5adcba4d86e30e4edd4ce6ca6e797c62345f16d68d979fab15402724127a7b7f" exitCode=0 Apr 21 02:48:54.544045 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.543991 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d" event={"ID":"6b9875cd-7a08-475a-aee0-931f0e0008f4","Type":"ContainerDied","Data":"5adcba4d86e30e4edd4ce6ca6e797c62345f16d68d979fab15402724127a7b7f"} Apr 21 02:48:54.544045 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.544028 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d" event={"ID":"6b9875cd-7a08-475a-aee0-931f0e0008f4","Type":"ContainerStarted","Data":"827f4445a05e41b0341e56f9104fcd9d924f790376c2d6daf75b0c2c82189ba4"} Apr 21 02:48:54.637461 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.637436 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbvhh\" (UniqueName: \"kubernetes.io/projected/733ca859-ae1f-43e6-9b40-b2ff9828431a-kube-api-access-fbvhh\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw\" (UID: \"733ca859-ae1f-43e6-9b40-b2ff9828431a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw" Apr 21 02:48:54.637588 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.637487 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/733ca859-ae1f-43e6-9b40-b2ff9828431a-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw\" (UID: \"733ca859-ae1f-43e6-9b40-b2ff9828431a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw" Apr 21 02:48:54.637652 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.637605 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/733ca859-ae1f-43e6-9b40-b2ff9828431a-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw\" (UID: \"733ca859-ae1f-43e6-9b40-b2ff9828431a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw" Apr 21 02:48:54.738625 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.738577 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/733ca859-ae1f-43e6-9b40-b2ff9828431a-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw\" (UID: \"733ca859-ae1f-43e6-9b40-b2ff9828431a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw" Apr 21 02:48:54.738625 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.738616 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/733ca859-ae1f-43e6-9b40-b2ff9828431a-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw\" (UID: \"733ca859-ae1f-43e6-9b40-b2ff9828431a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw" Apr 21 02:48:54.738744 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.738656 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbvhh\" (UniqueName: 
\"kubernetes.io/projected/733ca859-ae1f-43e6-9b40-b2ff9828431a-kube-api-access-fbvhh\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw\" (UID: \"733ca859-ae1f-43e6-9b40-b2ff9828431a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw" Apr 21 02:48:54.738908 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.738892 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/733ca859-ae1f-43e6-9b40-b2ff9828431a-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw\" (UID: \"733ca859-ae1f-43e6-9b40-b2ff9828431a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw" Apr 21 02:48:54.738973 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.738957 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/733ca859-ae1f-43e6-9b40-b2ff9828431a-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw\" (UID: \"733ca859-ae1f-43e6-9b40-b2ff9828431a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw" Apr 21 02:48:54.749251 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.749234 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbvhh\" (UniqueName: \"kubernetes.io/projected/733ca859-ae1f-43e6-9b40-b2ff9828431a-kube-api-access-fbvhh\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw\" (UID: \"733ca859-ae1f-43e6-9b40-b2ff9828431a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw" Apr 21 02:48:54.810941 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.810917 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw" Apr 21 02:48:54.925800 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:54.925776 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw"] Apr 21 02:48:54.927371 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:48:54.927339 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod733ca859_ae1f_43e6_9b40_b2ff9828431a.slice/crio-3fb0102161e188380821566b56712b2152039e15717680e89845ea2262f93894 WatchSource:0}: Error finding container 3fb0102161e188380821566b56712b2152039e15717680e89845ea2262f93894: Status 404 returned error can't find the container with id 3fb0102161e188380821566b56712b2152039e15717680e89845ea2262f93894 Apr 21 02:48:55.087062 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.087036 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd"] Apr 21 02:48:55.091296 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.091278 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" Apr 21 02:48:55.099229 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.099202 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd"] Apr 21 02:48:55.243338 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.243309 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e43cc539-5833-43d0-a560-b80fc0e4fa0c-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd\" (UID: \"e43cc539-5833-43d0-a560-b80fc0e4fa0c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" Apr 21 02:48:55.243526 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.243357 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e43cc539-5833-43d0-a560-b80fc0e4fa0c-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd\" (UID: \"e43cc539-5833-43d0-a560-b80fc0e4fa0c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" Apr 21 02:48:55.243526 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.243418 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d22d\" (UniqueName: \"kubernetes.io/projected/e43cc539-5833-43d0-a560-b80fc0e4fa0c-kube-api-access-2d22d\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd\" (UID: \"e43cc539-5833-43d0-a560-b80fc0e4fa0c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" Apr 21 02:48:55.344252 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.344224 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2d22d\" (UniqueName: \"kubernetes.io/projected/e43cc539-5833-43d0-a560-b80fc0e4fa0c-kube-api-access-2d22d\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd\" (UID: \"e43cc539-5833-43d0-a560-b80fc0e4fa0c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" Apr 21 02:48:55.344366 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.344295 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e43cc539-5833-43d0-a560-b80fc0e4fa0c-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd\" (UID: \"e43cc539-5833-43d0-a560-b80fc0e4fa0c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" Apr 21 02:48:55.344366 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.344327 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e43cc539-5833-43d0-a560-b80fc0e4fa0c-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd\" (UID: \"e43cc539-5833-43d0-a560-b80fc0e4fa0c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" Apr 21 02:48:55.344662 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.344643 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e43cc539-5833-43d0-a560-b80fc0e4fa0c-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd\" (UID: \"e43cc539-5833-43d0-a560-b80fc0e4fa0c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" Apr 21 02:48:55.344743 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.344665 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e43cc539-5833-43d0-a560-b80fc0e4fa0c-bundle\") pod 
\"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd\" (UID: \"e43cc539-5833-43d0-a560-b80fc0e4fa0c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" Apr 21 02:48:55.353304 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.353279 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d22d\" (UniqueName: \"kubernetes.io/projected/e43cc539-5833-43d0-a560-b80fc0e4fa0c-kube-api-access-2d22d\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd\" (UID: \"e43cc539-5833-43d0-a560-b80fc0e4fa0c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" Apr 21 02:48:55.401544 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.401518 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" Apr 21 02:48:55.489995 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.489967 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz"] Apr 21 02:48:55.495638 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.495617 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz" Apr 21 02:48:55.500639 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.500588 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz"] Apr 21 02:48:55.548964 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.548890 2564 generic.go:358] "Generic (PLEG): container finished" podID="733ca859-ae1f-43e6-9b40-b2ff9828431a" containerID="97aa6e3076b862fb72d2d2b181f1f67f339e25d2c6d8d1e72478d485e8cf2ac8" exitCode=0 Apr 21 02:48:55.549080 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.548973 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw" event={"ID":"733ca859-ae1f-43e6-9b40-b2ff9828431a","Type":"ContainerDied","Data":"97aa6e3076b862fb72d2d2b181f1f67f339e25d2c6d8d1e72478d485e8cf2ac8"} Apr 21 02:48:55.549080 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.549003 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw" event={"ID":"733ca859-ae1f-43e6-9b40-b2ff9828431a","Type":"ContainerStarted","Data":"3fb0102161e188380821566b56712b2152039e15717680e89845ea2262f93894"} Apr 21 02:48:55.550918 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.550897 2564 generic.go:358] "Generic (PLEG): container finished" podID="6b9875cd-7a08-475a-aee0-931f0e0008f4" containerID="190e91da318bf5a7c0f03e4054438af2ac04e2d6935d6324f5af14c63c190f26" exitCode=0 Apr 21 02:48:55.551014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.550943 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d" 
event={"ID":"6b9875cd-7a08-475a-aee0-931f0e0008f4","Type":"ContainerDied","Data":"190e91da318bf5a7c0f03e4054438af2ac04e2d6935d6324f5af14c63c190f26"} Apr 21 02:48:55.647344 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.647317 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/050bdd0b-9b38-4465-a197-62966aab158c-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz\" (UID: \"050bdd0b-9b38-4465-a197-62966aab158c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz" Apr 21 02:48:55.647460 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.647374 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/050bdd0b-9b38-4465-a197-62966aab158c-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz\" (UID: \"050bdd0b-9b38-4465-a197-62966aab158c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz" Apr 21 02:48:55.647642 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.647622 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t4kl\" (UniqueName: \"kubernetes.io/projected/050bdd0b-9b38-4465-a197-62966aab158c-kube-api-access-2t4kl\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz\" (UID: \"050bdd0b-9b38-4465-a197-62966aab158c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz" Apr 21 02:48:55.733630 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.733606 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd"] Apr 21 02:48:55.735313 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:48:55.735290 2564 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode43cc539_5833_43d0_a560_b80fc0e4fa0c.slice/crio-3d004a5b784958eef3f1c8d53db2d531b3c95820b9978546cb40c9d07fc22458 WatchSource:0}: Error finding container 3d004a5b784958eef3f1c8d53db2d531b3c95820b9978546cb40c9d07fc22458: Status 404 returned error can't find the container with id 3d004a5b784958eef3f1c8d53db2d531b3c95820b9978546cb40c9d07fc22458 Apr 21 02:48:55.748334 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.748303 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/050bdd0b-9b38-4465-a197-62966aab158c-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz\" (UID: \"050bdd0b-9b38-4465-a197-62966aab158c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz" Apr 21 02:48:55.748401 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.748360 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/050bdd0b-9b38-4465-a197-62966aab158c-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz\" (UID: \"050bdd0b-9b38-4465-a197-62966aab158c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz" Apr 21 02:48:55.748463 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.748448 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2t4kl\" (UniqueName: \"kubernetes.io/projected/050bdd0b-9b38-4465-a197-62966aab158c-kube-api-access-2t4kl\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz\" (UID: \"050bdd0b-9b38-4465-a197-62966aab158c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz" Apr 21 02:48:55.748663 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.748647 2564 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/050bdd0b-9b38-4465-a197-62966aab158c-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz\" (UID: \"050bdd0b-9b38-4465-a197-62966aab158c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz" Apr 21 02:48:55.748703 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.748685 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/050bdd0b-9b38-4465-a197-62966aab158c-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz\" (UID: \"050bdd0b-9b38-4465-a197-62966aab158c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz" Apr 21 02:48:55.756066 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.756047 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t4kl\" (UniqueName: \"kubernetes.io/projected/050bdd0b-9b38-4465-a197-62966aab158c-kube-api-access-2t4kl\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz\" (UID: \"050bdd0b-9b38-4465-a197-62966aab158c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz" Apr 21 02:48:55.808575 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:55.808521 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz" Apr 21 02:48:56.139550 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:56.139525 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz"] Apr 21 02:48:56.175340 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:48:56.175308 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod050bdd0b_9b38_4465_a197_62966aab158c.slice/crio-9eefea847a1d8bf2025ab2091fd8d84b2e58a618bf725f94d13c307440eb34db WatchSource:0}: Error finding container 9eefea847a1d8bf2025ab2091fd8d84b2e58a618bf725f94d13c307440eb34db: Status 404 returned error can't find the container with id 9eefea847a1d8bf2025ab2091fd8d84b2e58a618bf725f94d13c307440eb34db Apr 21 02:48:56.556559 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:56.556530 2564 generic.go:358] "Generic (PLEG): container finished" podID="6b9875cd-7a08-475a-aee0-931f0e0008f4" containerID="1a903defadf234541e9951d7b685d84eb1d255839e9f029c6e8ebbbf310727c4" exitCode=0 Apr 21 02:48:56.556688 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:56.556569 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d" event={"ID":"6b9875cd-7a08-475a-aee0-931f0e0008f4","Type":"ContainerDied","Data":"1a903defadf234541e9951d7b685d84eb1d255839e9f029c6e8ebbbf310727c4"} Apr 21 02:48:56.558170 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:56.558147 2564 generic.go:358] "Generic (PLEG): container finished" podID="733ca859-ae1f-43e6-9b40-b2ff9828431a" containerID="a24347e1b77b62196a1c06da249666d3d8ba9bd4b208710003edf49807188998" exitCode=0 Apr 21 02:48:56.558288 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:56.558207 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw" event={"ID":"733ca859-ae1f-43e6-9b40-b2ff9828431a","Type":"ContainerDied","Data":"a24347e1b77b62196a1c06da249666d3d8ba9bd4b208710003edf49807188998"} Apr 21 02:48:56.559569 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:56.559545 2564 generic.go:358] "Generic (PLEG): container finished" podID="e43cc539-5833-43d0-a560-b80fc0e4fa0c" containerID="29e2de2ccde3b64be4db889ca74520138580acdb8a9b86037d0a5f2b45927099" exitCode=0 Apr 21 02:48:56.559664 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:56.559605 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" event={"ID":"e43cc539-5833-43d0-a560-b80fc0e4fa0c","Type":"ContainerDied","Data":"29e2de2ccde3b64be4db889ca74520138580acdb8a9b86037d0a5f2b45927099"} Apr 21 02:48:56.559664 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:56.559625 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" event={"ID":"e43cc539-5833-43d0-a560-b80fc0e4fa0c","Type":"ContainerStarted","Data":"3d004a5b784958eef3f1c8d53db2d531b3c95820b9978546cb40c9d07fc22458"} Apr 21 02:48:56.561131 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:56.561113 2564 generic.go:358] "Generic (PLEG): container finished" podID="050bdd0b-9b38-4465-a197-62966aab158c" containerID="5f6b61e1092f1cdbdf28dd6757d8b281f0343b773236c6f596a7a803eab243ab" exitCode=0 Apr 21 02:48:56.561241 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:56.561179 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz" event={"ID":"050bdd0b-9b38-4465-a197-62966aab158c","Type":"ContainerDied","Data":"5f6b61e1092f1cdbdf28dd6757d8b281f0343b773236c6f596a7a803eab243ab"} Apr 21 02:48:56.561241 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:56.561200 2564 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz" event={"ID":"050bdd0b-9b38-4465-a197-62966aab158c","Type":"ContainerStarted","Data":"9eefea847a1d8bf2025ab2091fd8d84b2e58a618bf725f94d13c307440eb34db"} Apr 21 02:48:57.567365 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:57.567251 2564 generic.go:358] "Generic (PLEG): container finished" podID="733ca859-ae1f-43e6-9b40-b2ff9828431a" containerID="923a043625aaad17d707b445113e04a605fbd5885a775115591d98184448e8a4" exitCode=0 Apr 21 02:48:57.567365 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:57.567336 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw" event={"ID":"733ca859-ae1f-43e6-9b40-b2ff9828431a","Type":"ContainerDied","Data":"923a043625aaad17d707b445113e04a605fbd5885a775115591d98184448e8a4"} Apr 21 02:48:57.569082 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:57.569058 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" event={"ID":"e43cc539-5833-43d0-a560-b80fc0e4fa0c","Type":"ContainerStarted","Data":"0d3ef29fb96390db0ff65153598c7f4eff60787a3ba9ddcaade26dfd1ec30fca"} Apr 21 02:48:57.570817 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:57.570797 2564 generic.go:358] "Generic (PLEG): container finished" podID="050bdd0b-9b38-4465-a197-62966aab158c" containerID="397bc1b1bb3c69bb0860f076671a2b2ea4247bbb879b0542cd2aaeae99e1ba5c" exitCode=0 Apr 21 02:48:57.570902 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:57.570875 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz" event={"ID":"050bdd0b-9b38-4465-a197-62966aab158c","Type":"ContainerDied","Data":"397bc1b1bb3c69bb0860f076671a2b2ea4247bbb879b0542cd2aaeae99e1ba5c"} Apr 21 02:48:57.684713 ip-10-0-131-170 
kubenswrapper[2564]: I0421 02:48:57.684692 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d" Apr 21 02:48:57.869710 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:57.869689 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjsk8\" (UniqueName: \"kubernetes.io/projected/6b9875cd-7a08-475a-aee0-931f0e0008f4-kube-api-access-wjsk8\") pod \"6b9875cd-7a08-475a-aee0-931f0e0008f4\" (UID: \"6b9875cd-7a08-475a-aee0-931f0e0008f4\") " Apr 21 02:48:57.869855 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:57.869738 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b9875cd-7a08-475a-aee0-931f0e0008f4-bundle\") pod \"6b9875cd-7a08-475a-aee0-931f0e0008f4\" (UID: \"6b9875cd-7a08-475a-aee0-931f0e0008f4\") " Apr 21 02:48:57.869855 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:57.869758 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b9875cd-7a08-475a-aee0-931f0e0008f4-util\") pod \"6b9875cd-7a08-475a-aee0-931f0e0008f4\" (UID: \"6b9875cd-7a08-475a-aee0-931f0e0008f4\") " Apr 21 02:48:57.870192 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:57.870155 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b9875cd-7a08-475a-aee0-931f0e0008f4-bundle" (OuterVolumeSpecName: "bundle") pod "6b9875cd-7a08-475a-aee0-931f0e0008f4" (UID: "6b9875cd-7a08-475a-aee0-931f0e0008f4"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:48:57.871549 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:57.871530 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9875cd-7a08-475a-aee0-931f0e0008f4-kube-api-access-wjsk8" (OuterVolumeSpecName: "kube-api-access-wjsk8") pod "6b9875cd-7a08-475a-aee0-931f0e0008f4" (UID: "6b9875cd-7a08-475a-aee0-931f0e0008f4"). InnerVolumeSpecName "kube-api-access-wjsk8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:48:57.874725 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:57.874692 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b9875cd-7a08-475a-aee0-931f0e0008f4-util" (OuterVolumeSpecName: "util") pod "6b9875cd-7a08-475a-aee0-931f0e0008f4" (UID: "6b9875cd-7a08-475a-aee0-931f0e0008f4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:48:57.970259 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:57.970200 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wjsk8\" (UniqueName: \"kubernetes.io/projected/6b9875cd-7a08-475a-aee0-931f0e0008f4-kube-api-access-wjsk8\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:48:57.970259 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:57.970219 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b9875cd-7a08-475a-aee0-931f0e0008f4-bundle\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:48:57.970428 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:57.970297 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b9875cd-7a08-475a-aee0-931f0e0008f4-util\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:48:58.575898 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:58.575871 2564 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d" Apr 21 02:48:58.575898 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:58.575885 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d" event={"ID":"6b9875cd-7a08-475a-aee0-931f0e0008f4","Type":"ContainerDied","Data":"827f4445a05e41b0341e56f9104fcd9d924f790376c2d6daf75b0c2c82189ba4"} Apr 21 02:48:58.576394 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:58.575920 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="827f4445a05e41b0341e56f9104fcd9d924f790376c2d6daf75b0c2c82189ba4" Apr 21 02:48:58.577643 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:58.577617 2564 generic.go:358] "Generic (PLEG): container finished" podID="e43cc539-5833-43d0-a560-b80fc0e4fa0c" containerID="0d3ef29fb96390db0ff65153598c7f4eff60787a3ba9ddcaade26dfd1ec30fca" exitCode=0 Apr 21 02:48:58.577760 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:58.577698 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" event={"ID":"e43cc539-5833-43d0-a560-b80fc0e4fa0c","Type":"ContainerDied","Data":"0d3ef29fb96390db0ff65153598c7f4eff60787a3ba9ddcaade26dfd1ec30fca"} Apr 21 02:48:58.579817 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:58.579794 2564 generic.go:358] "Generic (PLEG): container finished" podID="050bdd0b-9b38-4465-a197-62966aab158c" containerID="4c0be185baa15474467f920e1464177b7fbe6ced2051ed5087593e35d0e2033c" exitCode=0 Apr 21 02:48:58.579900 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:58.579824 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz" 
event={"ID":"050bdd0b-9b38-4465-a197-62966aab158c","Type":"ContainerDied","Data":"4c0be185baa15474467f920e1464177b7fbe6ced2051ed5087593e35d0e2033c"} Apr 21 02:48:58.720404 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:58.720382 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw" Apr 21 02:48:58.876543 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:58.876458 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/733ca859-ae1f-43e6-9b40-b2ff9828431a-bundle\") pod \"733ca859-ae1f-43e6-9b40-b2ff9828431a\" (UID: \"733ca859-ae1f-43e6-9b40-b2ff9828431a\") " Apr 21 02:48:58.876543 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:58.876539 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbvhh\" (UniqueName: \"kubernetes.io/projected/733ca859-ae1f-43e6-9b40-b2ff9828431a-kube-api-access-fbvhh\") pod \"733ca859-ae1f-43e6-9b40-b2ff9828431a\" (UID: \"733ca859-ae1f-43e6-9b40-b2ff9828431a\") " Apr 21 02:48:58.876733 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:58.876562 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/733ca859-ae1f-43e6-9b40-b2ff9828431a-util\") pod \"733ca859-ae1f-43e6-9b40-b2ff9828431a\" (UID: \"733ca859-ae1f-43e6-9b40-b2ff9828431a\") " Apr 21 02:48:58.877199 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:58.877164 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/733ca859-ae1f-43e6-9b40-b2ff9828431a-bundle" (OuterVolumeSpecName: "bundle") pod "733ca859-ae1f-43e6-9b40-b2ff9828431a" (UID: "733ca859-ae1f-43e6-9b40-b2ff9828431a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:48:58.878888 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:58.878859 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/733ca859-ae1f-43e6-9b40-b2ff9828431a-kube-api-access-fbvhh" (OuterVolumeSpecName: "kube-api-access-fbvhh") pod "733ca859-ae1f-43e6-9b40-b2ff9828431a" (UID: "733ca859-ae1f-43e6-9b40-b2ff9828431a"). InnerVolumeSpecName "kube-api-access-fbvhh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:48:58.882358 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:58.882321 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/733ca859-ae1f-43e6-9b40-b2ff9828431a-util" (OuterVolumeSpecName: "util") pod "733ca859-ae1f-43e6-9b40-b2ff9828431a" (UID: "733ca859-ae1f-43e6-9b40-b2ff9828431a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:48:58.977831 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:58.977799 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fbvhh\" (UniqueName: \"kubernetes.io/projected/733ca859-ae1f-43e6-9b40-b2ff9828431a-kube-api-access-fbvhh\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:48:58.977831 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:58.977829 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/733ca859-ae1f-43e6-9b40-b2ff9828431a-util\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:48:58.977831 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:58.977839 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/733ca859-ae1f-43e6-9b40-b2ff9828431a-bundle\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:48:59.585272 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:59.585231 2564 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw" event={"ID":"733ca859-ae1f-43e6-9b40-b2ff9828431a","Type":"ContainerDied","Data":"3fb0102161e188380821566b56712b2152039e15717680e89845ea2262f93894"} Apr 21 02:48:59.585683 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:59.585279 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fb0102161e188380821566b56712b2152039e15717680e89845ea2262f93894" Apr 21 02:48:59.585683 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:59.585255 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw" Apr 21 02:48:59.587144 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:59.587115 2564 generic.go:358] "Generic (PLEG): container finished" podID="e43cc539-5833-43d0-a560-b80fc0e4fa0c" containerID="be5e5c60b6e2a6c7d825b2e4f93636eecb476175e181b8e4b8b670c82633f618" exitCode=0 Apr 21 02:48:59.587298 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:59.587146 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" event={"ID":"e43cc539-5833-43d0-a560-b80fc0e4fa0c","Type":"ContainerDied","Data":"be5e5c60b6e2a6c7d825b2e4f93636eecb476175e181b8e4b8b670c82633f618"} Apr 21 02:48:59.726472 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:59.726450 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz" Apr 21 02:48:59.884201 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:59.884124 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/050bdd0b-9b38-4465-a197-62966aab158c-util\") pod \"050bdd0b-9b38-4465-a197-62966aab158c\" (UID: \"050bdd0b-9b38-4465-a197-62966aab158c\") " Apr 21 02:48:59.884201 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:59.884173 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/050bdd0b-9b38-4465-a197-62966aab158c-bundle\") pod \"050bdd0b-9b38-4465-a197-62966aab158c\" (UID: \"050bdd0b-9b38-4465-a197-62966aab158c\") " Apr 21 02:48:59.884201 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:59.884191 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t4kl\" (UniqueName: \"kubernetes.io/projected/050bdd0b-9b38-4465-a197-62966aab158c-kube-api-access-2t4kl\") pod \"050bdd0b-9b38-4465-a197-62966aab158c\" (UID: \"050bdd0b-9b38-4465-a197-62966aab158c\") " Apr 21 02:48:59.884718 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:59.884683 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/050bdd0b-9b38-4465-a197-62966aab158c-bundle" (OuterVolumeSpecName: "bundle") pod "050bdd0b-9b38-4465-a197-62966aab158c" (UID: "050bdd0b-9b38-4465-a197-62966aab158c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:48:59.886248 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:59.886226 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/050bdd0b-9b38-4465-a197-62966aab158c-kube-api-access-2t4kl" (OuterVolumeSpecName: "kube-api-access-2t4kl") pod "050bdd0b-9b38-4465-a197-62966aab158c" (UID: "050bdd0b-9b38-4465-a197-62966aab158c"). InnerVolumeSpecName "kube-api-access-2t4kl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:48:59.889527 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:59.889491 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/050bdd0b-9b38-4465-a197-62966aab158c-util" (OuterVolumeSpecName: "util") pod "050bdd0b-9b38-4465-a197-62966aab158c" (UID: "050bdd0b-9b38-4465-a197-62966aab158c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:48:59.985355 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:59.985326 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/050bdd0b-9b38-4465-a197-62966aab158c-util\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:48:59.985355 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:59.985349 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/050bdd0b-9b38-4465-a197-62966aab158c-bundle\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:48:59.985545 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:48:59.985362 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2t4kl\" (UniqueName: \"kubernetes.io/projected/050bdd0b-9b38-4465-a197-62966aab158c-kube-api-access-2t4kl\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:49:00.593245 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:00.593212 2564 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz" Apr 21 02:49:00.593245 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:00.593212 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz" event={"ID":"050bdd0b-9b38-4465-a197-62966aab158c","Type":"ContainerDied","Data":"9eefea847a1d8bf2025ab2091fd8d84b2e58a618bf725f94d13c307440eb34db"} Apr 21 02:49:00.593796 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:00.593257 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eefea847a1d8bf2025ab2091fd8d84b2e58a618bf725f94d13c307440eb34db" Apr 21 02:49:00.719158 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:00.719136 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" Apr 21 02:49:00.892128 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:00.892061 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e43cc539-5833-43d0-a560-b80fc0e4fa0c-bundle\") pod \"e43cc539-5833-43d0-a560-b80fc0e4fa0c\" (UID: \"e43cc539-5833-43d0-a560-b80fc0e4fa0c\") " Apr 21 02:49:00.892128 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:00.892107 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d22d\" (UniqueName: \"kubernetes.io/projected/e43cc539-5833-43d0-a560-b80fc0e4fa0c-kube-api-access-2d22d\") pod \"e43cc539-5833-43d0-a560-b80fc0e4fa0c\" (UID: \"e43cc539-5833-43d0-a560-b80fc0e4fa0c\") " Apr 21 02:49:00.892128 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:00.892127 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e43cc539-5833-43d0-a560-b80fc0e4fa0c-util\") pod 
\"e43cc539-5833-43d0-a560-b80fc0e4fa0c\" (UID: \"e43cc539-5833-43d0-a560-b80fc0e4fa0c\") " Apr 21 02:49:00.892588 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:00.892565 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e43cc539-5833-43d0-a560-b80fc0e4fa0c-bundle" (OuterVolumeSpecName: "bundle") pod "e43cc539-5833-43d0-a560-b80fc0e4fa0c" (UID: "e43cc539-5833-43d0-a560-b80fc0e4fa0c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:49:00.894169 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:00.894151 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e43cc539-5833-43d0-a560-b80fc0e4fa0c-kube-api-access-2d22d" (OuterVolumeSpecName: "kube-api-access-2d22d") pod "e43cc539-5833-43d0-a560-b80fc0e4fa0c" (UID: "e43cc539-5833-43d0-a560-b80fc0e4fa0c"). InnerVolumeSpecName "kube-api-access-2d22d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:49:00.897697 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:00.897677 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e43cc539-5833-43d0-a560-b80fc0e4fa0c-util" (OuterVolumeSpecName: "util") pod "e43cc539-5833-43d0-a560-b80fc0e4fa0c" (UID: "e43cc539-5833-43d0-a560-b80fc0e4fa0c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:49:00.992704 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:00.992671 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2d22d\" (UniqueName: \"kubernetes.io/projected/e43cc539-5833-43d0-a560-b80fc0e4fa0c-kube-api-access-2d22d\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:49:00.992704 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:00.992696 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e43cc539-5833-43d0-a560-b80fc0e4fa0c-util\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:49:00.992704 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:00.992710 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e43cc539-5833-43d0-a560-b80fc0e4fa0c-bundle\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:49:01.598825 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:01.598792 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" event={"ID":"e43cc539-5833-43d0-a560-b80fc0e4fa0c","Type":"ContainerDied","Data":"3d004a5b784958eef3f1c8d53db2d531b3c95820b9978546cb40c9d07fc22458"} Apr 21 02:49:01.598825 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:01.598828 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d004a5b784958eef3f1c8d53db2d531b3c95820b9978546cb40c9d07fc22458" Apr 21 02:49:01.599228 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:01.598837 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd" Apr 21 02:49:10.943067 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943036 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-nc6mj"] Apr 21 02:49:10.943435 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943364 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="050bdd0b-9b38-4465-a197-62966aab158c" containerName="pull" Apr 21 02:49:10.943435 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943374 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="050bdd0b-9b38-4465-a197-62966aab158c" containerName="pull" Apr 21 02:49:10.943435 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943388 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e43cc539-5833-43d0-a560-b80fc0e4fa0c" containerName="util" Apr 21 02:49:10.943435 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943394 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e43cc539-5833-43d0-a560-b80fc0e4fa0c" containerName="util" Apr 21 02:49:10.943435 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943401 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b9875cd-7a08-475a-aee0-931f0e0008f4" containerName="util" Apr 21 02:49:10.943435 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943407 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9875cd-7a08-475a-aee0-931f0e0008f4" containerName="util" Apr 21 02:49:10.943435 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943413 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b9875cd-7a08-475a-aee0-931f0e0008f4" containerName="extract" Apr 21 02:49:10.943435 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943418 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9875cd-7a08-475a-aee0-931f0e0008f4" 
containerName="extract" Apr 21 02:49:10.943435 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943426 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="733ca859-ae1f-43e6-9b40-b2ff9828431a" containerName="pull" Apr 21 02:49:10.943435 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943431 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="733ca859-ae1f-43e6-9b40-b2ff9828431a" containerName="pull" Apr 21 02:49:10.943435 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943440 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="733ca859-ae1f-43e6-9b40-b2ff9828431a" containerName="util" Apr 21 02:49:10.943774 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943445 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="733ca859-ae1f-43e6-9b40-b2ff9828431a" containerName="util" Apr 21 02:49:10.943774 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943450 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b9875cd-7a08-475a-aee0-931f0e0008f4" containerName="pull" Apr 21 02:49:10.943774 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943455 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9875cd-7a08-475a-aee0-931f0e0008f4" containerName="pull" Apr 21 02:49:10.943774 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943461 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e43cc539-5833-43d0-a560-b80fc0e4fa0c" containerName="pull" Apr 21 02:49:10.943774 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943466 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e43cc539-5833-43d0-a560-b80fc0e4fa0c" containerName="pull" Apr 21 02:49:10.943774 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943474 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e43cc539-5833-43d0-a560-b80fc0e4fa0c" containerName="extract" Apr 21 02:49:10.943774 
ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943480 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e43cc539-5833-43d0-a560-b80fc0e4fa0c" containerName="extract" Apr 21 02:49:10.943774 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943487 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="050bdd0b-9b38-4465-a197-62966aab158c" containerName="util" Apr 21 02:49:10.943774 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943492 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="050bdd0b-9b38-4465-a197-62966aab158c" containerName="util" Apr 21 02:49:10.943774 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943521 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="733ca859-ae1f-43e6-9b40-b2ff9828431a" containerName="extract" Apr 21 02:49:10.943774 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943526 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="733ca859-ae1f-43e6-9b40-b2ff9828431a" containerName="extract" Apr 21 02:49:10.943774 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943531 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="050bdd0b-9b38-4465-a197-62966aab158c" containerName="extract" Apr 21 02:49:10.943774 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943536 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="050bdd0b-9b38-4465-a197-62966aab158c" containerName="extract" Apr 21 02:49:10.943774 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943591 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="050bdd0b-9b38-4465-a197-62966aab158c" containerName="extract" Apr 21 02:49:10.943774 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943601 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b9875cd-7a08-475a-aee0-931f0e0008f4" containerName="extract" Apr 21 02:49:10.943774 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943607 2564 
memory_manager.go:356] "RemoveStaleState removing state" podUID="733ca859-ae1f-43e6-9b40-b2ff9828431a" containerName="extract" Apr 21 02:49:10.943774 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.943613 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="e43cc539-5833-43d0-a560-b80fc0e4fa0c" containerName="extract" Apr 21 02:49:10.951602 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.951584 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-nc6mj" Apr 21 02:49:10.955029 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.955004 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-lkhkx\"" Apr 21 02:49:10.975282 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:10.975262 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-nc6mj"] Apr 21 02:49:11.061490 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:11.061467 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb5q8\" (UniqueName: \"kubernetes.io/projected/f1841a2f-46f4-4fa8-8a78-0d4bb4dada45-kube-api-access-cb5q8\") pod \"authorino-operator-657f44b778-nc6mj\" (UID: \"f1841a2f-46f4-4fa8-8a78-0d4bb4dada45\") " pod="kuadrant-system/authorino-operator-657f44b778-nc6mj" Apr 21 02:49:11.161882 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:11.161853 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cb5q8\" (UniqueName: \"kubernetes.io/projected/f1841a2f-46f4-4fa8-8a78-0d4bb4dada45-kube-api-access-cb5q8\") pod \"authorino-operator-657f44b778-nc6mj\" (UID: \"f1841a2f-46f4-4fa8-8a78-0d4bb4dada45\") " pod="kuadrant-system/authorino-operator-657f44b778-nc6mj" Apr 21 02:49:11.171353 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:11.171329 2564 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cb5q8\" (UniqueName: \"kubernetes.io/projected/f1841a2f-46f4-4fa8-8a78-0d4bb4dada45-kube-api-access-cb5q8\") pod \"authorino-operator-657f44b778-nc6mj\" (UID: \"f1841a2f-46f4-4fa8-8a78-0d4bb4dada45\") " pod="kuadrant-system/authorino-operator-657f44b778-nc6mj" Apr 21 02:49:11.261337 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:11.261314 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-nc6mj" Apr 21 02:49:11.383876 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:11.383853 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-nc6mj"] Apr 21 02:49:11.385902 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:49:11.385871 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1841a2f_46f4_4fa8_8a78_0d4bb4dada45.slice/crio-bc7ed0888af98ff493fc63806f83ce08ae1032279115246901231d77abf54a54 WatchSource:0}: Error finding container bc7ed0888af98ff493fc63806f83ce08ae1032279115246901231d77abf54a54: Status 404 returned error can't find the container with id bc7ed0888af98ff493fc63806f83ce08ae1032279115246901231d77abf54a54 Apr 21 02:49:11.640063 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:11.640002 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-nc6mj" event={"ID":"f1841a2f-46f4-4fa8-8a78-0d4bb4dada45","Type":"ContainerStarted","Data":"bc7ed0888af98ff493fc63806f83ce08ae1032279115246901231d77abf54a54"} Apr 21 02:49:14.652445 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:14.652408 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-nc6mj" event={"ID":"f1841a2f-46f4-4fa8-8a78-0d4bb4dada45","Type":"ContainerStarted","Data":"84c97188a5720e2d1396cc752cb74994366fcb20744f90d72b64bcd5a9391c32"} Apr 21 
02:49:14.652833 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:14.652534 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-nc6mj" Apr 21 02:49:14.671337 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:14.671287 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-nc6mj" podStartSLOduration=2.448972992 podStartE2EDuration="4.671271971s" podCreationTimestamp="2026-04-21 02:49:10 +0000 UTC" firstStartedPulling="2026-04-21 02:49:11.387878697 +0000 UTC m=+484.155367378" lastFinishedPulling="2026-04-21 02:49:13.61017769 +0000 UTC m=+486.377666357" observedRunningTime="2026-04-21 02:49:14.66981568 +0000 UTC m=+487.437304380" watchObservedRunningTime="2026-04-21 02:49:14.671271971 +0000 UTC m=+487.438760661" Apr 21 02:49:22.552074 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:22.552040 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl"] Apr 21 02:49:22.555709 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:22.555687 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" Apr 21 02:49:22.558238 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:22.558217 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-tkql6\"" Apr 21 02:49:22.567245 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:22.567226 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl"] Apr 21 02:49:22.661562 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:22.661537 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blclp\" (UniqueName: \"kubernetes.io/projected/a19724f9-fdf9-438a-9cf3-2e20e54c3b37-kube-api-access-blclp\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-whftl\" (UID: \"a19724f9-fdf9-438a-9cf3-2e20e54c3b37\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" Apr 21 02:49:22.661663 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:22.661609 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a19724f9-fdf9-438a-9cf3-2e20e54c3b37-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-whftl\" (UID: \"a19724f9-fdf9-438a-9cf3-2e20e54c3b37\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" Apr 21 02:49:22.762201 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:22.762175 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a19724f9-fdf9-438a-9cf3-2e20e54c3b37-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-whftl\" (UID: \"a19724f9-fdf9-438a-9cf3-2e20e54c3b37\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" Apr 21 02:49:22.762300 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:22.762222 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blclp\" (UniqueName: \"kubernetes.io/projected/a19724f9-fdf9-438a-9cf3-2e20e54c3b37-kube-api-access-blclp\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-whftl\" (UID: \"a19724f9-fdf9-438a-9cf3-2e20e54c3b37\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" Apr 21 02:49:22.762557 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:22.762538 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a19724f9-fdf9-438a-9cf3-2e20e54c3b37-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-whftl\" (UID: \"a19724f9-fdf9-438a-9cf3-2e20e54c3b37\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" Apr 21 02:49:22.771177 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:22.771157 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blclp\" (UniqueName: \"kubernetes.io/projected/a19724f9-fdf9-438a-9cf3-2e20e54c3b37-kube-api-access-blclp\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-whftl\" (UID: \"a19724f9-fdf9-438a-9cf3-2e20e54c3b37\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" Apr 21 02:49:22.865775 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:22.865720 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" Apr 21 02:49:22.990119 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:22.990093 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl"] Apr 21 02:49:22.991938 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:49:22.991912 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda19724f9_fdf9_438a_9cf3_2e20e54c3b37.slice/crio-7c4d23d062ee5c2d3326d1c122dcf86c5b85d0349e445ec8199eeef158d5c0b3 WatchSource:0}: Error finding container 7c4d23d062ee5c2d3326d1c122dcf86c5b85d0349e445ec8199eeef158d5c0b3: Status 404 returned error can't find the container with id 7c4d23d062ee5c2d3326d1c122dcf86c5b85d0349e445ec8199eeef158d5c0b3 Apr 21 02:49:23.687800 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:23.687762 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" event={"ID":"a19724f9-fdf9-438a-9cf3-2e20e54c3b37","Type":"ContainerStarted","Data":"7c4d23d062ee5c2d3326d1c122dcf86c5b85d0349e445ec8199eeef158d5c0b3"} Apr 21 02:49:25.659137 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:25.659106 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-nc6mj" Apr 21 02:49:28.711150 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:28.711116 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" event={"ID":"a19724f9-fdf9-438a-9cf3-2e20e54c3b37","Type":"ContainerStarted","Data":"bd21a811358c8284bc782162091eaf2b393917c5a4188d891b7cb504f60d61b4"} Apr 21 02:49:28.711525 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:28.711240 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" Apr 21 02:49:28.734859 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:28.734817 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" podStartSLOduration=1.6850770069999998 podStartE2EDuration="6.734802475s" podCreationTimestamp="2026-04-21 02:49:22 +0000 UTC" firstStartedPulling="2026-04-21 02:49:22.995009172 +0000 UTC m=+495.762497839" lastFinishedPulling="2026-04-21 02:49:28.044734626 +0000 UTC m=+500.812223307" observedRunningTime="2026-04-21 02:49:28.734424426 +0000 UTC m=+501.501913124" watchObservedRunningTime="2026-04-21 02:49:28.734802475 +0000 UTC m=+501.502291163" Apr 21 02:49:31.393878 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.393840 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5bcfbcf5f6-p6r48"] Apr 21 02:49:31.398031 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.398002 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.408530 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.408487 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bcfbcf5f6-p6r48"] Apr 21 02:49:31.439925 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.439902 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb89eed7-05bb-4217-a98b-d918f675868f-console-config\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.440046 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.439941 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb89eed7-05bb-4217-a98b-d918f675868f-trusted-ca-bundle\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.440046 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.439983 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb89eed7-05bb-4217-a98b-d918f675868f-console-oauth-config\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.440046 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.440006 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb89eed7-05bb-4217-a98b-d918f675868f-oauth-serving-cert\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 
02:49:31.440046 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.440022 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2h9t\" (UniqueName: \"kubernetes.io/projected/eb89eed7-05bb-4217-a98b-d918f675868f-kube-api-access-t2h9t\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.440197 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.440044 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb89eed7-05bb-4217-a98b-d918f675868f-service-ca\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.440197 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.440145 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb89eed7-05bb-4217-a98b-d918f675868f-console-serving-cert\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.540645 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.540616 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb89eed7-05bb-4217-a98b-d918f675868f-console-config\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.540762 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.540657 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb89eed7-05bb-4217-a98b-d918f675868f-trusted-ca-bundle\") pod 
\"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.540762 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.540681 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb89eed7-05bb-4217-a98b-d918f675868f-console-oauth-config\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.540762 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.540704 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb89eed7-05bb-4217-a98b-d918f675868f-oauth-serving-cert\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.540926 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.540829 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2h9t\" (UniqueName: \"kubernetes.io/projected/eb89eed7-05bb-4217-a98b-d918f675868f-kube-api-access-t2h9t\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.540926 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.540876 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb89eed7-05bb-4217-a98b-d918f675868f-service-ca\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.541027 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.540967 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/eb89eed7-05bb-4217-a98b-d918f675868f-console-serving-cert\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.541463 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.541433 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb89eed7-05bb-4217-a98b-d918f675868f-oauth-serving-cert\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.541595 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.541512 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb89eed7-05bb-4217-a98b-d918f675868f-console-config\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.541595 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.541572 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb89eed7-05bb-4217-a98b-d918f675868f-trusted-ca-bundle\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.541687 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.541625 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb89eed7-05bb-4217-a98b-d918f675868f-service-ca\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.543638 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.543619 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb89eed7-05bb-4217-a98b-d918f675868f-console-serving-cert\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.543727 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.543649 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb89eed7-05bb-4217-a98b-d918f675868f-console-oauth-config\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.557835 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.557814 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2h9t\" (UniqueName: \"kubernetes.io/projected/eb89eed7-05bb-4217-a98b-d918f675868f-kube-api-access-t2h9t\") pod \"console-5bcfbcf5f6-p6r48\" (UID: \"eb89eed7-05bb-4217-a98b-d918f675868f\") " pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.708112 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.708058 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:31.837160 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:31.837137 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bcfbcf5f6-p6r48"] Apr 21 02:49:31.838786 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:49:31.838760 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb89eed7_05bb_4217_a98b_d918f675868f.slice/crio-b18976ec8acbe57b3dbe28e530116aa2abfc7a16a175aac48c10e63de1148fde WatchSource:0}: Error finding container b18976ec8acbe57b3dbe28e530116aa2abfc7a16a175aac48c10e63de1148fde: Status 404 returned error can't find the container with id b18976ec8acbe57b3dbe28e530116aa2abfc7a16a175aac48c10e63de1148fde Apr 21 02:49:32.726716 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:32.726682 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bcfbcf5f6-p6r48" event={"ID":"eb89eed7-05bb-4217-a98b-d918f675868f","Type":"ContainerStarted","Data":"69304a3fd31c40f704c2bf417183083709e7543b7ca013b96b26d9b6e27fa87a"} Apr 21 02:49:32.726716 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:32.726714 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bcfbcf5f6-p6r48" event={"ID":"eb89eed7-05bb-4217-a98b-d918f675868f","Type":"ContainerStarted","Data":"b18976ec8acbe57b3dbe28e530116aa2abfc7a16a175aac48c10e63de1148fde"} Apr 21 02:49:32.745250 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:32.745209 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5bcfbcf5f6-p6r48" podStartSLOduration=1.745195118 podStartE2EDuration="1.745195118s" podCreationTimestamp="2026-04-21 02:49:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:49:32.744034855 +0000 UTC 
m=+505.511523545" watchObservedRunningTime="2026-04-21 02:49:32.745195118 +0000 UTC m=+505.512683808" Apr 21 02:49:39.716877 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:39.716842 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" Apr 21 02:49:41.310470 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.310434 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl"] Apr 21 02:49:41.310996 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.310723 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" podUID="a19724f9-fdf9-438a-9cf3-2e20e54c3b37" containerName="manager" containerID="cri-o://bd21a811358c8284bc782162091eaf2b393917c5a4188d891b7cb504f60d61b4" gracePeriod=2 Apr 21 02:49:41.312921 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.312882 2564 status_manager.go:895] "Failed to get status for pod" podUID="a19724f9-fdf9-438a-9cf3-2e20e54c3b37" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-whftl\" is forbidden: User \"system:node:ip-10-0-131-170.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-170.ec2.internal' and this object" Apr 21 02:49:41.313927 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.313901 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl"] Apr 21 02:49:41.337940 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.337917 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7"] Apr 21 02:49:41.338334 ip-10-0-131-170 kubenswrapper[2564]: 
I0421 02:49:41.338317 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a19724f9-fdf9-438a-9cf3-2e20e54c3b37" containerName="manager" Apr 21 02:49:41.338483 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.338336 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19724f9-fdf9-438a-9cf3-2e20e54c3b37" containerName="manager" Apr 21 02:49:41.338483 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.338429 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="a19724f9-fdf9-438a-9cf3-2e20e54c3b37" containerName="manager" Apr 21 02:49:41.341551 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.341532 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7" Apr 21 02:49:41.357917 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.357889 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7"] Apr 21 02:49:41.364302 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.364279 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2prqc"] Apr 21 02:49:41.367521 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.367507 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2prqc" Apr 21 02:49:41.370665 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.370644 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-sqn25\"" Apr 21 02:49:41.375598 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.375573 2564 status_manager.go:895] "Failed to get status for pod" podUID="a19724f9-fdf9-438a-9cf3-2e20e54c3b37" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-whftl\" is forbidden: User \"system:node:ip-10-0-131-170.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-170.ec2.internal' and this object" Apr 21 02:49:41.378233 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.378117 2564 status_manager.go:895] "Failed to get status for pod" podUID="a19724f9-fdf9-438a-9cf3-2e20e54c3b37" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-whftl\" is forbidden: User \"system:node:ip-10-0-131-170.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-170.ec2.internal' and this object" Apr 21 02:49:41.381990 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.381968 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2prqc"] Apr 21 02:49:41.417130 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.417106 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6ad27acb-1236-4d1f-b739-c7e36333d941-extensions-socket-volume\") 
pod \"kuadrant-operator-controller-manager-5f895dd7d5-xk4h7\" (UID: \"6ad27acb-1236-4d1f-b739-c7e36333d941\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7" Apr 21 02:49:41.417244 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.417141 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdldl\" (UniqueName: \"kubernetes.io/projected/6ad27acb-1236-4d1f-b739-c7e36333d941-kube-api-access-kdldl\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-xk4h7\" (UID: \"6ad27acb-1236-4d1f-b739-c7e36333d941\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7" Apr 21 02:49:41.417244 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.417178 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdds6\" (UniqueName: \"kubernetes.io/projected/02dd9222-a726-441e-a972-b99b04438f38-kube-api-access-zdds6\") pod \"limitador-operator-controller-manager-85c4996f8c-2prqc\" (UID: \"02dd9222-a726-441e-a972-b99b04438f38\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2prqc" Apr 21 02:49:41.518206 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.518168 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6ad27acb-1236-4d1f-b739-c7e36333d941-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-xk4h7\" (UID: \"6ad27acb-1236-4d1f-b739-c7e36333d941\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7" Apr 21 02:49:41.518352 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.518214 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdldl\" (UniqueName: \"kubernetes.io/projected/6ad27acb-1236-4d1f-b739-c7e36333d941-kube-api-access-kdldl\") pod 
\"kuadrant-operator-controller-manager-5f895dd7d5-xk4h7\" (UID: \"6ad27acb-1236-4d1f-b739-c7e36333d941\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7" Apr 21 02:49:41.518352 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.518256 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdds6\" (UniqueName: \"kubernetes.io/projected/02dd9222-a726-441e-a972-b99b04438f38-kube-api-access-zdds6\") pod \"limitador-operator-controller-manager-85c4996f8c-2prqc\" (UID: \"02dd9222-a726-441e-a972-b99b04438f38\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2prqc" Apr 21 02:49:41.518634 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.518609 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6ad27acb-1236-4d1f-b739-c7e36333d941-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-xk4h7\" (UID: \"6ad27acb-1236-4d1f-b739-c7e36333d941\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7" Apr 21 02:49:41.526388 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.526365 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdds6\" (UniqueName: \"kubernetes.io/projected/02dd9222-a726-441e-a972-b99b04438f38-kube-api-access-zdds6\") pod \"limitador-operator-controller-manager-85c4996f8c-2prqc\" (UID: \"02dd9222-a726-441e-a972-b99b04438f38\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2prqc" Apr 21 02:49:41.527034 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.527005 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdldl\" (UniqueName: \"kubernetes.io/projected/6ad27acb-1236-4d1f-b739-c7e36333d941-kube-api-access-kdldl\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-xk4h7\" (UID: 
\"6ad27acb-1236-4d1f-b739-c7e36333d941\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7" Apr 21 02:49:41.547269 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.547251 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" Apr 21 02:49:41.549323 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.549300 2564 status_manager.go:895] "Failed to get status for pod" podUID="a19724f9-fdf9-438a-9cf3-2e20e54c3b37" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-whftl\" is forbidden: User \"system:node:ip-10-0-131-170.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-170.ec2.internal' and this object" Apr 21 02:49:41.619553 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.619491 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blclp\" (UniqueName: \"kubernetes.io/projected/a19724f9-fdf9-438a-9cf3-2e20e54c3b37-kube-api-access-blclp\") pod \"a19724f9-fdf9-438a-9cf3-2e20e54c3b37\" (UID: \"a19724f9-fdf9-438a-9cf3-2e20e54c3b37\") " Apr 21 02:49:41.619632 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.619553 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a19724f9-fdf9-438a-9cf3-2e20e54c3b37-extensions-socket-volume\") pod \"a19724f9-fdf9-438a-9cf3-2e20e54c3b37\" (UID: \"a19724f9-fdf9-438a-9cf3-2e20e54c3b37\") " Apr 21 02:49:41.620028 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.620009 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a19724f9-fdf9-438a-9cf3-2e20e54c3b37-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") 
pod "a19724f9-fdf9-438a-9cf3-2e20e54c3b37" (UID: "a19724f9-fdf9-438a-9cf3-2e20e54c3b37"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:49:41.621369 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.621350 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a19724f9-fdf9-438a-9cf3-2e20e54c3b37-kube-api-access-blclp" (OuterVolumeSpecName: "kube-api-access-blclp") pod "a19724f9-fdf9-438a-9cf3-2e20e54c3b37" (UID: "a19724f9-fdf9-438a-9cf3-2e20e54c3b37"). InnerVolumeSpecName "kube-api-access-blclp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:49:41.688344 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.688321 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7" Apr 21 02:49:41.696974 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.696948 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2prqc" Apr 21 02:49:41.708644 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.708624 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:41.708750 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.708663 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:41.713867 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.713845 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:41.720149 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.720130 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-blclp\" (UniqueName: \"kubernetes.io/projected/a19724f9-fdf9-438a-9cf3-2e20e54c3b37-kube-api-access-blclp\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:49:41.720149 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.720150 2564 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a19724f9-fdf9-438a-9cf3-2e20e54c3b37-extensions-socket-volume\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:49:41.752777 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.752736 2564 status_manager.go:895] "Failed to get status for pod" podUID="a19724f9-fdf9-438a-9cf3-2e20e54c3b37" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-whftl\" is forbidden: User \"system:node:ip-10-0-131-170.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-170.ec2.internal' and this object" Apr 21 02:49:41.770102 ip-10-0-131-170 
kubenswrapper[2564]: I0421 02:49:41.770070 2564 generic.go:358] "Generic (PLEG): container finished" podID="a19724f9-fdf9-438a-9cf3-2e20e54c3b37" containerID="bd21a811358c8284bc782162091eaf2b393917c5a4188d891b7cb504f60d61b4" exitCode=0 Apr 21 02:49:41.770221 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.770113 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" Apr 21 02:49:41.770221 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.770176 2564 scope.go:117] "RemoveContainer" containerID="bd21a811358c8284bc782162091eaf2b393917c5a4188d891b7cb504f60d61b4" Apr 21 02:49:41.772826 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.772778 2564 status_manager.go:895] "Failed to get status for pod" podUID="a19724f9-fdf9-438a-9cf3-2e20e54c3b37" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-whftl\" is forbidden: User \"system:node:ip-10-0-131-170.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-170.ec2.internal' and this object" Apr 21 02:49:41.776354 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.776332 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5bcfbcf5f6-p6r48" Apr 21 02:49:41.778924 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.778898 2564 status_manager.go:895] "Failed to get status for pod" podUID="a19724f9-fdf9-438a-9cf3-2e20e54c3b37" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-whftl\" is forbidden: User \"system:node:ip-10-0-131-170.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-170.ec2.internal' and this 
object"
Apr 21 02:49:41.782307 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.782289 2564 scope.go:117] "RemoveContainer" containerID="bd21a811358c8284bc782162091eaf2b393917c5a4188d891b7cb504f60d61b4"
Apr 21 02:49:41.782584 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:49:41.782565 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd21a811358c8284bc782162091eaf2b393917c5a4188d891b7cb504f60d61b4\": container with ID starting with bd21a811358c8284bc782162091eaf2b393917c5a4188d891b7cb504f60d61b4 not found: ID does not exist" containerID="bd21a811358c8284bc782162091eaf2b393917c5a4188d891b7cb504f60d61b4"
Apr 21 02:49:41.782656 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.782591 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd21a811358c8284bc782162091eaf2b393917c5a4188d891b7cb504f60d61b4"} err="failed to get container status \"bd21a811358c8284bc782162091eaf2b393917c5a4188d891b7cb504f60d61b4\": rpc error: code = NotFound desc = could not find container \"bd21a811358c8284bc782162091eaf2b393917c5a4188d891b7cb504f60d61b4\": container with ID starting with bd21a811358c8284bc782162091eaf2b393917c5a4188d891b7cb504f60d61b4 not found: ID does not exist"
Apr 21 02:49:41.815990 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.815960 2564 status_manager.go:895] "Failed to get status for pod" podUID="a19724f9-fdf9-438a-9cf3-2e20e54c3b37" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whftl" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-whftl\" is forbidden: User \"system:node:ip-10-0-131-170.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-170.ec2.internal' and this object"
Apr 21 02:49:41.836368 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.836321 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7"]
Apr 21 02:49:41.840027 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:49:41.839985 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ad27acb_1236_4d1f_b739_c7e36333d941.slice/crio-81422b27fc5d643ba3b1ca766c0d800e4b953ce9a529c3e8ce2ee44877c5b167 WatchSource:0}: Error finding container 81422b27fc5d643ba3b1ca766c0d800e4b953ce9a529c3e8ce2ee44877c5b167: Status 404 returned error can't find the container with id 81422b27fc5d643ba3b1ca766c0d800e4b953ce9a529c3e8ce2ee44877c5b167
Apr 21 02:49:41.845960 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.845933 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a19724f9-fdf9-438a-9cf3-2e20e54c3b37" path="/var/lib/kubelet/pods/a19724f9-fdf9-438a-9cf3-2e20e54c3b37/volumes"
Apr 21 02:49:41.860658 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.860631 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68954b648f-pdbw4"]
Apr 21 02:49:41.866227 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:41.866203 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2prqc"]
Apr 21 02:49:41.869372 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:49:41.869333 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02dd9222_a726_441e_a972_b99b04438f38.slice/crio-5a374538ad07ae45c1e9c61199c160b546276b1df82597d64d631e25f4380e94 WatchSource:0}: Error finding container 5a374538ad07ae45c1e9c61199c160b546276b1df82597d64d631e25f4380e94: Status 404 returned error can't find the container with id 5a374538ad07ae45c1e9c61199c160b546276b1df82597d64d631e25f4380e94
Apr 21 02:49:42.775896 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:42.775863 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7" event={"ID":"6ad27acb-1236-4d1f-b739-c7e36333d941","Type":"ContainerStarted","Data":"e24ca11afaf1c876e21f9d53c52f2150932acecb3ca7d2d1ce7ba9039aff52b5"}
Apr 21 02:49:42.775896 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:42.775899 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7" event={"ID":"6ad27acb-1236-4d1f-b739-c7e36333d941","Type":"ContainerStarted","Data":"81422b27fc5d643ba3b1ca766c0d800e4b953ce9a529c3e8ce2ee44877c5b167"}
Apr 21 02:49:42.776383 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:42.775980 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7"
Apr 21 02:49:42.778228 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:42.778190 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2prqc" event={"ID":"02dd9222-a726-441e-a972-b99b04438f38","Type":"ContainerStarted","Data":"5a374538ad07ae45c1e9c61199c160b546276b1df82597d64d631e25f4380e94"}
Apr 21 02:49:42.799198 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:42.799148 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7" podStartSLOduration=1.799131852 podStartE2EDuration="1.799131852s" podCreationTimestamp="2026-04-21 02:49:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:49:42.794790786 +0000 UTC m=+515.562279482" watchObservedRunningTime="2026-04-21 02:49:42.799131852 +0000 UTC m=+515.566620545"
Apr 21 02:49:43.783581 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:43.783543 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2prqc" event={"ID":"02dd9222-a726-441e-a972-b99b04438f38","Type":"ContainerStarted","Data":"7355f20538c91e504bdb1fd11dc3bc9a6728bbaa5c9b5700d7541b46a333b147"}
Apr 21 02:49:43.783968 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:43.783608 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2prqc"
Apr 21 02:49:43.801888 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:43.801845 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2prqc" podStartSLOduration=1.626863309 podStartE2EDuration="2.801831674s" podCreationTimestamp="2026-04-21 02:49:41 +0000 UTC" firstStartedPulling="2026-04-21 02:49:41.871863893 +0000 UTC m=+514.639352567" lastFinishedPulling="2026-04-21 02:49:43.046832257 +0000 UTC m=+515.814320932" observedRunningTime="2026-04-21 02:49:43.800861935 +0000 UTC m=+516.568350625" watchObservedRunningTime="2026-04-21 02:49:43.801831674 +0000 UTC m=+516.569320363"
Apr 21 02:49:53.786174 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:53.786096 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7"
Apr 21 02:49:54.789564 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:54.789535 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2prqc"
Apr 21 02:49:56.559973 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.559939 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7"]
Apr 21 02:49:56.560424 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.560214 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7" podUID="6ad27acb-1236-4d1f-b739-c7e36333d941" containerName="manager" containerID="cri-o://e24ca11afaf1c876e21f9d53c52f2150932acecb3ca7d2d1ce7ba9039aff52b5" gracePeriod=10
Apr 21 02:49:56.801874 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.801849 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pj599"]
Apr 21 02:49:56.805414 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.805399 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pj599"
Apr 21 02:49:56.811878 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.811833 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7"
Apr 21 02:49:56.824957 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.824938 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pj599"]
Apr 21 02:49:56.838170 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.837973 2564 generic.go:358] "Generic (PLEG): container finished" podID="6ad27acb-1236-4d1f-b739-c7e36333d941" containerID="e24ca11afaf1c876e21f9d53c52f2150932acecb3ca7d2d1ce7ba9039aff52b5" exitCode=0
Apr 21 02:49:56.838170 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.838054 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7" event={"ID":"6ad27acb-1236-4d1f-b739-c7e36333d941","Type":"ContainerDied","Data":"e24ca11afaf1c876e21f9d53c52f2150932acecb3ca7d2d1ce7ba9039aff52b5"}
Apr 21 02:49:56.838170 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.838123 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7" event={"ID":"6ad27acb-1236-4d1f-b739-c7e36333d941","Type":"ContainerDied","Data":"81422b27fc5d643ba3b1ca766c0d800e4b953ce9a529c3e8ce2ee44877c5b167"}
Apr 21 02:49:56.838170 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.838148 2564 scope.go:117] "RemoveContainer" containerID="e24ca11afaf1c876e21f9d53c52f2150932acecb3ca7d2d1ce7ba9039aff52b5"
Apr 21 02:49:56.838424 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.838382 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7"
Apr 21 02:49:56.847263 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.847244 2564 scope.go:117] "RemoveContainer" containerID="e24ca11afaf1c876e21f9d53c52f2150932acecb3ca7d2d1ce7ba9039aff52b5"
Apr 21 02:49:56.847521 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:49:56.847490 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e24ca11afaf1c876e21f9d53c52f2150932acecb3ca7d2d1ce7ba9039aff52b5\": container with ID starting with e24ca11afaf1c876e21f9d53c52f2150932acecb3ca7d2d1ce7ba9039aff52b5 not found: ID does not exist" containerID="e24ca11afaf1c876e21f9d53c52f2150932acecb3ca7d2d1ce7ba9039aff52b5"
Apr 21 02:49:56.847583 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.847530 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e24ca11afaf1c876e21f9d53c52f2150932acecb3ca7d2d1ce7ba9039aff52b5"} err="failed to get container status \"e24ca11afaf1c876e21f9d53c52f2150932acecb3ca7d2d1ce7ba9039aff52b5\": rpc error: code = NotFound desc = could not find container \"e24ca11afaf1c876e21f9d53c52f2150932acecb3ca7d2d1ce7ba9039aff52b5\": container with ID starting with e24ca11afaf1c876e21f9d53c52f2150932acecb3ca7d2d1ce7ba9039aff52b5 not found: ID does not exist"
Apr 21 02:49:56.950766 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.950745 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdldl\" (UniqueName: \"kubernetes.io/projected/6ad27acb-1236-4d1f-b739-c7e36333d941-kube-api-access-kdldl\") pod \"6ad27acb-1236-4d1f-b739-c7e36333d941\" (UID: \"6ad27acb-1236-4d1f-b739-c7e36333d941\") "
Apr 21 02:49:56.950875 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.950780 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6ad27acb-1236-4d1f-b739-c7e36333d941-extensions-socket-volume\") pod \"6ad27acb-1236-4d1f-b739-c7e36333d941\" (UID: \"6ad27acb-1236-4d1f-b739-c7e36333d941\") "
Apr 21 02:49:56.950940 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.950924 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b00efc7b-4cd0-48ea-933b-e8bd91ebef92-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-pj599\" (UID: \"b00efc7b-4cd0-48ea-933b-e8bd91ebef92\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pj599"
Apr 21 02:49:56.950982 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.950948 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsr8s\" (UniqueName: \"kubernetes.io/projected/b00efc7b-4cd0-48ea-933b-e8bd91ebef92-kube-api-access-zsr8s\") pod \"kuadrant-operator-controller-manager-55c7f4c975-pj599\" (UID: \"b00efc7b-4cd0-48ea-933b-e8bd91ebef92\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pj599"
Apr 21 02:49:56.951203 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.951178 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ad27acb-1236-4d1f-b739-c7e36333d941-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "6ad27acb-1236-4d1f-b739-c7e36333d941" (UID: "6ad27acb-1236-4d1f-b739-c7e36333d941"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 02:49:56.952808 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:56.952785 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ad27acb-1236-4d1f-b739-c7e36333d941-kube-api-access-kdldl" (OuterVolumeSpecName: "kube-api-access-kdldl") pod "6ad27acb-1236-4d1f-b739-c7e36333d941" (UID: "6ad27acb-1236-4d1f-b739-c7e36333d941"). InnerVolumeSpecName "kube-api-access-kdldl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 02:49:57.051876 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:57.051852 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b00efc7b-4cd0-48ea-933b-e8bd91ebef92-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-pj599\" (UID: \"b00efc7b-4cd0-48ea-933b-e8bd91ebef92\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pj599"
Apr 21 02:49:57.051973 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:57.051882 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsr8s\" (UniqueName: \"kubernetes.io/projected/b00efc7b-4cd0-48ea-933b-e8bd91ebef92-kube-api-access-zsr8s\") pod \"kuadrant-operator-controller-manager-55c7f4c975-pj599\" (UID: \"b00efc7b-4cd0-48ea-933b-e8bd91ebef92\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pj599"
Apr 21 02:49:57.052015 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:57.051973 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kdldl\" (UniqueName: \"kubernetes.io/projected/6ad27acb-1236-4d1f-b739-c7e36333d941-kube-api-access-kdldl\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:49:57.052015 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:57.051987 2564 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6ad27acb-1236-4d1f-b739-c7e36333d941-extensions-socket-volume\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:49:57.052198 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:57.052180 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b00efc7b-4cd0-48ea-933b-e8bd91ebef92-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-pj599\" (UID: \"b00efc7b-4cd0-48ea-933b-e8bd91ebef92\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pj599"
Apr 21 02:49:57.060031 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:57.060011 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsr8s\" (UniqueName: \"kubernetes.io/projected/b00efc7b-4cd0-48ea-933b-e8bd91ebef92-kube-api-access-zsr8s\") pod \"kuadrant-operator-controller-manager-55c7f4c975-pj599\" (UID: \"b00efc7b-4cd0-48ea-933b-e8bd91ebef92\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pj599"
Apr 21 02:49:57.120163 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:57.120108 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pj599"
Apr 21 02:49:57.170437 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:57.170409 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7"]
Apr 21 02:49:57.175795 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:57.175742 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-xk4h7"]
Apr 21 02:49:57.247367 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:57.247345 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pj599"]
Apr 21 02:49:57.249019 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:49:57.248990 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb00efc7b_4cd0_48ea_933b_e8bd91ebef92.slice/crio-a77e5b0527f396b89a5f48f6c0152149b38c29ac55401322a9862795bec4ac01 WatchSource:0}: Error finding container a77e5b0527f396b89a5f48f6c0152149b38c29ac55401322a9862795bec4ac01: Status 404 returned error can't find the container with id a77e5b0527f396b89a5f48f6c0152149b38c29ac55401322a9862795bec4ac01
Apr 21 02:49:57.852440 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:57.852409 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ad27acb-1236-4d1f-b739-c7e36333d941" path="/var/lib/kubelet/pods/6ad27acb-1236-4d1f-b739-c7e36333d941/volumes"
Apr 21 02:49:57.852832 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:57.852695 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pj599"
Apr 21 02:49:57.852832 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:57.852711 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pj599" event={"ID":"b00efc7b-4cd0-48ea-933b-e8bd91ebef92","Type":"ContainerStarted","Data":"207a1b90e94eefd451b3caac67af69c6f32326b35e50d916f1a7c54f95c922e7"}
Apr 21 02:49:57.852832 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:57.852728 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pj599" event={"ID":"b00efc7b-4cd0-48ea-933b-e8bd91ebef92","Type":"ContainerStarted","Data":"a77e5b0527f396b89a5f48f6c0152149b38c29ac55401322a9862795bec4ac01"}
Apr 21 02:49:57.885787 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:49:57.885742 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pj599" podStartSLOduration=1.885730402 podStartE2EDuration="1.885730402s" podCreationTimestamp="2026-04-21 02:49:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:49:57.884083917 +0000 UTC m=+530.651572607" watchObservedRunningTime="2026-04-21 02:49:57.885730402 +0000 UTC m=+530.653219090"
Apr 21 02:50:06.885221 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:06.885183 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68954b648f-pdbw4" podUID="464ddfa0-b507-4644-b148-9f35f5f15c98" containerName="console" containerID="cri-o://8838e11312882952e4361906dc50d4b41c17ca01c39e74a2d5947c188c5d4e2c" gracePeriod=15
Apr 21 02:50:07.121232 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.121208 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68954b648f-pdbw4_464ddfa0-b507-4644-b148-9f35f5f15c98/console/0.log"
Apr 21 02:50:07.121335 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.121267 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68954b648f-pdbw4"
Apr 21 02:50:07.238352 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.238280 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhdm9\" (UniqueName: \"kubernetes.io/projected/464ddfa0-b507-4644-b148-9f35f5f15c98-kube-api-access-hhdm9\") pod \"464ddfa0-b507-4644-b148-9f35f5f15c98\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") "
Apr 21 02:50:07.238352 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.238341 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-oauth-serving-cert\") pod \"464ddfa0-b507-4644-b148-9f35f5f15c98\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") "
Apr 21 02:50:07.238584 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.238359 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-console-config\") pod \"464ddfa0-b507-4644-b148-9f35f5f15c98\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") "
Apr 21 02:50:07.238584 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.238397 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-trusted-ca-bundle\") pod \"464ddfa0-b507-4644-b148-9f35f5f15c98\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") "
Apr 21 02:50:07.238584 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.238431 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/464ddfa0-b507-4644-b148-9f35f5f15c98-console-serving-cert\") pod \"464ddfa0-b507-4644-b148-9f35f5f15c98\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") "
Apr 21 02:50:07.238584 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.238455 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-service-ca\") pod \"464ddfa0-b507-4644-b148-9f35f5f15c98\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") "
Apr 21 02:50:07.238584 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.238472 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/464ddfa0-b507-4644-b148-9f35f5f15c98-console-oauth-config\") pod \"464ddfa0-b507-4644-b148-9f35f5f15c98\" (UID: \"464ddfa0-b507-4644-b148-9f35f5f15c98\") "
Apr 21 02:50:07.238874 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.238830 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-console-config" (OuterVolumeSpecName: "console-config") pod "464ddfa0-b507-4644-b148-9f35f5f15c98" (UID: "464ddfa0-b507-4644-b148-9f35f5f15c98"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 02:50:07.238874 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.238841 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "464ddfa0-b507-4644-b148-9f35f5f15c98" (UID: "464ddfa0-b507-4644-b148-9f35f5f15c98"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 02:50:07.238874 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.238861 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "464ddfa0-b507-4644-b148-9f35f5f15c98" (UID: "464ddfa0-b507-4644-b148-9f35f5f15c98"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 02:50:07.239094 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.238970 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-service-ca" (OuterVolumeSpecName: "service-ca") pod "464ddfa0-b507-4644-b148-9f35f5f15c98" (UID: "464ddfa0-b507-4644-b148-9f35f5f15c98"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 02:50:07.240645 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.240604 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/464ddfa0-b507-4644-b148-9f35f5f15c98-kube-api-access-hhdm9" (OuterVolumeSpecName: "kube-api-access-hhdm9") pod "464ddfa0-b507-4644-b148-9f35f5f15c98" (UID: "464ddfa0-b507-4644-b148-9f35f5f15c98"). InnerVolumeSpecName "kube-api-access-hhdm9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 02:50:07.240645 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.240612 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/464ddfa0-b507-4644-b148-9f35f5f15c98-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "464ddfa0-b507-4644-b148-9f35f5f15c98" (UID: "464ddfa0-b507-4644-b148-9f35f5f15c98"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 02:50:07.240766 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.240668 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/464ddfa0-b507-4644-b148-9f35f5f15c98-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "464ddfa0-b507-4644-b148-9f35f5f15c98" (UID: "464ddfa0-b507-4644-b148-9f35f5f15c98"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 02:50:07.339195 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.339171 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hhdm9\" (UniqueName: \"kubernetes.io/projected/464ddfa0-b507-4644-b148-9f35f5f15c98-kube-api-access-hhdm9\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:50:07.339195 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.339194 2564 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-oauth-serving-cert\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:50:07.339320 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.339204 2564 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-console-config\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:50:07.339320 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.339213 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-trusted-ca-bundle\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:50:07.339320 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.339223 2564 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/464ddfa0-b507-4644-b148-9f35f5f15c98-console-serving-cert\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:50:07.339320 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.339233 2564 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/464ddfa0-b507-4644-b148-9f35f5f15c98-service-ca\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:50:07.339320 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.339241 2564 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/464ddfa0-b507-4644-b148-9f35f5f15c98-console-oauth-config\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\""
Apr 21 02:50:07.887534 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.887490 2564 generic.go:358] "Generic (PLEG): container finished" podID="464ddfa0-b507-4644-b148-9f35f5f15c98" containerID="8838e11312882952e4361906dc50d4b41c17ca01c39e74a2d5947c188c5d4e2c" exitCode=2
Apr 21 02:50:07.887905 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.887529 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68954b648f-pdbw4" event={"ID":"464ddfa0-b507-4644-b148-9f35f5f15c98","Type":"ContainerDied","Data":"8838e11312882952e4361906dc50d4b41c17ca01c39e74a2d5947c188c5d4e2c"}
Apr 21 02:50:07.887905 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.887563 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68954b648f-pdbw4"
Apr 21 02:50:07.887905 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.887573 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68954b648f-pdbw4" event={"ID":"464ddfa0-b507-4644-b148-9f35f5f15c98","Type":"ContainerDied","Data":"563840ff3514d232a3a2fdbd250c1e53ae1d70d596255e9f18f9a1f36ff045e1"}
Apr 21 02:50:07.887905 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.887596 2564 scope.go:117] "RemoveContainer" containerID="8838e11312882952e4361906dc50d4b41c17ca01c39e74a2d5947c188c5d4e2c"
Apr 21 02:50:07.895453 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.895432 2564 scope.go:117] "RemoveContainer" containerID="8838e11312882952e4361906dc50d4b41c17ca01c39e74a2d5947c188c5d4e2c"
Apr 21 02:50:07.895705 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:50:07.895689 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8838e11312882952e4361906dc50d4b41c17ca01c39e74a2d5947c188c5d4e2c\": container with ID starting with 8838e11312882952e4361906dc50d4b41c17ca01c39e74a2d5947c188c5d4e2c not found: ID does not exist" containerID="8838e11312882952e4361906dc50d4b41c17ca01c39e74a2d5947c188c5d4e2c"
Apr 21 02:50:07.895764 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.895713 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8838e11312882952e4361906dc50d4b41c17ca01c39e74a2d5947c188c5d4e2c"} err="failed to get container status \"8838e11312882952e4361906dc50d4b41c17ca01c39e74a2d5947c188c5d4e2c\": rpc error: code = NotFound desc = could not find container \"8838e11312882952e4361906dc50d4b41c17ca01c39e74a2d5947c188c5d4e2c\": container with ID starting with 8838e11312882952e4361906dc50d4b41c17ca01c39e74a2d5947c188c5d4e2c not found: ID does not exist"
Apr 21 02:50:07.918378 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.918354 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68954b648f-pdbw4"]
Apr 21 02:50:07.921760 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:07.921741 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68954b648f-pdbw4"]
Apr 21 02:50:09.845365 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:09.845329 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="464ddfa0-b507-4644-b148-9f35f5f15c98" path="/var/lib/kubelet/pods/464ddfa0-b507-4644-b148-9f35f5f15c98/volumes"
Apr 21 02:50:09.858061 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:50:09.858033 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pj599"
Apr 21 02:51:07.891782 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:07.891693 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/ovn-acl-logging/0.log"
Apr 21 02:51:07.897136 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:07.893461 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/ovn-acl-logging/0.log"
Apr 21 02:51:11.050366 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.050330 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz"]
Apr 21 02:51:11.050879 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.050859 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="464ddfa0-b507-4644-b148-9f35f5f15c98" containerName="console"
Apr 21 02:51:11.050968 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.050882 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="464ddfa0-b507-4644-b148-9f35f5f15c98" containerName="console"
Apr 21 02:51:11.050968 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.050902 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ad27acb-1236-4d1f-b739-c7e36333d941" containerName="manager"
Apr 21 02:51:11.050968 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.050911 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ad27acb-1236-4d1f-b739-c7e36333d941" containerName="manager"
Apr 21 02:51:11.051119 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.051029 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ad27acb-1236-4d1f-b739-c7e36333d941" containerName="manager"
Apr 21 02:51:11.051119 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.051047 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="464ddfa0-b507-4644-b148-9f35f5f15c98" containerName="console"
Apr 21 02:51:11.054232 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.054212 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz"
Apr 21 02:51:11.056783 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.056759 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 02:51:11.056880 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.056854 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ssj29\""
Apr 21 02:51:11.058350 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.058333 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 02:51:11.061752 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.061687 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz"]
Apr 21 02:51:11.132264 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.132240 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a920f69-6eb9-454c-8875-6d048e5883d4-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz\" (UID: \"5a920f69-6eb9-454c-8875-6d048e5883d4\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz"
Apr 21 02:51:11.132378 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.132277 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a920f69-6eb9-454c-8875-6d048e5883d4-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz\" (UID: \"5a920f69-6eb9-454c-8875-6d048e5883d4\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz"
Apr 21 02:51:11.132378 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.132333 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnqdc\" (UniqueName: \"kubernetes.io/projected/5a920f69-6eb9-454c-8875-6d048e5883d4-kube-api-access-bnqdc\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz\" (UID: \"5a920f69-6eb9-454c-8875-6d048e5883d4\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz"
Apr 21 02:51:11.233223 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.233197 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a920f69-6eb9-454c-8875-6d048e5883d4-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz\" (UID: \"5a920f69-6eb9-454c-8875-6d048e5883d4\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz"
Apr 21 02:51:11.233332 ip-10-0-131-170 kubenswrapper[2564]:
I0421 02:51:11.233236 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a920f69-6eb9-454c-8875-6d048e5883d4-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz\" (UID: \"5a920f69-6eb9-454c-8875-6d048e5883d4\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz" Apr 21 02:51:11.233332 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.233288 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnqdc\" (UniqueName: \"kubernetes.io/projected/5a920f69-6eb9-454c-8875-6d048e5883d4-kube-api-access-bnqdc\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz\" (UID: \"5a920f69-6eb9-454c-8875-6d048e5883d4\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz" Apr 21 02:51:11.233590 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.233571 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a920f69-6eb9-454c-8875-6d048e5883d4-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz\" (UID: \"5a920f69-6eb9-454c-8875-6d048e5883d4\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz" Apr 21 02:51:11.233653 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.233636 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a920f69-6eb9-454c-8875-6d048e5883d4-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz\" (UID: \"5a920f69-6eb9-454c-8875-6d048e5883d4\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz" Apr 21 02:51:11.241853 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.241834 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bnqdc\" (UniqueName: \"kubernetes.io/projected/5a920f69-6eb9-454c-8875-6d048e5883d4-kube-api-access-bnqdc\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz\" (UID: \"5a920f69-6eb9-454c-8875-6d048e5883d4\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz" Apr 21 02:51:11.364476 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.364422 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz" Apr 21 02:51:11.489377 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:11.489349 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz"] Apr 21 02:51:11.489921 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:51:11.489889 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a920f69_6eb9_454c_8875_6d048e5883d4.slice/crio-83c7cbe4079532daf161208fb7d8151dc666005582590b9afce6084468df1a34 WatchSource:0}: Error finding container 83c7cbe4079532daf161208fb7d8151dc666005582590b9afce6084468df1a34: Status 404 returned error can't find the container with id 83c7cbe4079532daf161208fb7d8151dc666005582590b9afce6084468df1a34 Apr 21 02:51:12.141482 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:12.141443 2564 generic.go:358] "Generic (PLEG): container finished" podID="5a920f69-6eb9-454c-8875-6d048e5883d4" containerID="ad526fd85e78272cc8c68b5834da82ec44469e4d7a62d288d3af47a6717c8ce9" exitCode=0 Apr 21 02:51:12.141853 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:12.141537 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz" 
event={"ID":"5a920f69-6eb9-454c-8875-6d048e5883d4","Type":"ContainerDied","Data":"ad526fd85e78272cc8c68b5834da82ec44469e4d7a62d288d3af47a6717c8ce9"} Apr 21 02:51:12.141853 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:12.141573 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz" event={"ID":"5a920f69-6eb9-454c-8875-6d048e5883d4","Type":"ContainerStarted","Data":"83c7cbe4079532daf161208fb7d8151dc666005582590b9afce6084468df1a34"} Apr 21 02:51:13.147393 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:13.147312 2564 generic.go:358] "Generic (PLEG): container finished" podID="5a920f69-6eb9-454c-8875-6d048e5883d4" containerID="24afb93a091d8584b23a53a2f5a219f8606fd590bf89b9019b2a9f30a8c9ba6d" exitCode=0 Apr 21 02:51:13.147828 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:13.147393 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz" event={"ID":"5a920f69-6eb9-454c-8875-6d048e5883d4","Type":"ContainerDied","Data":"24afb93a091d8584b23a53a2f5a219f8606fd590bf89b9019b2a9f30a8c9ba6d"} Apr 21 02:51:14.152843 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:14.152810 2564 generic.go:358] "Generic (PLEG): container finished" podID="5a920f69-6eb9-454c-8875-6d048e5883d4" containerID="97d563f29aee26dbb3fea24410e8147f0ea327a89ac9f497cf024f8e1da7886d" exitCode=0 Apr 21 02:51:14.153192 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:14.152894 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz" event={"ID":"5a920f69-6eb9-454c-8875-6d048e5883d4","Type":"ContainerDied","Data":"97d563f29aee26dbb3fea24410e8147f0ea327a89ac9f497cf024f8e1da7886d"} Apr 21 02:51:15.282716 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:15.282692 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz" Apr 21 02:51:15.364446 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:15.364421 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a920f69-6eb9-454c-8875-6d048e5883d4-util\") pod \"5a920f69-6eb9-454c-8875-6d048e5883d4\" (UID: \"5a920f69-6eb9-454c-8875-6d048e5883d4\") " Apr 21 02:51:15.364698 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:15.364518 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnqdc\" (UniqueName: \"kubernetes.io/projected/5a920f69-6eb9-454c-8875-6d048e5883d4-kube-api-access-bnqdc\") pod \"5a920f69-6eb9-454c-8875-6d048e5883d4\" (UID: \"5a920f69-6eb9-454c-8875-6d048e5883d4\") " Apr 21 02:51:15.364698 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:15.364566 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a920f69-6eb9-454c-8875-6d048e5883d4-bundle\") pod \"5a920f69-6eb9-454c-8875-6d048e5883d4\" (UID: \"5a920f69-6eb9-454c-8875-6d048e5883d4\") " Apr 21 02:51:15.365011 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:15.364985 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a920f69-6eb9-454c-8875-6d048e5883d4-bundle" (OuterVolumeSpecName: "bundle") pod "5a920f69-6eb9-454c-8875-6d048e5883d4" (UID: "5a920f69-6eb9-454c-8875-6d048e5883d4"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:51:15.366569 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:15.366542 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a920f69-6eb9-454c-8875-6d048e5883d4-kube-api-access-bnqdc" (OuterVolumeSpecName: "kube-api-access-bnqdc") pod "5a920f69-6eb9-454c-8875-6d048e5883d4" (UID: "5a920f69-6eb9-454c-8875-6d048e5883d4"). InnerVolumeSpecName "kube-api-access-bnqdc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:51:15.370040 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:15.370015 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a920f69-6eb9-454c-8875-6d048e5883d4-util" (OuterVolumeSpecName: "util") pod "5a920f69-6eb9-454c-8875-6d048e5883d4" (UID: "5a920f69-6eb9-454c-8875-6d048e5883d4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:51:15.465992 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:15.465940 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bnqdc\" (UniqueName: \"kubernetes.io/projected/5a920f69-6eb9-454c-8875-6d048e5883d4-kube-api-access-bnqdc\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:51:15.465992 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:15.465961 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a920f69-6eb9-454c-8875-6d048e5883d4-bundle\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:51:15.465992 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:15.465971 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a920f69-6eb9-454c-8875-6d048e5883d4-util\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:51:16.163047 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:16.162980 2564 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz" Apr 21 02:51:16.163047 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:16.162993 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13505jpnz" event={"ID":"5a920f69-6eb9-454c-8875-6d048e5883d4","Type":"ContainerDied","Data":"83c7cbe4079532daf161208fb7d8151dc666005582590b9afce6084468df1a34"} Apr 21 02:51:16.163047 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:51:16.163021 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83c7cbe4079532daf161208fb7d8151dc666005582590b9afce6084468df1a34" Apr 21 02:52:09.858529 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:09.858479 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-pxsc2"] Apr 21 02:52:09.858970 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:09.858871 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a920f69-6eb9-454c-8875-6d048e5883d4" containerName="extract" Apr 21 02:52:09.858970 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:09.858884 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a920f69-6eb9-454c-8875-6d048e5883d4" containerName="extract" Apr 21 02:52:09.858970 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:09.858897 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a920f69-6eb9-454c-8875-6d048e5883d4" containerName="util" Apr 21 02:52:09.858970 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:09.858902 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a920f69-6eb9-454c-8875-6d048e5883d4" containerName="util" Apr 21 02:52:09.858970 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:09.858918 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="5a920f69-6eb9-454c-8875-6d048e5883d4" containerName="pull" Apr 21 02:52:09.858970 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:09.858927 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a920f69-6eb9-454c-8875-6d048e5883d4" containerName="pull" Apr 21 02:52:09.859159 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:09.858992 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a920f69-6eb9-454c-8875-6d048e5883d4" containerName="extract" Apr 21 02:52:09.862010 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:09.861994 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-pxsc2" Apr 21 02:52:09.865711 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:09.865689 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-wkhzs\"" Apr 21 02:52:09.872186 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:09.872162 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-pxsc2"] Apr 21 02:52:09.962803 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:09.962776 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfrfp\" (UniqueName: \"kubernetes.io/projected/13992349-581c-4dc3-8e54-8ebfe2d136d7-kube-api-access-zfrfp\") pod \"maas-controller-6d4c8f55f9-pxsc2\" (UID: \"13992349-581c-4dc3-8e54-8ebfe2d136d7\") " pod="opendatahub/maas-controller-6d4c8f55f9-pxsc2" Apr 21 02:52:09.998135 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:09.998112 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-8986fd4fc-qtw24"] Apr 21 02:52:10.001657 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.001637 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-8986fd4fc-qtw24" Apr 21 02:52:10.029292 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.029265 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-8986fd4fc-qtw24"] Apr 21 02:52:10.064123 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.064094 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn8dp\" (UniqueName: \"kubernetes.io/projected/9af00ff7-3252-4b84-8c89-4a05979478cc-kube-api-access-pn8dp\") pod \"maas-controller-8986fd4fc-qtw24\" (UID: \"9af00ff7-3252-4b84-8c89-4a05979478cc\") " pod="opendatahub/maas-controller-8986fd4fc-qtw24" Apr 21 02:52:10.064284 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.064180 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfrfp\" (UniqueName: \"kubernetes.io/projected/13992349-581c-4dc3-8e54-8ebfe2d136d7-kube-api-access-zfrfp\") pod \"maas-controller-6d4c8f55f9-pxsc2\" (UID: \"13992349-581c-4dc3-8e54-8ebfe2d136d7\") " pod="opendatahub/maas-controller-6d4c8f55f9-pxsc2" Apr 21 02:52:10.073692 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.073665 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfrfp\" (UniqueName: \"kubernetes.io/projected/13992349-581c-4dc3-8e54-8ebfe2d136d7-kube-api-access-zfrfp\") pod \"maas-controller-6d4c8f55f9-pxsc2\" (UID: \"13992349-581c-4dc3-8e54-8ebfe2d136d7\") " pod="opendatahub/maas-controller-6d4c8f55f9-pxsc2" Apr 21 02:52:10.121613 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.121542 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-pxsc2"] Apr 21 02:52:10.121832 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.121816 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-pxsc2" Apr 21 02:52:10.148053 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.148021 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5b866894c8-ncndh"] Apr 21 02:52:10.153154 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.153130 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5b866894c8-ncndh" Apr 21 02:52:10.163875 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.163725 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5b866894c8-ncndh"] Apr 21 02:52:10.165247 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.165219 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pn8dp\" (UniqueName: \"kubernetes.io/projected/9af00ff7-3252-4b84-8c89-4a05979478cc-kube-api-access-pn8dp\") pod \"maas-controller-8986fd4fc-qtw24\" (UID: \"9af00ff7-3252-4b84-8c89-4a05979478cc\") " pod="opendatahub/maas-controller-8986fd4fc-qtw24" Apr 21 02:52:10.174806 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.174752 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn8dp\" (UniqueName: \"kubernetes.io/projected/9af00ff7-3252-4b84-8c89-4a05979478cc-kube-api-access-pn8dp\") pod \"maas-controller-8986fd4fc-qtw24\" (UID: \"9af00ff7-3252-4b84-8c89-4a05979478cc\") " pod="opendatahub/maas-controller-8986fd4fc-qtw24" Apr 21 02:52:10.266025 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.265995 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v224k\" (UniqueName: \"kubernetes.io/projected/ecce1d5c-4c43-4c22-9dac-22f8f63f117e-kube-api-access-v224k\") pod \"maas-controller-5b866894c8-ncndh\" (UID: \"ecce1d5c-4c43-4c22-9dac-22f8f63f117e\") " pod="opendatahub/maas-controller-5b866894c8-ncndh" Apr 21 02:52:10.277093 
ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.277059 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-pxsc2"] Apr 21 02:52:10.278374 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:52:10.278327 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13992349_581c_4dc3_8e54_8ebfe2d136d7.slice/crio-11cc3675eb71503f0519ef895cdefdb0f1d12aa786c06596a9a0c6fa46f8d7e8 WatchSource:0}: Error finding container 11cc3675eb71503f0519ef895cdefdb0f1d12aa786c06596a9a0c6fa46f8d7e8: Status 404 returned error can't find the container with id 11cc3675eb71503f0519ef895cdefdb0f1d12aa786c06596a9a0c6fa46f8d7e8 Apr 21 02:52:10.283540 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.282381 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 02:52:10.311807 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.311789 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-8986fd4fc-qtw24" Apr 21 02:52:10.366928 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.366896 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v224k\" (UniqueName: \"kubernetes.io/projected/ecce1d5c-4c43-4c22-9dac-22f8f63f117e-kube-api-access-v224k\") pod \"maas-controller-5b866894c8-ncndh\" (UID: \"ecce1d5c-4c43-4c22-9dac-22f8f63f117e\") " pod="opendatahub/maas-controller-5b866894c8-ncndh" Apr 21 02:52:10.375756 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.375703 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v224k\" (UniqueName: \"kubernetes.io/projected/ecce1d5c-4c43-4c22-9dac-22f8f63f117e-kube-api-access-v224k\") pod \"maas-controller-5b866894c8-ncndh\" (UID: \"ecce1d5c-4c43-4c22-9dac-22f8f63f117e\") " pod="opendatahub/maas-controller-5b866894c8-ncndh" Apr 21 02:52:10.388443 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.388409 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-pxsc2" event={"ID":"13992349-581c-4dc3-8e54-8ebfe2d136d7","Type":"ContainerStarted","Data":"11cc3675eb71503f0519ef895cdefdb0f1d12aa786c06596a9a0c6fa46f8d7e8"} Apr 21 02:52:10.442858 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.439095 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-8986fd4fc-qtw24"] Apr 21 02:52:10.468619 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.468594 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5b866894c8-ncndh" Apr 21 02:52:10.801858 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:10.801831 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5b866894c8-ncndh"] Apr 21 02:52:10.804134 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:52:10.804097 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecce1d5c_4c43_4c22_9dac_22f8f63f117e.slice/crio-86bb32a85fb7320a1550846b91a2057da58f6815753d2badd72757817b78b234 WatchSource:0}: Error finding container 86bb32a85fb7320a1550846b91a2057da58f6815753d2badd72757817b78b234: Status 404 returned error can't find the container with id 86bb32a85fb7320a1550846b91a2057da58f6815753d2badd72757817b78b234 Apr 21 02:52:11.399077 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:11.399038 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5b866894c8-ncndh" event={"ID":"ecce1d5c-4c43-4c22-9dac-22f8f63f117e","Type":"ContainerStarted","Data":"86bb32a85fb7320a1550846b91a2057da58f6815753d2badd72757817b78b234"} Apr 21 02:52:11.401161 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:11.401127 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8986fd4fc-qtw24" event={"ID":"9af00ff7-3252-4b84-8c89-4a05979478cc","Type":"ContainerStarted","Data":"0b9c57b567e446c5b10c5b2453498ed5b97d4fe1b23d45ec28e63602c4e246eb"} Apr 21 02:52:14.416229 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:14.416189 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5b866894c8-ncndh" event={"ID":"ecce1d5c-4c43-4c22-9dac-22f8f63f117e","Type":"ContainerStarted","Data":"1bd744a80e297865a4e9216a91717b88ae12627e228232052cb94556309e21f7"} Apr 21 02:52:14.416698 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:14.416342 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="opendatahub/maas-controller-5b866894c8-ncndh" Apr 21 02:52:14.417700 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:14.417671 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8986fd4fc-qtw24" event={"ID":"9af00ff7-3252-4b84-8c89-4a05979478cc","Type":"ContainerStarted","Data":"9c7e07a5e17954198b4b31eb2cbc5e1be1c9c20b514735954d1875f95d8787ca"} Apr 21 02:52:14.417821 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:14.417719 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-8986fd4fc-qtw24" Apr 21 02:52:14.418932 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:14.418915 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-pxsc2" event={"ID":"13992349-581c-4dc3-8e54-8ebfe2d136d7","Type":"ContainerStarted","Data":"3b3285aedca23a794d4f31cef0bba686ff921bc63c72759135e696b298e49c99"} Apr 21 02:52:14.419014 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:14.418996 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-pxsc2" podUID="13992349-581c-4dc3-8e54-8ebfe2d136d7" containerName="manager" containerID="cri-o://3b3285aedca23a794d4f31cef0bba686ff921bc63c72759135e696b298e49c99" gracePeriod=10 Apr 21 02:52:14.419066 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:14.419054 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-pxsc2" Apr 21 02:52:14.432947 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:14.432899 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-5b866894c8-ncndh" podStartSLOduration=1.675042688 podStartE2EDuration="4.432885072s" podCreationTimestamp="2026-04-21 02:52:10 +0000 UTC" firstStartedPulling="2026-04-21 02:52:10.805892251 +0000 UTC m=+663.573380918" lastFinishedPulling="2026-04-21 02:52:13.563734632 +0000 
UTC m=+666.331223302" observedRunningTime="2026-04-21 02:52:14.431430553 +0000 UTC m=+667.198919243" watchObservedRunningTime="2026-04-21 02:52:14.432885072 +0000 UTC m=+667.200373761" Apr 21 02:52:14.448946 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:14.448747 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-pxsc2" podStartSLOduration=2.167863675 podStartE2EDuration="5.448729993s" podCreationTimestamp="2026-04-21 02:52:09 +0000 UTC" firstStartedPulling="2026-04-21 02:52:10.282677035 +0000 UTC m=+663.050165706" lastFinishedPulling="2026-04-21 02:52:13.563543348 +0000 UTC m=+666.331032024" observedRunningTime="2026-04-21 02:52:14.44781699 +0000 UTC m=+667.215305679" watchObservedRunningTime="2026-04-21 02:52:14.448729993 +0000 UTC m=+667.216218683" Apr 21 02:52:14.469918 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:14.469876 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-8986fd4fc-qtw24" podStartSLOduration=2.3428445079999998 podStartE2EDuration="5.469864705s" podCreationTimestamp="2026-04-21 02:52:09 +0000 UTC" firstStartedPulling="2026-04-21 02:52:10.443470358 +0000 UTC m=+663.210959027" lastFinishedPulling="2026-04-21 02:52:13.570490553 +0000 UTC m=+666.337979224" observedRunningTime="2026-04-21 02:52:14.468047608 +0000 UTC m=+667.235536297" watchObservedRunningTime="2026-04-21 02:52:14.469864705 +0000 UTC m=+667.237353395" Apr 21 02:52:14.663172 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:14.663152 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-pxsc2" Apr 21 02:52:14.708365 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:14.708301 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfrfp\" (UniqueName: \"kubernetes.io/projected/13992349-581c-4dc3-8e54-8ebfe2d136d7-kube-api-access-zfrfp\") pod \"13992349-581c-4dc3-8e54-8ebfe2d136d7\" (UID: \"13992349-581c-4dc3-8e54-8ebfe2d136d7\") " Apr 21 02:52:14.710337 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:14.710308 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13992349-581c-4dc3-8e54-8ebfe2d136d7-kube-api-access-zfrfp" (OuterVolumeSpecName: "kube-api-access-zfrfp") pod "13992349-581c-4dc3-8e54-8ebfe2d136d7" (UID: "13992349-581c-4dc3-8e54-8ebfe2d136d7"). InnerVolumeSpecName "kube-api-access-zfrfp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:52:14.808860 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:14.808839 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zfrfp\" (UniqueName: \"kubernetes.io/projected/13992349-581c-4dc3-8e54-8ebfe2d136d7-kube-api-access-zfrfp\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:52:15.423715 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:15.423683 2564 generic.go:358] "Generic (PLEG): container finished" podID="13992349-581c-4dc3-8e54-8ebfe2d136d7" containerID="3b3285aedca23a794d4f31cef0bba686ff921bc63c72759135e696b298e49c99" exitCode=0 Apr 21 02:52:15.424201 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:15.423750 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-pxsc2" Apr 21 02:52:15.424201 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:15.423767 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-pxsc2" event={"ID":"13992349-581c-4dc3-8e54-8ebfe2d136d7","Type":"ContainerDied","Data":"3b3285aedca23a794d4f31cef0bba686ff921bc63c72759135e696b298e49c99"} Apr 21 02:52:15.424201 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:15.423805 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-pxsc2" event={"ID":"13992349-581c-4dc3-8e54-8ebfe2d136d7","Type":"ContainerDied","Data":"11cc3675eb71503f0519ef895cdefdb0f1d12aa786c06596a9a0c6fa46f8d7e8"} Apr 21 02:52:15.424201 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:15.423826 2564 scope.go:117] "RemoveContainer" containerID="3b3285aedca23a794d4f31cef0bba686ff921bc63c72759135e696b298e49c99" Apr 21 02:52:15.434136 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:15.434119 2564 scope.go:117] "RemoveContainer" containerID="3b3285aedca23a794d4f31cef0bba686ff921bc63c72759135e696b298e49c99" Apr 21 02:52:15.434380 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:52:15.434362 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b3285aedca23a794d4f31cef0bba686ff921bc63c72759135e696b298e49c99\": container with ID starting with 3b3285aedca23a794d4f31cef0bba686ff921bc63c72759135e696b298e49c99 not found: ID does not exist" containerID="3b3285aedca23a794d4f31cef0bba686ff921bc63c72759135e696b298e49c99" Apr 21 02:52:15.434440 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:15.434387 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b3285aedca23a794d4f31cef0bba686ff921bc63c72759135e696b298e49c99"} err="failed to get container status \"3b3285aedca23a794d4f31cef0bba686ff921bc63c72759135e696b298e49c99\": rpc error: 
code = NotFound desc = could not find container \"3b3285aedca23a794d4f31cef0bba686ff921bc63c72759135e696b298e49c99\": container with ID starting with 3b3285aedca23a794d4f31cef0bba686ff921bc63c72759135e696b298e49c99 not found: ID does not exist" Apr 21 02:52:15.446728 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:15.446707 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-pxsc2"] Apr 21 02:52:15.450488 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:15.450468 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-pxsc2"] Apr 21 02:52:15.847314 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:15.847276 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13992349-581c-4dc3-8e54-8ebfe2d136d7" path="/var/lib/kubelet/pods/13992349-581c-4dc3-8e54-8ebfe2d136d7/volumes" Apr 21 02:52:25.429599 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:25.429569 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-8986fd4fc-qtw24" Apr 21 02:52:25.430052 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:25.429948 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-5b866894c8-ncndh" Apr 21 02:52:25.484802 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:25.484773 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-8986fd4fc-qtw24"] Apr 21 02:52:25.484972 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:25.484949 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-8986fd4fc-qtw24" podUID="9af00ff7-3252-4b84-8c89-4a05979478cc" containerName="manager" containerID="cri-o://9c7e07a5e17954198b4b31eb2cbc5e1be1c9c20b514735954d1875f95d8787ca" gracePeriod=10 Apr 21 02:52:25.716428 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:25.716409 2564 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="opendatahub/maas-controller-8986fd4fc-qtw24" Apr 21 02:52:25.756917 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:25.756883 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-fbc547765-wdfkp"] Apr 21 02:52:25.757312 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:25.757292 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13992349-581c-4dc3-8e54-8ebfe2d136d7" containerName="manager" Apr 21 02:52:25.757312 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:25.757309 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="13992349-581c-4dc3-8e54-8ebfe2d136d7" containerName="manager" Apr 21 02:52:25.757589 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:25.757319 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9af00ff7-3252-4b84-8c89-4a05979478cc" containerName="manager" Apr 21 02:52:25.757589 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:25.757324 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af00ff7-3252-4b84-8c89-4a05979478cc" containerName="manager" Apr 21 02:52:25.757589 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:25.757449 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="9af00ff7-3252-4b84-8c89-4a05979478cc" containerName="manager" Apr 21 02:52:25.757589 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:25.757464 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="13992349-581c-4dc3-8e54-8ebfe2d136d7" containerName="manager" Apr 21 02:52:25.760655 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:25.760640 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-fbc547765-wdfkp" Apr 21 02:52:25.768011 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:25.767984 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-fbc547765-wdfkp"] Apr 21 02:52:25.801296 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:25.801276 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn8dp\" (UniqueName: \"kubernetes.io/projected/9af00ff7-3252-4b84-8c89-4a05979478cc-kube-api-access-pn8dp\") pod \"9af00ff7-3252-4b84-8c89-4a05979478cc\" (UID: \"9af00ff7-3252-4b84-8c89-4a05979478cc\") " Apr 21 02:52:25.803239 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:25.803205 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9af00ff7-3252-4b84-8c89-4a05979478cc-kube-api-access-pn8dp" (OuterVolumeSpecName: "kube-api-access-pn8dp") pod "9af00ff7-3252-4b84-8c89-4a05979478cc" (UID: "9af00ff7-3252-4b84-8c89-4a05979478cc"). InnerVolumeSpecName "kube-api-access-pn8dp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:52:25.901863 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:25.901837 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dkvh\" (UniqueName: \"kubernetes.io/projected/bcd6fd77-71fe-4c75-bc81-f8ee3d33e5eb-kube-api-access-8dkvh\") pod \"maas-controller-fbc547765-wdfkp\" (UID: \"bcd6fd77-71fe-4c75-bc81-f8ee3d33e5eb\") " pod="opendatahub/maas-controller-fbc547765-wdfkp" Apr 21 02:52:25.902095 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:25.902063 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pn8dp\" (UniqueName: \"kubernetes.io/projected/9af00ff7-3252-4b84-8c89-4a05979478cc-kube-api-access-pn8dp\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:52:26.003114 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:26.003090 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dkvh\" (UniqueName: \"kubernetes.io/projected/bcd6fd77-71fe-4c75-bc81-f8ee3d33e5eb-kube-api-access-8dkvh\") pod \"maas-controller-fbc547765-wdfkp\" (UID: \"bcd6fd77-71fe-4c75-bc81-f8ee3d33e5eb\") " pod="opendatahub/maas-controller-fbc547765-wdfkp" Apr 21 02:52:26.011706 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:26.011687 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dkvh\" (UniqueName: \"kubernetes.io/projected/bcd6fd77-71fe-4c75-bc81-f8ee3d33e5eb-kube-api-access-8dkvh\") pod \"maas-controller-fbc547765-wdfkp\" (UID: \"bcd6fd77-71fe-4c75-bc81-f8ee3d33e5eb\") " pod="opendatahub/maas-controller-fbc547765-wdfkp" Apr 21 02:52:26.071859 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:26.071841 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-fbc547765-wdfkp" Apr 21 02:52:26.213980 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:26.213957 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-fbc547765-wdfkp"] Apr 21 02:52:26.216118 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:52:26.216089 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcd6fd77_71fe_4c75_bc81_f8ee3d33e5eb.slice/crio-9f689cb646f5e821817289081ecad6150c8d3630be29c5e305d49ae11088840d WatchSource:0}: Error finding container 9f689cb646f5e821817289081ecad6150c8d3630be29c5e305d49ae11088840d: Status 404 returned error can't find the container with id 9f689cb646f5e821817289081ecad6150c8d3630be29c5e305d49ae11088840d Apr 21 02:52:26.471599 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:26.471521 2564 generic.go:358] "Generic (PLEG): container finished" podID="9af00ff7-3252-4b84-8c89-4a05979478cc" containerID="9c7e07a5e17954198b4b31eb2cbc5e1be1c9c20b514735954d1875f95d8787ca" exitCode=0 Apr 21 02:52:26.471599 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:26.471573 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-8986fd4fc-qtw24" Apr 21 02:52:26.471599 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:26.471577 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8986fd4fc-qtw24" event={"ID":"9af00ff7-3252-4b84-8c89-4a05979478cc","Type":"ContainerDied","Data":"9c7e07a5e17954198b4b31eb2cbc5e1be1c9c20b514735954d1875f95d8787ca"} Apr 21 02:52:26.472146 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:26.471624 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8986fd4fc-qtw24" event={"ID":"9af00ff7-3252-4b84-8c89-4a05979478cc","Type":"ContainerDied","Data":"0b9c57b567e446c5b10c5b2453498ed5b97d4fe1b23d45ec28e63602c4e246eb"} Apr 21 02:52:26.472146 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:26.471646 2564 scope.go:117] "RemoveContainer" containerID="9c7e07a5e17954198b4b31eb2cbc5e1be1c9c20b514735954d1875f95d8787ca" Apr 21 02:52:26.472837 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:26.472812 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-fbc547765-wdfkp" event={"ID":"bcd6fd77-71fe-4c75-bc81-f8ee3d33e5eb","Type":"ContainerStarted","Data":"9f689cb646f5e821817289081ecad6150c8d3630be29c5e305d49ae11088840d"} Apr 21 02:52:26.480399 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:26.480383 2564 scope.go:117] "RemoveContainer" containerID="9c7e07a5e17954198b4b31eb2cbc5e1be1c9c20b514735954d1875f95d8787ca" Apr 21 02:52:26.480672 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:52:26.480642 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c7e07a5e17954198b4b31eb2cbc5e1be1c9c20b514735954d1875f95d8787ca\": container with ID starting with 9c7e07a5e17954198b4b31eb2cbc5e1be1c9c20b514735954d1875f95d8787ca not found: ID does not exist" containerID="9c7e07a5e17954198b4b31eb2cbc5e1be1c9c20b514735954d1875f95d8787ca" Apr 21 02:52:26.480768 
ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:26.480676 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c7e07a5e17954198b4b31eb2cbc5e1be1c9c20b514735954d1875f95d8787ca"} err="failed to get container status \"9c7e07a5e17954198b4b31eb2cbc5e1be1c9c20b514735954d1875f95d8787ca\": rpc error: code = NotFound desc = could not find container \"9c7e07a5e17954198b4b31eb2cbc5e1be1c9c20b514735954d1875f95d8787ca\": container with ID starting with 9c7e07a5e17954198b4b31eb2cbc5e1be1c9c20b514735954d1875f95d8787ca not found: ID does not exist" Apr 21 02:52:26.489020 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:26.488998 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-8986fd4fc-qtw24"] Apr 21 02:52:26.492418 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:26.492398 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-8986fd4fc-qtw24"] Apr 21 02:52:27.479315 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:27.479283 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-fbc547765-wdfkp" event={"ID":"bcd6fd77-71fe-4c75-bc81-f8ee3d33e5eb","Type":"ContainerStarted","Data":"b4477de54e2e970ab5c15ce69bff5280543d4d299b88552ad72d48e421f556a3"} Apr 21 02:52:27.479713 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:27.479454 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-fbc547765-wdfkp" Apr 21 02:52:27.496628 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:27.496587 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-fbc547765-wdfkp" podStartSLOduration=2.150313643 podStartE2EDuration="2.496575887s" podCreationTimestamp="2026-04-21 02:52:25 +0000 UTC" firstStartedPulling="2026-04-21 02:52:26.217800035 +0000 UTC m=+678.985288722" lastFinishedPulling="2026-04-21 02:52:26.564062291 +0000 UTC m=+679.331550966" 
observedRunningTime="2026-04-21 02:52:27.493937234 +0000 UTC m=+680.261425922" watchObservedRunningTime="2026-04-21 02:52:27.496575887 +0000 UTC m=+680.264064575" Apr 21 02:52:27.845527 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:27.845483 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9af00ff7-3252-4b84-8c89-4a05979478cc" path="/var/lib/kubelet/pods/9af00ff7-3252-4b84-8c89-4a05979478cc/volumes" Apr 21 02:52:38.489073 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:38.489042 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-fbc547765-wdfkp" Apr 21 02:52:38.543609 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:38.543577 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5b866894c8-ncndh"] Apr 21 02:52:38.543902 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:38.543852 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-5b866894c8-ncndh" podUID="ecce1d5c-4c43-4c22-9dac-22f8f63f117e" containerName="manager" containerID="cri-o://1bd744a80e297865a4e9216a91717b88ae12627e228232052cb94556309e21f7" gracePeriod=10 Apr 21 02:52:38.782466 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:38.782446 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5b866894c8-ncndh" Apr 21 02:52:38.906371 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:38.906343 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v224k\" (UniqueName: \"kubernetes.io/projected/ecce1d5c-4c43-4c22-9dac-22f8f63f117e-kube-api-access-v224k\") pod \"ecce1d5c-4c43-4c22-9dac-22f8f63f117e\" (UID: \"ecce1d5c-4c43-4c22-9dac-22f8f63f117e\") " Apr 21 02:52:38.908362 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:38.908333 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecce1d5c-4c43-4c22-9dac-22f8f63f117e-kube-api-access-v224k" (OuterVolumeSpecName: "kube-api-access-v224k") pod "ecce1d5c-4c43-4c22-9dac-22f8f63f117e" (UID: "ecce1d5c-4c43-4c22-9dac-22f8f63f117e"). InnerVolumeSpecName "kube-api-access-v224k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:52:39.007739 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:39.007717 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v224k\" (UniqueName: \"kubernetes.io/projected/ecce1d5c-4c43-4c22-9dac-22f8f63f117e-kube-api-access-v224k\") on node \"ip-10-0-131-170.ec2.internal\" DevicePath \"\"" Apr 21 02:52:39.528333 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:39.528299 2564 generic.go:358] "Generic (PLEG): container finished" podID="ecce1d5c-4c43-4c22-9dac-22f8f63f117e" containerID="1bd744a80e297865a4e9216a91717b88ae12627e228232052cb94556309e21f7" exitCode=0 Apr 21 02:52:39.528783 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:39.528359 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5b866894c8-ncndh" event={"ID":"ecce1d5c-4c43-4c22-9dac-22f8f63f117e","Type":"ContainerDied","Data":"1bd744a80e297865a4e9216a91717b88ae12627e228232052cb94556309e21f7"} Apr 21 02:52:39.528783 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:39.528381 2564 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5b866894c8-ncndh" Apr 21 02:52:39.528783 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:39.528394 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5b866894c8-ncndh" event={"ID":"ecce1d5c-4c43-4c22-9dac-22f8f63f117e","Type":"ContainerDied","Data":"86bb32a85fb7320a1550846b91a2057da58f6815753d2badd72757817b78b234"} Apr 21 02:52:39.528783 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:39.528416 2564 scope.go:117] "RemoveContainer" containerID="1bd744a80e297865a4e9216a91717b88ae12627e228232052cb94556309e21f7" Apr 21 02:52:39.537299 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:39.537284 2564 scope.go:117] "RemoveContainer" containerID="1bd744a80e297865a4e9216a91717b88ae12627e228232052cb94556309e21f7" Apr 21 02:52:39.537588 ip-10-0-131-170 kubenswrapper[2564]: E0421 02:52:39.537564 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd744a80e297865a4e9216a91717b88ae12627e228232052cb94556309e21f7\": container with ID starting with 1bd744a80e297865a4e9216a91717b88ae12627e228232052cb94556309e21f7 not found: ID does not exist" containerID="1bd744a80e297865a4e9216a91717b88ae12627e228232052cb94556309e21f7" Apr 21 02:52:39.537662 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:39.537597 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd744a80e297865a4e9216a91717b88ae12627e228232052cb94556309e21f7"} err="failed to get container status \"1bd744a80e297865a4e9216a91717b88ae12627e228232052cb94556309e21f7\": rpc error: code = NotFound desc = could not find container \"1bd744a80e297865a4e9216a91717b88ae12627e228232052cb94556309e21f7\": container with ID starting with 1bd744a80e297865a4e9216a91717b88ae12627e228232052cb94556309e21f7 not found: ID does not exist" Apr 21 02:52:39.551514 ip-10-0-131-170 kubenswrapper[2564]: I0421 
02:52:39.551471 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5b866894c8-ncndh"] Apr 21 02:52:39.553828 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:39.553809 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-5b866894c8-ncndh"] Apr 21 02:52:39.846069 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:39.845999 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecce1d5c-4c43-4c22-9dac-22f8f63f117e" path="/var/lib/kubelet/pods/ecce1d5c-4c43-4c22-9dac-22f8f63f117e/volumes" Apr 21 02:52:58.096891 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:58.096822 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-ff9579c68-hb99r"] Apr 21 02:52:58.097307 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:58.097258 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ecce1d5c-4c43-4c22-9dac-22f8f63f117e" containerName="manager" Apr 21 02:52:58.097307 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:58.097271 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecce1d5c-4c43-4c22-9dac-22f8f63f117e" containerName="manager" Apr 21 02:52:58.097382 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:58.097338 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="ecce1d5c-4c43-4c22-9dac-22f8f63f117e" containerName="manager" Apr 21 02:52:58.100466 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:58.100451 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-ff9579c68-hb99r" Apr 21 02:52:58.103049 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:58.103029 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 21 02:52:58.103146 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:58.103090 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 21 02:52:58.104133 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:58.104110 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-6jkpt\"" Apr 21 02:52:58.108212 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:58.108140 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-ff9579c68-hb99r"] Apr 21 02:52:58.158751 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:58.158728 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0dbabaa1-121a-41fd-ae64-6aa58e065163-maas-api-tls\") pod \"maas-api-ff9579c68-hb99r\" (UID: \"0dbabaa1-121a-41fd-ae64-6aa58e065163\") " pod="opendatahub/maas-api-ff9579c68-hb99r" Apr 21 02:52:58.158883 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:58.158779 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22whb\" (UniqueName: \"kubernetes.io/projected/0dbabaa1-121a-41fd-ae64-6aa58e065163-kube-api-access-22whb\") pod \"maas-api-ff9579c68-hb99r\" (UID: \"0dbabaa1-121a-41fd-ae64-6aa58e065163\") " pod="opendatahub/maas-api-ff9579c68-hb99r" Apr 21 02:52:58.259415 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:58.259386 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0dbabaa1-121a-41fd-ae64-6aa58e065163-maas-api-tls\") pod 
\"maas-api-ff9579c68-hb99r\" (UID: \"0dbabaa1-121a-41fd-ae64-6aa58e065163\") " pod="opendatahub/maas-api-ff9579c68-hb99r" Apr 21 02:52:58.259572 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:58.259431 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22whb\" (UniqueName: \"kubernetes.io/projected/0dbabaa1-121a-41fd-ae64-6aa58e065163-kube-api-access-22whb\") pod \"maas-api-ff9579c68-hb99r\" (UID: \"0dbabaa1-121a-41fd-ae64-6aa58e065163\") " pod="opendatahub/maas-api-ff9579c68-hb99r" Apr 21 02:52:58.261797 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:58.261773 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0dbabaa1-121a-41fd-ae64-6aa58e065163-maas-api-tls\") pod \"maas-api-ff9579c68-hb99r\" (UID: \"0dbabaa1-121a-41fd-ae64-6aa58e065163\") " pod="opendatahub/maas-api-ff9579c68-hb99r" Apr 21 02:52:58.267598 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:58.267576 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22whb\" (UniqueName: \"kubernetes.io/projected/0dbabaa1-121a-41fd-ae64-6aa58e065163-kube-api-access-22whb\") pod \"maas-api-ff9579c68-hb99r\" (UID: \"0dbabaa1-121a-41fd-ae64-6aa58e065163\") " pod="opendatahub/maas-api-ff9579c68-hb99r" Apr 21 02:52:58.411812 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:58.411744 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-ff9579c68-hb99r" Apr 21 02:52:58.536408 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:58.536381 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-ff9579c68-hb99r"] Apr 21 02:52:58.537085 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:52:58.537057 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dbabaa1_121a_41fd_ae64_6aa58e065163.slice/crio-c4ab8ea9d7a05d2fc4503da191b6e94ee3d892b7f294a811597f50e93f86887b WatchSource:0}: Error finding container c4ab8ea9d7a05d2fc4503da191b6e94ee3d892b7f294a811597f50e93f86887b: Status 404 returned error can't find the container with id c4ab8ea9d7a05d2fc4503da191b6e94ee3d892b7f294a811597f50e93f86887b Apr 21 02:52:58.614175 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:52:58.614140 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-ff9579c68-hb99r" event={"ID":"0dbabaa1-121a-41fd-ae64-6aa58e065163","Type":"ContainerStarted","Data":"c4ab8ea9d7a05d2fc4503da191b6e94ee3d892b7f294a811597f50e93f86887b"} Apr 21 02:53:00.623842 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:00.623809 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-ff9579c68-hb99r" event={"ID":"0dbabaa1-121a-41fd-ae64-6aa58e065163","Type":"ContainerStarted","Data":"ebe6ee2dc499321a6d54a9d05a0e4ca02b5a686051c87c75f5cc30fb4e5d5278"} Apr 21 02:53:00.624275 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:00.623867 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-ff9579c68-hb99r" Apr 21 02:53:00.642168 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:00.642122 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-ff9579c68-hb99r" podStartSLOduration=1.1517976619999999 podStartE2EDuration="2.642107127s" podCreationTimestamp="2026-04-21 02:52:58 +0000 UTC" 
firstStartedPulling="2026-04-21 02:52:58.538324752 +0000 UTC m=+711.305813420" lastFinishedPulling="2026-04-21 02:53:00.028634218 +0000 UTC m=+712.796122885" observedRunningTime="2026-04-21 02:53:00.640159915 +0000 UTC m=+713.407648606" watchObservedRunningTime="2026-04-21 02:53:00.642107127 +0000 UTC m=+713.409595815" Apr 21 02:53:06.634535 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:06.634485 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-ff9579c68-hb99r" Apr 21 02:53:29.795099 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:29.795067 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628"] Apr 21 02:53:29.799282 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:29.799259 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:29.802809 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:29.802788 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-275zm\"" Apr 21 02:53:29.802941 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:29.802818 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 21 02:53:29.802941 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:29.802845 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 21 02:53:29.802941 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:29.802788 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 21 02:53:29.808953 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:29.808935 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628"] Apr 21 
02:53:29.905345 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:29.905313 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86175c9f-1e2d-4016-9daa-ca9c1aa97e0a-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xb628\" (UID: \"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:29.905491 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:29.905353 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86175c9f-1e2d-4016-9daa-ca9c1aa97e0a-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xb628\" (UID: \"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:29.905491 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:29.905449 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86175c9f-1e2d-4016-9daa-ca9c1aa97e0a-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xb628\" (UID: \"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:29.905596 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:29.905530 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zb2x\" (UniqueName: \"kubernetes.io/projected/86175c9f-1e2d-4016-9daa-ca9c1aa97e0a-kube-api-access-5zb2x\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xb628\" (UID: \"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:29.905596 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:29.905583 2564 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86175c9f-1e2d-4016-9daa-ca9c1aa97e0a-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xb628\" (UID: \"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:29.905691 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:29.905674 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86175c9f-1e2d-4016-9daa-ca9c1aa97e0a-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xb628\" (UID: \"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:30.007052 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:30.007015 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86175c9f-1e2d-4016-9daa-ca9c1aa97e0a-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xb628\" (UID: \"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:30.007200 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:30.007058 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zb2x\" (UniqueName: \"kubernetes.io/projected/86175c9f-1e2d-4016-9daa-ca9c1aa97e0a-kube-api-access-5zb2x\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xb628\" (UID: \"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:30.007200 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:30.007105 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86175c9f-1e2d-4016-9daa-ca9c1aa97e0a-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xb628\" (UID: \"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:30.007200 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:30.007140 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86175c9f-1e2d-4016-9daa-ca9c1aa97e0a-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xb628\" (UID: \"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:30.007360 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:30.007205 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86175c9f-1e2d-4016-9daa-ca9c1aa97e0a-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xb628\" (UID: \"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:30.007360 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:30.007245 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86175c9f-1e2d-4016-9daa-ca9c1aa97e0a-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xb628\" (UID: \"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:30.007609 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:30.007586 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86175c9f-1e2d-4016-9daa-ca9c1aa97e0a-kserve-provision-location\") pod 
\"premium-simulated-simulated-premium-kserve-6b97b89985-xb628\" (UID: \"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:30.007838 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:30.007814 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86175c9f-1e2d-4016-9daa-ca9c1aa97e0a-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xb628\" (UID: \"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:30.007945 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:30.007868 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86175c9f-1e2d-4016-9daa-ca9c1aa97e0a-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xb628\" (UID: \"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:30.009455 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:30.009434 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86175c9f-1e2d-4016-9daa-ca9c1aa97e0a-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xb628\" (UID: \"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:30.009769 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:30.009752 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86175c9f-1e2d-4016-9daa-ca9c1aa97e0a-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xb628\" (UID: \"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:30.014905 ip-10-0-131-170 
kubenswrapper[2564]: I0421 02:53:30.014886 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zb2x\" (UniqueName: \"kubernetes.io/projected/86175c9f-1e2d-4016-9daa-ca9c1aa97e0a-kube-api-access-5zb2x\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xb628\" (UID: \"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:30.111795 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:30.111740 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:30.242431 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:30.242404 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628"] Apr 21 02:53:30.244380 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:53:30.244353 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86175c9f_1e2d_4016_9daa_ca9c1aa97e0a.slice/crio-6614000e1a330b0e8549d475fe504eae03dc479fb105dcd4eea42be301799f78 WatchSource:0}: Error finding container 6614000e1a330b0e8549d475fe504eae03dc479fb105dcd4eea42be301799f78: Status 404 returned error can't find the container with id 6614000e1a330b0e8549d475fe504eae03dc479fb105dcd4eea42be301799f78 Apr 21 02:53:30.747377 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:30.747341 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" event={"ID":"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a","Type":"ContainerStarted","Data":"6614000e1a330b0e8549d475fe504eae03dc479fb105dcd4eea42be301799f78"} Apr 21 02:53:34.773295 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.773264 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs"] Apr 21 
02:53:34.793531 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.793467 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs"] Apr 21 02:53:34.793693 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.793617 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:34.796088 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.796061 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 21 02:53:34.853433 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.853400 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blsgq\" (UniqueName: \"kubernetes.io/projected/56dd6b67-08fc-414f-9f74-a649c1dbbcd4-kube-api-access-blsgq\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs\" (UID: \"56dd6b67-08fc-414f-9f74-a649c1dbbcd4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:34.853625 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.853553 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56dd6b67-08fc-414f-9f74-a649c1dbbcd4-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs\" (UID: \"56dd6b67-08fc-414f-9f74-a649c1dbbcd4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:34.853625 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.853616 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/56dd6b67-08fc-414f-9f74-a649c1dbbcd4-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs\" (UID: \"56dd6b67-08fc-414f-9f74-a649c1dbbcd4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 
21 02:53:34.854332 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.853677 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/56dd6b67-08fc-414f-9f74-a649c1dbbcd4-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs\" (UID: \"56dd6b67-08fc-414f-9f74-a649c1dbbcd4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:34.854332 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.853734 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/56dd6b67-08fc-414f-9f74-a649c1dbbcd4-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs\" (UID: \"56dd6b67-08fc-414f-9f74-a649c1dbbcd4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:34.854332 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.853779 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56dd6b67-08fc-414f-9f74-a649c1dbbcd4-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs\" (UID: \"56dd6b67-08fc-414f-9f74-a649c1dbbcd4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:34.954932 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.954895 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/56dd6b67-08fc-414f-9f74-a649c1dbbcd4-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs\" (UID: \"56dd6b67-08fc-414f-9f74-a649c1dbbcd4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:34.955120 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.955071 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/56dd6b67-08fc-414f-9f74-a649c1dbbcd4-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs\" (UID: \"56dd6b67-08fc-414f-9f74-a649c1dbbcd4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:34.955187 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.955135 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/56dd6b67-08fc-414f-9f74-a649c1dbbcd4-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs\" (UID: \"56dd6b67-08fc-414f-9f74-a649c1dbbcd4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:34.955187 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.955165 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56dd6b67-08fc-414f-9f74-a649c1dbbcd4-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs\" (UID: \"56dd6b67-08fc-414f-9f74-a649c1dbbcd4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:34.955299 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.955224 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blsgq\" (UniqueName: \"kubernetes.io/projected/56dd6b67-08fc-414f-9f74-a649c1dbbcd4-kube-api-access-blsgq\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs\" (UID: \"56dd6b67-08fc-414f-9f74-a649c1dbbcd4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:34.955363 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.955304 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56dd6b67-08fc-414f-9f74-a649c1dbbcd4-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs\" (UID: \"56dd6b67-08fc-414f-9f74-a649c1dbbcd4\") " 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:34.955459 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.955436 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/56dd6b67-08fc-414f-9f74-a649c1dbbcd4-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs\" (UID: \"56dd6b67-08fc-414f-9f74-a649c1dbbcd4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:34.955738 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.955662 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/56dd6b67-08fc-414f-9f74-a649c1dbbcd4-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs\" (UID: \"56dd6b67-08fc-414f-9f74-a649c1dbbcd4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:34.955848 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.955769 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56dd6b67-08fc-414f-9f74-a649c1dbbcd4-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs\" (UID: \"56dd6b67-08fc-414f-9f74-a649c1dbbcd4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:34.957449 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.957424 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/56dd6b67-08fc-414f-9f74-a649c1dbbcd4-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs\" (UID: \"56dd6b67-08fc-414f-9f74-a649c1dbbcd4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:34.958000 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.957982 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/56dd6b67-08fc-414f-9f74-a649c1dbbcd4-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs\" (UID: \"56dd6b67-08fc-414f-9f74-a649c1dbbcd4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:34.962817 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:34.962793 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blsgq\" (UniqueName: \"kubernetes.io/projected/56dd6b67-08fc-414f-9f74-a649c1dbbcd4-kube-api-access-blsgq\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs\" (UID: \"56dd6b67-08fc-414f-9f74-a649c1dbbcd4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:35.108087 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:35.108008 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:36.621356 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:36.621331 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs"] Apr 21 02:53:36.622569 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:53:36.622542 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56dd6b67_08fc_414f_9f74_a649c1dbbcd4.slice/crio-dbdd13597ce978defe760f22d9e62dc2423ac924e651d15e1008986f3fc8c5d2 WatchSource:0}: Error finding container dbdd13597ce978defe760f22d9e62dc2423ac924e651d15e1008986f3fc8c5d2: Status 404 returned error can't find the container with id dbdd13597ce978defe760f22d9e62dc2423ac924e651d15e1008986f3fc8c5d2 Apr 21 02:53:36.777068 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:36.776972 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" 
event={"ID":"56dd6b67-08fc-414f-9f74-a649c1dbbcd4","Type":"ContainerStarted","Data":"8204b5790d074bc3a3cbbbe6cc7a88556ec54094ce0430f1a78c4f8dcd0f478f"} Apr 21 02:53:36.777068 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:36.777016 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" event={"ID":"56dd6b67-08fc-414f-9f74-a649c1dbbcd4","Type":"ContainerStarted","Data":"dbdd13597ce978defe760f22d9e62dc2423ac924e651d15e1008986f3fc8c5d2"} Apr 21 02:53:36.778789 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:36.778762 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" event={"ID":"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a","Type":"ContainerStarted","Data":"d0e1b001682d1d51da40de54160251b5f141a006d08155909bc459254382a65a"} Apr 21 02:53:42.808846 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:42.808811 2564 generic.go:358] "Generic (PLEG): container finished" podID="56dd6b67-08fc-414f-9f74-a649c1dbbcd4" containerID="8204b5790d074bc3a3cbbbe6cc7a88556ec54094ce0430f1a78c4f8dcd0f478f" exitCode=0 Apr 21 02:53:42.809274 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:42.808886 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" event={"ID":"56dd6b67-08fc-414f-9f74-a649c1dbbcd4","Type":"ContainerDied","Data":"8204b5790d074bc3a3cbbbe6cc7a88556ec54094ce0430f1a78c4f8dcd0f478f"} Apr 21 02:53:44.820444 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:44.820411 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" event={"ID":"56dd6b67-08fc-414f-9f74-a649c1dbbcd4","Type":"ContainerStarted","Data":"8d53dc67c9ef5579dd597289e8c577fe34596041580ab3cca4991ebf10e2e89b"} Apr 21 02:53:44.820890 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:44.820670 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:53:44.821849 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:44.821823 2564 generic.go:358] "Generic (PLEG): container finished" podID="86175c9f-1e2d-4016-9daa-ca9c1aa97e0a" containerID="d0e1b001682d1d51da40de54160251b5f141a006d08155909bc459254382a65a" exitCode=0 Apr 21 02:53:44.821938 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:44.821895 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" event={"ID":"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a","Type":"ContainerDied","Data":"d0e1b001682d1d51da40de54160251b5f141a006d08155909bc459254382a65a"} Apr 21 02:53:44.840351 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:44.840312 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" podStartSLOduration=9.670074836 podStartE2EDuration="10.840301221s" podCreationTimestamp="2026-04-21 02:53:34 +0000 UTC" firstStartedPulling="2026-04-21 02:53:42.809534031 +0000 UTC m=+755.577022697" lastFinishedPulling="2026-04-21 02:53:43.979760414 +0000 UTC m=+756.747249082" observedRunningTime="2026-04-21 02:53:44.836230102 +0000 UTC m=+757.603718790" watchObservedRunningTime="2026-04-21 02:53:44.840301221 +0000 UTC m=+757.607789910" Apr 21 02:53:45.827672 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:45.827630 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" event={"ID":"86175c9f-1e2d-4016-9daa-ca9c1aa97e0a","Type":"ContainerStarted","Data":"b131841a4c3a4ba071f2154d8029c53c96f600b99386910df14970ac7f3f900a"} Apr 21 02:53:45.848415 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:45.848365 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" podStartSLOduration=2.101333696 podStartE2EDuration="16.848351384s" 
podCreationTimestamp="2026-04-21 02:53:29 +0000 UTC" firstStartedPulling="2026-04-21 02:53:30.246147067 +0000 UTC m=+743.013635737" lastFinishedPulling="2026-04-21 02:53:44.993164753 +0000 UTC m=+757.760653425" observedRunningTime="2026-04-21 02:53:45.846692331 +0000 UTC m=+758.614181021" watchObservedRunningTime="2026-04-21 02:53:45.848351384 +0000 UTC m=+758.615840125" Apr 21 02:53:55.828586 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:55.828548 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:55.851979 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:55.851951 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xb628" Apr 21 02:53:55.852154 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:53:55.852006 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs" Apr 21 02:54:37.569381 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.569302 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9"] Apr 21 02:54:37.572890 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.572874 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:37.575141 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.575117 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 21 02:54:37.583620 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.583600 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9"] Apr 21 02:54:37.686991 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.686960 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/af54e6da-777e-4c7b-9cd2-4b555fc0bb4e-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9\" (UID: \"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:37.687138 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.687004 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/af54e6da-777e-4c7b-9cd2-4b555fc0bb4e-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9\" (UID: \"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:37.687138 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.687036 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/af54e6da-777e-4c7b-9cd2-4b555fc0bb4e-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9\" (UID: \"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:37.687138 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.687074 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/af54e6da-777e-4c7b-9cd2-4b555fc0bb4e-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9\" (UID: \"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:37.687138 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.687120 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af54e6da-777e-4c7b-9cd2-4b555fc0bb4e-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9\" (UID: \"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:37.687294 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.687143 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqdgw\" (UniqueName: \"kubernetes.io/projected/af54e6da-777e-4c7b-9cd2-4b555fc0bb4e-kube-api-access-bqdgw\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9\" (UID: \"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:37.788399 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.788370 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/af54e6da-777e-4c7b-9cd2-4b555fc0bb4e-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9\" (UID: \"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:37.788583 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.788411 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/af54e6da-777e-4c7b-9cd2-4b555fc0bb4e-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9\" (UID: 
\"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:37.788583 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.788441 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/af54e6da-777e-4c7b-9cd2-4b555fc0bb4e-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9\" (UID: \"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:37.788583 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.788559 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/af54e6da-777e-4c7b-9cd2-4b555fc0bb4e-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9\" (UID: \"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:37.788759 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.788652 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af54e6da-777e-4c7b-9cd2-4b555fc0bb4e-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9\" (UID: \"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:37.788759 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.788693 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqdgw\" (UniqueName: \"kubernetes.io/projected/af54e6da-777e-4c7b-9cd2-4b555fc0bb4e-kube-api-access-bqdgw\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9\" (UID: \"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:37.788868 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.788838 2564 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/af54e6da-777e-4c7b-9cd2-4b555fc0bb4e-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9\" (UID: \"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:37.788989 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.788965 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/af54e6da-777e-4c7b-9cd2-4b555fc0bb4e-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9\" (UID: \"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:37.789139 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.789109 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af54e6da-777e-4c7b-9cd2-4b555fc0bb4e-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9\" (UID: \"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:37.790896 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.790873 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/af54e6da-777e-4c7b-9cd2-4b555fc0bb4e-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9\" (UID: \"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:37.791152 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.791132 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/af54e6da-777e-4c7b-9cd2-4b555fc0bb4e-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9\" (UID: \"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 
02:54:37.796241 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.796215 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqdgw\" (UniqueName: \"kubernetes.io/projected/af54e6da-777e-4c7b-9cd2-4b555fc0bb4e-kube-api-access-bqdgw\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9\" (UID: \"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:37.883673 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:37.883605 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:38.012707 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:38.012681 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9"] Apr 21 02:54:38.013664 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:54:38.013632 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf54e6da_777e_4c7b_9cd2_4b555fc0bb4e.slice/crio-ee7e429dfde0d35dea0e082e6f0fc044432a36a27f0d49a8c56950e8bed35493 WatchSource:0}: Error finding container ee7e429dfde0d35dea0e082e6f0fc044432a36a27f0d49a8c56950e8bed35493: Status 404 returned error can't find the container with id ee7e429dfde0d35dea0e082e6f0fc044432a36a27f0d49a8c56950e8bed35493 Apr 21 02:54:38.035456 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:38.035431 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" event={"ID":"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e","Type":"ContainerStarted","Data":"ee7e429dfde0d35dea0e082e6f0fc044432a36a27f0d49a8c56950e8bed35493"} Apr 21 02:54:39.041353 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:39.041318 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" 
event={"ID":"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e","Type":"ContainerStarted","Data":"6a997b056698da2701683b58c8430e31d9451ef28c936f0eeeb0a93f5585af86"} Apr 21 02:54:44.063202 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:44.063163 2564 generic.go:358] "Generic (PLEG): container finished" podID="af54e6da-777e-4c7b-9cd2-4b555fc0bb4e" containerID="6a997b056698da2701683b58c8430e31d9451ef28c936f0eeeb0a93f5585af86" exitCode=0 Apr 21 02:54:44.063594 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:44.063233 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" event={"ID":"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e","Type":"ContainerDied","Data":"6a997b056698da2701683b58c8430e31d9451ef28c936f0eeeb0a93f5585af86"} Apr 21 02:54:45.068891 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:45.068853 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" event={"ID":"af54e6da-777e-4c7b-9cd2-4b555fc0bb4e","Type":"ContainerStarted","Data":"c54f37bcbcc089455d2094dda2836725fd6967052c9e71cea435650de3ec5fb6"} Apr 21 02:54:45.069349 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:45.069255 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" Apr 21 02:54:45.086255 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:45.086210 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9" podStartSLOduration=7.91230632 podStartE2EDuration="8.086194626s" podCreationTimestamp="2026-04-21 02:54:37 +0000 UTC" firstStartedPulling="2026-04-21 02:54:44.063861409 +0000 UTC m=+816.831350076" lastFinishedPulling="2026-04-21 02:54:44.237749712 +0000 UTC m=+817.005238382" observedRunningTime="2026-04-21 02:54:45.084811425 +0000 UTC m=+817.852300151" watchObservedRunningTime="2026-04-21 02:54:45.086194626 +0000 UTC m=+817.853683314" Apr 21 
02:54:56.085950 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:54:56.085918 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9"
Apr 21 02:55:43.751181 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:43.751152 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-ff9579c68-hb99r_0dbabaa1-121a-41fd-ae64-6aa58e065163/maas-api/0.log"
Apr 21 02:55:43.857898 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:43.857874 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-fbc547765-wdfkp_bcd6fd77-71fe-4c75-bc81-f8ee3d33e5eb/manager/0.log"
Apr 21 02:55:44.077993 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:44.077926 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5f4d6bff-5ppcc_d3385a01-1bf2-49ca-b232-f651e0f598f1/manager/0.log"
Apr 21 02:55:45.240344 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:45.240309 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d_6b9875cd-7a08-475a-aee0-931f0e0008f4/util/0.log"
Apr 21 02:55:45.251025 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:45.251005 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d_6b9875cd-7a08-475a-aee0-931f0e0008f4/pull/0.log"
Apr 21 02:55:45.256693 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:45.256676 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d_6b9875cd-7a08-475a-aee0-931f0e0008f4/extract/0.log"
Apr 21 02:55:45.377726 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:45.377704 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd_e43cc539-5833-43d0-a560-b80fc0e4fa0c/util/0.log"
Apr 21 02:55:45.384742 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:45.384721 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd_e43cc539-5833-43d0-a560-b80fc0e4fa0c/pull/0.log"
Apr 21 02:55:45.391227 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:45.391206 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd_e43cc539-5833-43d0-a560-b80fc0e4fa0c/extract/0.log"
Apr 21 02:55:45.505160 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:45.505128 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw_733ca859-ae1f-43e6-9b40-b2ff9828431a/util/0.log"
Apr 21 02:55:45.511013 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:45.510994 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw_733ca859-ae1f-43e6-9b40-b2ff9828431a/pull/0.log"
Apr 21 02:55:45.516448 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:45.516432 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw_733ca859-ae1f-43e6-9b40-b2ff9828431a/extract/0.log"
Apr 21 02:55:45.638239 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:45.638213 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz_050bdd0b-9b38-4465-a197-62966aab158c/util/0.log"
Apr 21 02:55:45.644240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:45.644223 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz_050bdd0b-9b38-4465-a197-62966aab158c/pull/0.log"
Apr 21 02:55:45.659695 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:45.659668 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz_050bdd0b-9b38-4465-a197-62966aab158c/extract/0.log"
Apr 21 02:55:45.894638 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:45.894560 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-nc6mj_f1841a2f-46f4-4fa8-8a78-0d4bb4dada45/manager/0.log"
Apr 21 02:55:46.235732 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:46.235623 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-29vgg_f4891c25-20cb-4575-b7ba-5f7b41c2f140/registry-server/0.log"
Apr 21 02:55:46.347032 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:46.347008 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-pj599_b00efc7b-4cd0-48ea-933b-e8bd91ebef92/manager/0.log"
Apr 21 02:55:46.602995 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:46.602970 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-2prqc_02dd9222-a726-441e-a972-b99b04438f38/manager/0.log"
Apr 21 02:55:46.943342 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:46.943266 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557fgw7wh_57b32747-a408-4d12-b996-fc025af345f9/istio-proxy/0.log"
Apr 21 02:55:47.802229 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:47.802205 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs_56dd6b67-08fc-414f-9f74-a649c1dbbcd4/storage-initializer/0.log"
Apr 21 02:55:47.809038 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:47.809018 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-gdqbs_56dd6b67-08fc-414f-9f74-a649c1dbbcd4/main/0.log"
Apr 21 02:55:47.914161 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:47.914135 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9_af54e6da-777e-4c7b-9cd2-4b555fc0bb4e/storage-initializer/0.log"
Apr 21 02:55:47.921175 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:47.921156 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-vkgx9_af54e6da-777e-4c7b-9cd2-4b555fc0bb4e/main/0.log"
Apr 21 02:55:48.388095 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:48.388067 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-xb628_86175c9f-1e2d-4016-9daa-ca9c1aa97e0a/main/0.log"
Apr 21 02:55:48.394559 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:48.394535 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-xb628_86175c9f-1e2d-4016-9daa-ca9c1aa97e0a/storage-initializer/0.log"
Apr 21 02:55:51.890507 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:51.890461 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tmjwp/must-gather-clbnt"]
Apr 21 02:55:51.894483 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:51.894465 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmjwp/must-gather-clbnt"
Apr 21 02:55:51.897151 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:51.897123 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmjwp\"/\"openshift-service-ca.crt\""
Apr 21 02:55:51.897289 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:51.897152 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tmjwp\"/\"default-dockercfg-rx8jc\""
Apr 21 02:55:51.898094 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:51.898072 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmjwp\"/\"kube-root-ca.crt\""
Apr 21 02:55:51.899374 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:51.899341 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmjwp/must-gather-clbnt"]
Apr 21 02:55:52.010471 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:52.010446 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mrtv\" (UniqueName: \"kubernetes.io/projected/9d7a35be-56e5-4ffc-9b86-48138c097859-kube-api-access-8mrtv\") pod \"must-gather-clbnt\" (UID: \"9d7a35be-56e5-4ffc-9b86-48138c097859\") " pod="openshift-must-gather-tmjwp/must-gather-clbnt"
Apr 21 02:55:52.010581 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:52.010490 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d7a35be-56e5-4ffc-9b86-48138c097859-must-gather-output\") pod \"must-gather-clbnt\" (UID: \"9d7a35be-56e5-4ffc-9b86-48138c097859\") " pod="openshift-must-gather-tmjwp/must-gather-clbnt"
Apr 21 02:55:52.111445 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:52.111382 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mrtv\" (UniqueName: \"kubernetes.io/projected/9d7a35be-56e5-4ffc-9b86-48138c097859-kube-api-access-8mrtv\") pod \"must-gather-clbnt\" (UID: \"9d7a35be-56e5-4ffc-9b86-48138c097859\") " pod="openshift-must-gather-tmjwp/must-gather-clbnt"
Apr 21 02:55:52.111445 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:52.111425 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d7a35be-56e5-4ffc-9b86-48138c097859-must-gather-output\") pod \"must-gather-clbnt\" (UID: \"9d7a35be-56e5-4ffc-9b86-48138c097859\") " pod="openshift-must-gather-tmjwp/must-gather-clbnt"
Apr 21 02:55:52.111787 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:52.111770 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d7a35be-56e5-4ffc-9b86-48138c097859-must-gather-output\") pod \"must-gather-clbnt\" (UID: \"9d7a35be-56e5-4ffc-9b86-48138c097859\") " pod="openshift-must-gather-tmjwp/must-gather-clbnt"
Apr 21 02:55:52.120726 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:52.120706 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mrtv\" (UniqueName: \"kubernetes.io/projected/9d7a35be-56e5-4ffc-9b86-48138c097859-kube-api-access-8mrtv\") pod \"must-gather-clbnt\" (UID: \"9d7a35be-56e5-4ffc-9b86-48138c097859\") " pod="openshift-must-gather-tmjwp/must-gather-clbnt"
Apr 21 02:55:52.229360 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:52.229294 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmjwp/must-gather-clbnt"
Apr 21 02:55:52.356608 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:52.356582 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmjwp/must-gather-clbnt"]
Apr 21 02:55:52.357842 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:55:52.357818 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d7a35be_56e5_4ffc_9b86_48138c097859.slice/crio-0e2c3fa503461fa3d35af9fa701c29dba97724e785e089e4d02495d48f7b93ab WatchSource:0}: Error finding container 0e2c3fa503461fa3d35af9fa701c29dba97724e785e089e4d02495d48f7b93ab: Status 404 returned error can't find the container with id 0e2c3fa503461fa3d35af9fa701c29dba97724e785e089e4d02495d48f7b93ab
Apr 21 02:55:53.355035 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:53.355000 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmjwp/must-gather-clbnt" event={"ID":"9d7a35be-56e5-4ffc-9b86-48138c097859","Type":"ContainerStarted","Data":"0e2c3fa503461fa3d35af9fa701c29dba97724e785e089e4d02495d48f7b93ab"}
Apr 21 02:55:54.362828 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:54.362784 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmjwp/must-gather-clbnt" event={"ID":"9d7a35be-56e5-4ffc-9b86-48138c097859","Type":"ContainerStarted","Data":"f0cf218de994dfaf7501e21eea584e05134559af3912d263f388d7f5eab4129e"}
Apr 21 02:55:54.363292 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:54.362837 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmjwp/must-gather-clbnt" event={"ID":"9d7a35be-56e5-4ffc-9b86-48138c097859","Type":"ContainerStarted","Data":"aeb6f1b95e505ed78695fcce05a66715fba7061a6b2410ae97a7a103d292a71a"}
Apr 21 02:55:54.382176 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:54.382117 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tmjwp/must-gather-clbnt" podStartSLOduration=2.432336208 podStartE2EDuration="3.382098627s" podCreationTimestamp="2026-04-21 02:55:51 +0000 UTC" firstStartedPulling="2026-04-21 02:55:52.359471761 +0000 UTC m=+885.126960427" lastFinishedPulling="2026-04-21 02:55:53.309234175 +0000 UTC m=+886.076722846" observedRunningTime="2026-04-21 02:55:54.378021126 +0000 UTC m=+887.145509816" watchObservedRunningTime="2026-04-21 02:55:54.382098627 +0000 UTC m=+887.149587317"
Apr 21 02:55:54.880569 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:54.880542 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-qf6p4_6826d476-dc74-4776-a925-ed13337326a0/global-pull-secret-syncer/0.log"
Apr 21 02:55:55.056724 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:55.056696 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-zf9vv_ac41a7d4-62b5-4566-bcb2-cd59838f5120/konnectivity-agent/0.log"
Apr 21 02:55:55.077719 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:55.077693 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-170.ec2.internal_59358f97ed8d625f16c9a2fd8e43b833/haproxy/0.log"
Apr 21 02:55:58.859379 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:58.859334 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d_6b9875cd-7a08-475a-aee0-931f0e0008f4/extract/0.log"
Apr 21 02:55:58.885660 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:58.885560 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d_6b9875cd-7a08-475a-aee0-931f0e0008f4/util/0.log"
Apr 21 02:55:58.916033 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:58.915980 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594p92d_6b9875cd-7a08-475a-aee0-931f0e0008f4/pull/0.log"
Apr 21 02:55:58.945647 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:58.945619 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd_e43cc539-5833-43d0-a560-b80fc0e4fa0c/extract/0.log"
Apr 21 02:55:58.972103 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:58.972069 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd_e43cc539-5833-43d0-a560-b80fc0e4fa0c/util/0.log"
Apr 21 02:55:58.997339 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:58.997309 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xgwsd_e43cc539-5833-43d0-a560-b80fc0e4fa0c/pull/0.log"
Apr 21 02:55:59.025560 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:59.025529 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw_733ca859-ae1f-43e6-9b40-b2ff9828431a/extract/0.log"
Apr 21 02:55:59.046548 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:59.046492 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw_733ca859-ae1f-43e6-9b40-b2ff9828431a/util/0.log"
Apr 21 02:55:59.072523 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:59.072451 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q2fjw_733ca859-ae1f-43e6-9b40-b2ff9828431a/pull/0.log"
Apr 21 02:55:59.111419 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:59.111342 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz_050bdd0b-9b38-4465-a197-62966aab158c/extract/0.log"
Apr 21 02:55:59.133042 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:59.133017 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz_050bdd0b-9b38-4465-a197-62966aab158c/util/0.log"
Apr 21 02:55:59.157408 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:59.157378 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t7qlz_050bdd0b-9b38-4465-a197-62966aab158c/pull/0.log"
Apr 21 02:55:59.254998 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:59.254963 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-nc6mj_f1841a2f-46f4-4fa8-8a78-0d4bb4dada45/manager/0.log"
Apr 21 02:55:59.347047 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:59.347014 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-29vgg_f4891c25-20cb-4575-b7ba-5f7b41c2f140/registry-server/0.log"
Apr 21 02:55:59.394488 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:59.394402 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-pj599_b00efc7b-4cd0-48ea-933b-e8bd91ebef92/manager/0.log"
Apr 21 02:55:59.458039 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:55:59.457995 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-2prqc_02dd9222-a726-441e-a972-b99b04438f38/manager/0.log"
Apr 21 02:56:01.084317 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:01.084216 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-fh8kw_d2caecf8-8d57-4c32-a979-ebd9271526a5/cluster-monitoring-operator/0.log"
Apr 21 02:56:01.112346 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:01.112275 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xzcxn_bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8/kube-state-metrics/0.log"
Apr 21 02:56:01.131330 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:01.131302 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xzcxn_bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8/kube-rbac-proxy-main/0.log"
Apr 21 02:56:01.155624 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:01.155580 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xzcxn_bc9d2f69-ae25-4694-bb4a-a880e7e9d3f8/kube-rbac-proxy-self/0.log"
Apr 21 02:56:01.186048 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:01.185902 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-64cf96cf4f-pmz7q_511a0b61-11aa-4e50-8cae-9f8fd2978a00/metrics-server/0.log"
Apr 21 02:56:01.211721 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:01.211690 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-5qc6q_518ac0b9-1cbc-45a2-a669-593b5e3aacf4/monitoring-plugin/0.log"
Apr 21 02:56:01.245733 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:01.245710 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6v2w9_a19ff527-8170-4894-b42e-82cf578a53b8/node-exporter/0.log"
Apr 21 02:56:01.269136 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:01.269105 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6v2w9_a19ff527-8170-4894-b42e-82cf578a53b8/kube-rbac-proxy/0.log"
Apr 21 02:56:01.303782 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:01.303757 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6v2w9_a19ff527-8170-4894-b42e-82cf578a53b8/init-textfile/0.log"
Apr 21 02:56:01.513677 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:01.513643 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zjcs7_b71279ec-544a-4616-926a-e46f30b44e79/kube-rbac-proxy-main/0.log"
Apr 21 02:56:01.537084 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:01.537051 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zjcs7_b71279ec-544a-4616-926a-e46f30b44e79/kube-rbac-proxy-self/0.log"
Apr 21 02:56:01.558747 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:01.558717 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zjcs7_b71279ec-544a-4616-926a-e46f30b44e79/openshift-state-metrics/0.log"
Apr 21 02:56:01.852589 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:01.852492 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9dm6g_241f3ec9-1e35-4441-b5c9-a053f2a3307c/prometheus-operator/0.log"
Apr 21 02:56:01.870917 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:01.870889 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9dm6g_241f3ec9-1e35-4441-b5c9-a053f2a3307c/kube-rbac-proxy/0.log"
Apr 21 02:56:01.932843 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:01.932816 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-586bd48bcb-glvsc_78ae28be-3993-4b8b-ab62-b6f09e278983/telemeter-client/0.log"
Apr 21 02:56:01.957702 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:01.957675 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-586bd48bcb-glvsc_78ae28be-3993-4b8b-ab62-b6f09e278983/reload/0.log"
Apr 21 02:56:01.977945 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:01.977922 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-586bd48bcb-glvsc_78ae28be-3993-4b8b-ab62-b6f09e278983/kube-rbac-proxy/0.log"
Apr 21 02:56:03.247893 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.247872 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-8lx74_c9cd2e96-ef84-49dd-b7ca-9fb0cc36aab2/networking-console-plugin/0.log"
Apr 21 02:56:03.569188 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.569156 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"]
Apr 21 02:56:03.574666 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.574641 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:03.584596 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.584251 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"]
Apr 21 02:56:03.632987 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.632951 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8e286556-2058-442c-a4e4-f6cf034d66d5-proc\") pod \"perf-node-gather-daemonset-vc8vs\" (UID: \"8e286556-2058-442c-a4e4-f6cf034d66d5\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:03.633132 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.633000 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6pfr\" (UniqueName: \"kubernetes.io/projected/8e286556-2058-442c-a4e4-f6cf034d66d5-kube-api-access-q6pfr\") pod \"perf-node-gather-daemonset-vc8vs\" (UID: \"8e286556-2058-442c-a4e4-f6cf034d66d5\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:03.633132 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.633095 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e286556-2058-442c-a4e4-f6cf034d66d5-sys\") pod \"perf-node-gather-daemonset-vc8vs\" (UID: \"8e286556-2058-442c-a4e4-f6cf034d66d5\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:03.633132 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.633119 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8e286556-2058-442c-a4e4-f6cf034d66d5-podres\") pod \"perf-node-gather-daemonset-vc8vs\" (UID: \"8e286556-2058-442c-a4e4-f6cf034d66d5\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:03.633243 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.633150 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8e286556-2058-442c-a4e4-f6cf034d66d5-lib-modules\") pod \"perf-node-gather-daemonset-vc8vs\" (UID: \"8e286556-2058-442c-a4e4-f6cf034d66d5\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:03.735243 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.734255 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8e286556-2058-442c-a4e4-f6cf034d66d5-proc\") pod \"perf-node-gather-daemonset-vc8vs\" (UID: \"8e286556-2058-442c-a4e4-f6cf034d66d5\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:03.735243 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.734311 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6pfr\" (UniqueName: \"kubernetes.io/projected/8e286556-2058-442c-a4e4-f6cf034d66d5-kube-api-access-q6pfr\") pod \"perf-node-gather-daemonset-vc8vs\" (UID: \"8e286556-2058-442c-a4e4-f6cf034d66d5\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:03.735243 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.734404 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e286556-2058-442c-a4e4-f6cf034d66d5-sys\") pod \"perf-node-gather-daemonset-vc8vs\" (UID: \"8e286556-2058-442c-a4e4-f6cf034d66d5\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:03.735243 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.734429 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8e286556-2058-442c-a4e4-f6cf034d66d5-podres\") pod \"perf-node-gather-daemonset-vc8vs\" (UID: \"8e286556-2058-442c-a4e4-f6cf034d66d5\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:03.735243 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.734463 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8e286556-2058-442c-a4e4-f6cf034d66d5-lib-modules\") pod \"perf-node-gather-daemonset-vc8vs\" (UID: \"8e286556-2058-442c-a4e4-f6cf034d66d5\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:03.735243 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.734672 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8e286556-2058-442c-a4e4-f6cf034d66d5-lib-modules\") pod \"perf-node-gather-daemonset-vc8vs\" (UID: \"8e286556-2058-442c-a4e4-f6cf034d66d5\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:03.735243 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.734744 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8e286556-2058-442c-a4e4-f6cf034d66d5-proc\") pod \"perf-node-gather-daemonset-vc8vs\" (UID: \"8e286556-2058-442c-a4e4-f6cf034d66d5\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:03.735243 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.735099 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e286556-2058-442c-a4e4-f6cf034d66d5-sys\") pod \"perf-node-gather-daemonset-vc8vs\" (UID: \"8e286556-2058-442c-a4e4-f6cf034d66d5\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:03.735243 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.735201 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8e286556-2058-442c-a4e4-f6cf034d66d5-podres\") pod \"perf-node-gather-daemonset-vc8vs\" (UID: \"8e286556-2058-442c-a4e4-f6cf034d66d5\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:03.743396 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.743360 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6pfr\" (UniqueName: \"kubernetes.io/projected/8e286556-2058-442c-a4e4-f6cf034d66d5-kube-api-access-q6pfr\") pod \"perf-node-gather-daemonset-vc8vs\" (UID: \"8e286556-2058-442c-a4e4-f6cf034d66d5\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:03.891451 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:03.891342 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:04.047195 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:04.047154 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"]
Apr 21 02:56:04.049621 ip-10-0-131-170 kubenswrapper[2564]: W0421 02:56:04.049530 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8e286556_2058_442c_a4e4_f6cf034d66d5.slice/crio-f37eba2e8c8cd1bb088d788718c30bc26fcc68ca0c7e7f36097106863ff48528 WatchSource:0}: Error finding container f37eba2e8c8cd1bb088d788718c30bc26fcc68ca0c7e7f36097106863ff48528: Status 404 returned error can't find the container with id f37eba2e8c8cd1bb088d788718c30bc26fcc68ca0c7e7f36097106863ff48528
Apr 21 02:56:04.283738 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:04.283711 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5bcfbcf5f6-p6r48_eb89eed7-05bb-4217-a98b-d918f675868f/console/0.log"
Apr 21 02:56:04.321071 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:04.321032 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-8zx8l_56684ff1-c656-4b66-8fa3-541d09278ff9/download-server/0.log"
Apr 21 02:56:04.420195 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:04.420154 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs" event={"ID":"8e286556-2058-442c-a4e4-f6cf034d66d5","Type":"ContainerStarted","Data":"1243fea2a82c9d880741c1a67370aa60dab3e402856f325e6d9d7c371673fca5"}
Apr 21 02:56:04.420195 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:04.420201 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs" event={"ID":"8e286556-2058-442c-a4e4-f6cf034d66d5","Type":"ContainerStarted","Data":"f37eba2e8c8cd1bb088d788718c30bc26fcc68ca0c7e7f36097106863ff48528"}
Apr 21 02:56:04.420436 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:04.420308 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:04.438589 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:04.438545 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs" podStartSLOduration=1.438531855 podStartE2EDuration="1.438531855s" podCreationTimestamp="2026-04-21 02:56:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:56:04.434909283 +0000 UTC m=+897.202397976" watchObservedRunningTime="2026-04-21 02:56:04.438531855 +0000 UTC m=+897.206020543"
Apr 21 02:56:04.827246 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:04.827213 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-gq5w7_f19baa1b-c32f-481f-bc3f-7a7906d3049e/volume-data-source-validator/0.log"
Apr 21 02:56:05.617577 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:05.617550 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4mh7f_44863844-48fa-48ba-82c7-863e1d932c72/dns/0.log"
Apr 21 02:56:05.639732 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:05.639692 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4mh7f_44863844-48fa-48ba-82c7-863e1d932c72/kube-rbac-proxy/0.log"
Apr 21 02:56:05.744948 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:05.744925 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w659n_4a693e96-2533-450c-a8c8-de3f4cdfcd73/dns-node-resolver/0.log"
Apr 21 02:56:06.210720 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:06.210696 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-f97sw_78ee76ed-3959-4c8b-8f2c-d057e4bd15db/node-ca/0.log"
Apr 21 02:56:07.100853 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:07.100823 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557fgw7wh_57b32747-a408-4d12-b996-fc025af345f9/istio-proxy/0.log"
Apr 21 02:56:07.753987 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:07.753960 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-wlfmp_c2a0c6c7-f298-43d3-a0c2-a33cca035a70/serve-healthcheck-canary/0.log"
Apr 21 02:56:07.932382 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:07.932271 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/ovn-acl-logging/0.log"
Apr 21 02:56:07.946082 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:07.935357 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/ovn-acl-logging/0.log"
Apr 21 02:56:08.270831 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:08.270806 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-8rwzw_75d5f030-7365-4e7d-92ef-15593dbe87f9/insights-operator/1.log"
Apr 21 02:56:08.277352 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:08.277327 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-8rwzw_75d5f030-7365-4e7d-92ef-15593dbe87f9/insights-operator/0.log"
Apr 21 02:56:08.425344 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:08.425322 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-z6lqg_7f22dea5-13ae-41ea-8a44-0677d956ef0b/kube-rbac-proxy/0.log"
Apr 21 02:56:08.444077 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:08.444057 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-z6lqg_7f22dea5-13ae-41ea-8a44-0677d956ef0b/exporter/0.log"
Apr 21 02:56:08.463602 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:08.463584 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-z6lqg_7f22dea5-13ae-41ea-8a44-0677d956ef0b/extractor/0.log"
Apr 21 02:56:10.317370 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:10.317344 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-ff9579c68-hb99r_0dbabaa1-121a-41fd-ae64-6aa58e065163/maas-api/0.log"
Apr 21 02:56:10.343587 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:10.343555 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-fbc547765-wdfkp_bcd6fd77-71fe-4c75-bc81-f8ee3d33e5eb/manager/0.log"
Apr 21 02:56:10.392969 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:10.392943 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5f4d6bff-5ppcc_d3385a01-1bf2-49ca-b232-f651e0f598f1/manager/0.log"
Apr 21 02:56:10.438147 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:10.438114 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-vc8vs"
Apr 21 02:56:11.732659 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:11.732631 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-64dc57f969-r8kj9_69f41614-cb4c-43c1-86b6-431d8ffb9de8/manager/0.log"
Apr 21 02:56:11.755607 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:11.755575 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-86vw7_d957ce2d-1bc6-4ca8-b1bd-fbb0d792a669/openshift-lws-operator/0.log"
Apr 21 02:56:16.207240 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:16.207199 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-6lqvm_874628bd-c537-4903-b0dd-c66cce097f9e/migrator/0.log"
Apr 21 02:56:16.226418 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:16.226391 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-6lqvm_874628bd-c537-4903-b0dd-c66cce097f9e/graceful-termination/0.log"
Apr 21 02:56:17.620584 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:17.620559 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2vcn8_d9fa5381-4556-4f6c-91d2-c5b4580df414/kube-multus-additional-cni-plugins/0.log"
Apr 21 02:56:17.644355 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:17.644322 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2vcn8_d9fa5381-4556-4f6c-91d2-c5b4580df414/egress-router-binary-copy/0.log"
Apr 21 02:56:17.667364 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:17.667340 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2vcn8_d9fa5381-4556-4f6c-91d2-c5b4580df414/cni-plugins/0.log"
Apr 21 02:56:17.691531 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:17.691491 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2vcn8_d9fa5381-4556-4f6c-91d2-c5b4580df414/bond-cni-plugin/0.log"
Apr 21 02:56:17.714020 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:17.714004 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2vcn8_d9fa5381-4556-4f6c-91d2-c5b4580df414/routeoverride-cni/0.log"
Apr 21 02:56:17.737615 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:17.737593 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2vcn8_d9fa5381-4556-4f6c-91d2-c5b4580df414/whereabouts-cni-bincopy/0.log"
Apr 21 02:56:17.756546 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:17.756525 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2vcn8_d9fa5381-4556-4f6c-91d2-c5b4580df414/whereabouts-cni/0.log"
Apr 21 02:56:18.133455 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:18.133425 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v22pg_9cb70894-b83f-4e21-9932-c5cb64320169/kube-multus/0.log"
Apr 21 02:56:18.193857 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:18.193831 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-77bqp_da0d17ad-bc94-4499-bb04-b7e0df549a24/network-metrics-daemon/0.log"
Apr 21 02:56:18.211451 ip-10-0-131-170 kubenswrapper[2564]:
I0421 02:56:18.211429 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-77bqp_da0d17ad-bc94-4499-bb04-b7e0df549a24/kube-rbac-proxy/0.log" Apr 21 02:56:19.606633 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:19.606607 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/ovn-controller/0.log" Apr 21 02:56:19.622899 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:19.622878 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/ovn-acl-logging/0.log" Apr 21 02:56:19.627988 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:19.627972 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/ovn-acl-logging/1.log" Apr 21 02:56:19.647054 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:19.647034 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/kube-rbac-proxy-node/0.log" Apr 21 02:56:19.666174 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:19.666144 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/kube-rbac-proxy-ovn-metrics/0.log" Apr 21 02:56:19.684201 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:19.684182 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/northd/0.log" Apr 21 02:56:19.703126 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:19.703106 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/nbdb/0.log" Apr 21 02:56:19.722961 ip-10-0-131-170 kubenswrapper[2564]: 
I0421 02:56:19.722941 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/sbdb/0.log" Apr 21 02:56:19.845041 ip-10-0-131-170 kubenswrapper[2564]: I0421 02:56:19.845001 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8b6t_5071ecdc-0d05-412f-b12d-1289b06373ec/ovnkube-controller/0.log"