Apr 20 17:45:52.236467 ip-10-0-135-49 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 17:45:52.236478 ip-10-0-135-49 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 17:45:52.236485 ip-10-0-135-49 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 17:45:52.236758 ip-10-0-135-49 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 17:46:02.439053 ip-10-0-135-49 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 17:46:02.439073 ip-10-0-135-49 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 847674c369e74892ab9db0c409123203 --
Apr 20 17:48:18.112342 ip-10-0-135-49 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 17:48:18.482178 ip-10-0-135-49 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 17:48:18.482178 ip-10-0-135-49 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 17:48:18.482178 ip-10-0-135-49 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 17:48:18.482178 ip-10-0-135-49 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 17:48:18.482178 ip-10-0-135-49 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 17:48:18.485283 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.485192 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 17:48:18.487467 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487452 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 17:48:18.487467 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487467 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487471 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487476 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487479 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487482 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487485 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487488 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487491 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487493 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487496 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487504 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487507 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487510 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487512 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487515 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487518 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487520 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487522 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487525 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487528 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 17:48:18.487531 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487531 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487533 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487536 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487539 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487542 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487544 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487547 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487550 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487553 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487555 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487557 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487560 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487563 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487565 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487568 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487570 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487573 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487575 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487578 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487580 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 17:48:18.488153 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487583 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487586 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487588 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487591 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487595 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487597 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487600 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487602 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487605 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487607 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487609 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487612 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487614 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487617 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487621 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487624 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487626 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487628 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487631 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487633 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 17:48:18.488687 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487636 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487638 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487641 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487643 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487645 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487648 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487650 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487653 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487655 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487658 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487661 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487663 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487666 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487668 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487673 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487676 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487679 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487682 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487684 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 17:48:18.489220 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487688 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
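The deprecation warnings above say these command-line flags should move into the kubelet config file (the file passed via --config, shown later in this log as /etc/kubernetes/kubelet.conf). A minimal KubeletConfiguration sketch of that migration follows; the concrete values are illustrative assumptions, not taken from this node:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir (path is a hypothetical example)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (amounts are hypothetical)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration has no direct equivalent;
# eviction thresholds replace it (threshold is hypothetical)
evictionHard:
  memory.available: 100Mi
```

Note that --pod-infra-container-image is the exception: per the warning, the sandbox image is now taken from the CRI runtime (CRI-O here), so it belongs in the runtime's configuration rather than this file.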
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487691 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487694 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487697 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487699 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.487701 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488114 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488121 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488125 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488128 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488131 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488133 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488137 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488139 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488142 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488145 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488148 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488150 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488154 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 17:48:18.489972 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488158 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488161 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488165 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488168 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488171 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488174 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488177 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488180 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488182 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488185 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488188 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488191 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488193 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488196 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488199 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488201 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488204 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488206 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488209 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 17:48:18.490808 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488211 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488214 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488217 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488219 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488222 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488224 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488227 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488230 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488232 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488235 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488237 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488240 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488242 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488244 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488247 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488249 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488252 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488254 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488257 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488260 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 17:48:18.491420 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488262 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488265 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488268 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488270 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488274 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488276 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488278 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488281 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488284 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488286 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488289 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488291 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488294 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488296 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488299 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488301 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488304 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488306 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488308 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488311 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 17:48:18.491941 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488315 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488320 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488324 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488328 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488332 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488336 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488340 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488344 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488346 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488349 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488353 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488356 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488359 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.488362 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489056 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489068 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489095 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489104 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489108 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489112 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489117 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 17:48:18.492451 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489122 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489125 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489128 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489135 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489139 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489142 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489145 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489148 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489151 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489154 2577 flags.go:64] FLAG: --cloud-config=""
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489157 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489159 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489163 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489166 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489169 2577 flags.go:64] FLAG: --config-dir=""
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489172 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489176 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489180 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489183 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489186 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489190 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489194 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489197 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489200 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489204 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 17:48:18.492963 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489206 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489211 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489214 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489217 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489221 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489224 2577 flags.go:64] FLAG: --enable-server="true"
Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489227 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489232 2577 flags.go:64] FLAG: --event-burst="100"
Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489235 2577 flags.go:64] FLAG: --event-qps="50"
Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489238 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489242 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489245 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489249 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489252 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489255 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489258 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489261 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489264 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489267 2577 flags.go:64] FLAG:
--experimental-allocatable-ignore-eviction="false" Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489270 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489273 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489276 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489279 2577 flags.go:64] FLAG: --feature-gates="" Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489282 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489285 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 17:48:18.493598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489288 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489292 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489295 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489299 2577 flags.go:64] FLAG: --help="false" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489302 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-135-49.ec2.internal" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489305 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489308 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489311 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 
20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489314 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489318 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489321 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489323 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489327 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489329 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489332 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489335 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489338 2577 flags.go:64] FLAG: --kube-reserved="" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489343 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489346 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489349 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489352 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489355 2577 flags.go:64] FLAG: --lock-file="" Apr 20 17:48:18.494213 
ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489357 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489360 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 17:48:18.494213 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489363 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489369 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489372 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489374 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489377 2577 flags.go:64] FLAG: --logging-format="text" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489380 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489383 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489386 2577 flags.go:64] FLAG: --manifest-url="" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489389 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489394 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489397 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489402 2577 flags.go:64] FLAG: --max-pods="110" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489405 2577 flags.go:64] FLAG: 
--maximum-dead-containers="-1" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489408 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489411 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489415 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489418 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489421 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489424 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489433 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489436 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489440 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489443 2577 flags.go:64] FLAG: --pod-cidr="" Apr 20 17:48:18.494820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489446 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489602 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489617 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: 
I0420 17:48:18.489626 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489633 2577 flags.go:64] FLAG: --port="10250" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489640 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.489647 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a6912dea7f0ed458" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490374 2577 flags.go:64] FLAG: --qos-reserved="" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490384 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490392 2577 flags.go:64] FLAG: --register-node="true" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490399 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490404 2577 flags.go:64] FLAG: --register-with-taints="" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490430 2577 flags.go:64] FLAG: --registry-burst="10" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490435 2577 flags.go:64] FLAG: --registry-qps="5" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490441 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490452 2577 flags.go:64] FLAG: --reserved-memory="" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490458 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490463 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490469 2577 
flags.go:64] FLAG: --rotate-certificates="false" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490474 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490478 2577 flags.go:64] FLAG: --runonce="false" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490483 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490489 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490499 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490504 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490509 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 17:48:18.495443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490514 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490520 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490525 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490529 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490534 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490539 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490550 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 
17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490555 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490560 2577 flags.go:64] FLAG: --system-cgroups="" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490574 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490592 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490597 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490608 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490616 2577 flags.go:64] FLAG: --tls-min-version="" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490622 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490627 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490632 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490637 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490642 2577 flags.go:64] FLAG: --v="2" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490648 2577 flags.go:64] FLAG: --version="false" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490655 2577 flags.go:64] FLAG: --vmodule="" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490666 2577 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.490672 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491149 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 17:48:18.496084 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491159 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491163 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491166 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491169 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491173 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491175 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491178 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491181 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491183 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491187 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 17:48:18.496660 
ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491189 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491192 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491195 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491198 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491200 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491203 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491206 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491208 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491211 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491213 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 17:48:18.496660 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491216 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491219 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491221 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall 
Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491225 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491228 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491231 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491233 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491236 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491238 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491241 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491244 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491246 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491250 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491253 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491256 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 
17:48:18.491259 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491261 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491264 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491267 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 17:48:18.497206 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491269 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491272 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491276 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491280 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491283 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491286 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491289 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491291 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491294 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491297 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491299 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491302 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491304 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491307 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491310 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491312 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 17:48:18.497702 
ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491315 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491318 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491321 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491323 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 17:48:18.497702 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491325 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491328 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491330 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491333 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491335 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491338 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491341 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491345 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491350 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491353 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491356 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491359 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491362 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491366 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491368 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491371 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491373 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491376 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491378 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 17:48:18.498194 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491381 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 17:48:18.498661 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491384 2577 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 17:48:18.498661 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491386 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 17:48:18.498661 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491389 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 17:48:18.498661 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491391 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 17:48:18.498661 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491394 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 17:48:18.498661 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.491396 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 17:48:18.498661 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.492131 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 17:48:18.498661 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.498511 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 17:48:18.498661 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.498528 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 17:48:18.498661 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498580 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 17:48:18.498661 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498586 2577 feature_gate.go:328] unrecognized 
feature gate: IrreconcilableMachineConfig Apr 20 17:48:18.498661 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498590 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 17:48:18.498661 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498593 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 17:48:18.498661 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498596 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 17:48:18.498661 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498599 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 17:48:18.498661 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498601 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498604 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498607 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498610 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498612 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498615 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498617 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498620 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498623 2577 feature_gate.go:328] unrecognized feature 
gate: VSphereMixedNodeEnv Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498626 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498628 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498633 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498638 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498641 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498644 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498647 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498650 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498653 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498656 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498659 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 17:48:18.499121 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498661 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 17:48:18.499614 
ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498664 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498667 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498669 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498672 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498676 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498679 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498682 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498684 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498687 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498689 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498692 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498694 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498697 2577 feature_gate.go:328] unrecognized 
feature gate: VSphereMultiNetworks Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498700 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498703 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498705 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498708 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498710 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498713 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 17:48:18.499614 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498716 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 17:48:18.500136 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498718 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 17:48:18.500136 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498721 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 17:48:18.500136 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498724 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 17:48:18.500136 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498726 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 17:48:18.500136 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498728 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 17:48:18.500136 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498731 2577 
feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 17:48:18.500136 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498734 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 17:48:18.500136 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498737 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 17:48:18.500136 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498739 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 17:48:18.500136 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498742 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 17:48:18.500136 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498744 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 17:48:18.500136 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498746 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 17:48:18.500136 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498749 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 17:48:18.500136 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498751 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 17:48:18.500136 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498754 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 17:48:18.500136 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498756 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 17:48:18.500136 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498759 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 17:48:18.500136 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498762 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 17:48:18.500136 
ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498765 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 17:48:18.500604 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498769 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 17:48:18.500604 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498772 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 17:48:18.500604 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498775 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 17:48:18.500604 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498777 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 17:48:18.500604 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498779 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 17:48:18.500604 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498782 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 17:48:18.500604 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498785 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 17:48:18.500604 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498788 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 17:48:18.500604 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498790 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 17:48:18.500604 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498793 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 17:48:18.500604 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498795 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 17:48:18.500604 ip-10-0-135-49 
kubenswrapper[2577]: W0420 17:48:18.498798 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 17:48:18.500604 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498800 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 17:48:18.500604 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498803 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 17:48:18.500604 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498805 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 17:48:18.500604 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498808 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 17:48:18.500604 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498810 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 17:48:18.500604 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498813 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 17:48:18.500604 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498815 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 17:48:18.501103 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498818 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 17:48:18.501103 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.498823 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 17:48:18.501103 ip-10-0-135-49 
kubenswrapper[2577]: W0420 17:48:18.498933 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 17:48:18.501103 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498938 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 17:48:18.501103 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498941 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 17:48:18.501103 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498944 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 17:48:18.501103 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498946 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 17:48:18.501103 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498949 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 17:48:18.501103 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498951 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 17:48:18.501103 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498954 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 17:48:18.501103 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498956 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 17:48:18.501103 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498959 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 17:48:18.501103 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498962 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 17:48:18.501103 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498964 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 17:48:18.501103 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498967 2577 feature_gate.go:328] 
unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498969 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498972 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498974 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498976 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.498980 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499003 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499005 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499008 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499011 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499014 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499016 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499019 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499021 2577 
feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499024 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499027 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499029 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499032 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499034 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499037 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 17:48:18.501482 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499039 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 17:48:18.501994 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499042 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 17:48:18.501994 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499045 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 17:48:18.501994 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499047 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 17:48:18.501994 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499050 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 17:48:18.501994 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499052 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 17:48:18.501994 
ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499055 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 17:48:18.501994 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499058 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 17:48:18.501994 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499060 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 17:48:18.501994 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499065 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 17:48:18.501994 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499070 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 17:48:18.501994 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499074 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 17:48:18.501994 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499077 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 17:48:18.501994 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499080 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 17:48:18.501994 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499082 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 17:48:18.501994 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499085 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 17:48:18.501994 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499087 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 17:48:18.501994 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499091 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 17:48:18.501994 ip-10-0-135-49 kubenswrapper[2577]: W0420 
17:48:18.499093 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 17:48:18.501994 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499096 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499098 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499101 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499103 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499106 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499109 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499111 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499114 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499116 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499119 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499121 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499124 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure 
Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499127 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499130 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499132 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499136 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499138 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499141 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499143 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499146 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 17:48:18.502460 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499148 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 17:48:18.503099 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499151 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 17:48:18.503099 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499153 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 17:48:18.503099 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499156 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 17:48:18.503099 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499158 2577 feature_gate.go:328] 
unrecognized feature gate: AlibabaPlatform Apr 20 17:48:18.503099 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499168 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 17:48:18.503099 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499171 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 17:48:18.503099 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499174 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 17:48:18.503099 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499176 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 17:48:18.503099 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499179 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 17:48:18.503099 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499185 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 17:48:18.503099 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499187 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 17:48:18.503099 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499190 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 17:48:18.503099 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499193 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 17:48:18.503099 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:18.499195 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 17:48:18.503099 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.499200 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 17:48:18.503099 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.499790 2577 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 17:48:18.504491 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.504478 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 17:48:18.505204 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.505193 2577 server.go:1019] "Starting client certificate rotation" Apr 20 17:48:18.505307 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.505289 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 17:48:18.505353 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.505329 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 17:48:18.525582 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.525563 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 17:48:18.528952 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.528933 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 17:48:18.544218 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.544195 2577 log.go:25] "Validated CRI v1 runtime API" Apr 20 17:48:18.548967 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.548949 2577 log.go:25] "Validated CRI v1 image API" Apr 20 17:48:18.550679 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.550656 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 17:48:18.553456 ip-10-0-135-49 
kubenswrapper[2577]: I0420 17:48:18.553438 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 17:48:18.553544 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.553478 2577 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 82280ee4-0f32-4cd5-b7e5-f583dd33aef1:/dev/nvme0n1p4 a9bbd970-6c57-4545-9962-b15612f0fabb:/dev/nvme0n1p3] Apr 20 17:48:18.553544 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.553504 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 17:48:18.559088 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.558965 2577 manager.go:217] Machine: {Timestamp:2026-04-20 17:48:18.557371029 +0000 UTC m=+0.342726259 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3077057 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec25bc3b08028db9890fd662626187cc SystemUUID:ec25bc3b-0802-8db9-890f-d662626187cc BootID:847674c3-69e7-4892-ab9d-b0c409123203 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 
DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:28:9d:b0:41:9b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:28:9d:b0:41:9b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:06:18:58:0e:1d:01 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 17:48:18.559088 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.559082 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 20 17:48:18.559199 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.559169 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 17:48:18.560836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.560809 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 17:48:18.560974 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.560840 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-49.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 17:48:18.561041 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.560997 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 17:48:18.561041 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.561007 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 17:48:18.561041 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.561021 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 17:48:18.561788 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.561778 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 17:48:18.562894 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.562883 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 17:48:18.563040 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.563031 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 17:48:18.565053 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.565041 2577 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 17:48:18.565095 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.565059 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 17:48:18.565095 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.565073 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 17:48:18.565095 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.565083 2577 kubelet.go:397] "Adding apiserver pod source"
Apr 20 17:48:18.565095 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.565092 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 17:48:18.566079 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.566066 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 17:48:18.566126 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.566091 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 17:48:18.568717 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.568696 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 17:48:18.570402 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.570385 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 17:48:18.571532 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.571517 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 17:48:18.571616 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.571538 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 17:48:18.571616 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.571548 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 17:48:18.571616 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.571556 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 17:48:18.571616 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.571565 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 17:48:18.571616 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.571574 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 17:48:18.571616 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.571582 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 17:48:18.571616 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.571591 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 17:48:18.571616 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.571603 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 17:48:18.571616 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.571613 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 17:48:18.571888 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.571641 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 17:48:18.571888 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.571657 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 17:48:18.572406 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.572394 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 17:48:18.572461 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.572409 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 17:48:18.575187 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.575150 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ktv4j"
Apr 20 17:48:18.575954 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.575941 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 17:48:18.576034 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.575975 2577 server.go:1295] "Started kubelet"
Apr 20 17:48:18.576091 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.576066 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 17:48:18.576186 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.576142 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 17:48:18.576225 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.576203 2577 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 17:48:18.576877 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.576854 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-49.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 17:48:18.576929 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:18.576881 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-49.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 17:48:18.576916 ip-10-0-135-49 systemd[1]: Started Kubernetes Kubelet.
Apr 20 17:48:18.577080 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:18.576938 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 17:48:18.577532 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.577490 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 17:48:18.578825 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.578807 2577 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 17:48:18.582522 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.582499 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 17:48:18.582616 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.582527 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ktv4j"
Apr 20 17:48:18.582958 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.582944 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 17:48:18.584829 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.584807 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 17:48:18.584902 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.584861 2577 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 17:48:18.584902 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.584878 2577 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 17:48:18.585097 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:18.585078 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-49.ec2.internal\" not found"
Apr 20 17:48:18.585215 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.585201 2577 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 17:48:18.585215 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.585213 2577 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 17:48:18.585494 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.585470 2577 factory.go:55] Registering systemd factory
Apr 20 17:48:18.585581 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.585502 2577 factory.go:223] Registration of the systemd container factory successfully
Apr 20 17:48:18.585709 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.585692 2577 factory.go:153] Registering CRI-O factory
Apr 20 17:48:18.585777 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.585712 2577 factory.go:223] Registration of the crio container factory successfully
Apr 20 17:48:18.585777 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.585763 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 17:48:18.585777 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:18.584891 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-49.ec2.internal.18a821dfe976aef7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-49.ec2.internal,UID:ip-10-0-135-49.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-49.ec2.internal,},FirstTimestamp:2026-04-20 17:48:18.575953655 +0000 UTC m=+0.361308886,LastTimestamp:2026-04-20 17:48:18.575953655 +0000 UTC m=+0.361308886,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-49.ec2.internal,}"
Apr 20 17:48:18.585936 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.585784 2577 factory.go:103] Registering Raw factory
Apr 20 17:48:18.585936 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.585843 2577 manager.go:1196] Started watching for new ooms in manager
Apr 20 17:48:18.586354 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:18.586334 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 17:48:18.586450 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.586370 2577 manager.go:319] Starting recovery of all containers
Apr 20 17:48:18.592337 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.592318 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 17:48:18.595049 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:18.595027 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-49.ec2.internal\" not found" node="ip-10-0-135-49.ec2.internal"
Apr 20 17:48:18.597678 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.597498 2577 manager.go:324] Recovery completed
Apr 20 17:48:18.599350 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:18.599296 2577 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 20 17:48:18.602438 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.602423 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 17:48:18.604886 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.604871 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 17:48:18.604948 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.604898 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 17:48:18.604948 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.604908 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeHasSufficientPID"
Apr 20 17:48:18.605480 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.605463 2577 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 17:48:18.605480 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.605475 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 17:48:18.605587 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.605492 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 17:48:18.608522 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.608511 2577 policy_none.go:49] "None policy: Start"
Apr 20 17:48:18.608578 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.608526 2577 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 17:48:18.608578 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.608535 2577 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 17:48:18.650640 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.650621 2577 manager.go:341] "Starting Device Plugin manager"
Apr 20 17:48:18.665555 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:18.650668 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 17:48:18.665555 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.650681 2577 server.go:85] "Starting device plugin registration server"
Apr 20 17:48:18.665555 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.650973 2577 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 17:48:18.665555 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.651001 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 17:48:18.665555 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.651242 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 17:48:18.665555 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.651391 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 17:48:18.665555 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.651401 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 17:48:18.665555 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:18.651762 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 17:48:18.665555 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:18.651793 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-49.ec2.internal\" not found"
Apr 20 17:48:18.689260 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.689226 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 17:48:18.690339 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.690322 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 17:48:18.690438 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.690346 2577 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 17:48:18.690438 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.690375 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 17:48:18.690438 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.690384 2577 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 17:48:18.690438 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:18.690424 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 17:48:18.693583 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.693492 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 17:48:18.751244 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.751164 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 17:48:18.752188 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.752165 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 17:48:18.752290 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.752201 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 17:48:18.752290 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.752213 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeHasSufficientPID"
Apr 20 17:48:18.752290 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.752250 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-49.ec2.internal"
Apr 20 17:48:18.758485 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.758469 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-49.ec2.internal"
Apr 20 17:48:18.758568 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:18.758492 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-49.ec2.internal\": node \"ip-10-0-135-49.ec2.internal\" not found"
Apr 20 17:48:18.774114 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:18.774080 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-49.ec2.internal\" not found"
Apr 20 17:48:18.791066 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.791023 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-49.ec2.internal"]
Apr 20 17:48:18.791174 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.791108 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 17:48:18.791944 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.791926 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 17:48:18.792040 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.791959 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 17:48:18.792040 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.791969 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeHasSufficientPID"
Apr 20 17:48:18.794430 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.794416 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 17:48:18.794571 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.794548 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal"
Apr 20 17:48:18.794621 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.794588 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 17:48:18.795114 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.795098 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 17:48:18.795215 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.795116 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 17:48:18.795215 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.795127 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 17:48:18.795215 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.795137 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 17:48:18.795215 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.795141 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeHasSufficientPID"
Apr 20 17:48:18.795215 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.795147 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeHasSufficientPID"
Apr 20 17:48:18.797365 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.797349 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-49.ec2.internal"
Apr 20 17:48:18.797452 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.797378 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 17:48:18.797997 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.797970 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 17:48:18.798066 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.798007 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 17:48:18.798066 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.798030 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeHasSufficientPID"
Apr 20 17:48:18.819609 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:18.819589 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-49.ec2.internal\" not found" node="ip-10-0-135-49.ec2.internal"
Apr 20 17:48:18.823972 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:18.823957 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-49.ec2.internal\" not found" node="ip-10-0-135-49.ec2.internal"
Apr 20 17:48:18.874827 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:18.874799 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-49.ec2.internal\" not found"
Apr 20 17:48:18.887704 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.887682 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3acadad2e761e85308a5de0dbfa7268b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal\" (UID: \"3acadad2e761e85308a5de0dbfa7268b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal"
Apr 20 17:48:18.887767 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.887710 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3acadad2e761e85308a5de0dbfa7268b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal\" (UID: \"3acadad2e761e85308a5de0dbfa7268b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal"
Apr 20 17:48:18.887767 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.887728 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/99727bb875db005dc5ab40f5d2dc2824-config\") pod \"kube-apiserver-proxy-ip-10-0-135-49.ec2.internal\" (UID: \"99727bb875db005dc5ab40f5d2dc2824\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-49.ec2.internal"
Apr 20 17:48:18.975252 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:18.975225 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-49.ec2.internal\" not found"
Apr 20 17:48:18.988605 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.988580 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3acadad2e761e85308a5de0dbfa7268b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal\" (UID: \"3acadad2e761e85308a5de0dbfa7268b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal"
Apr 20 17:48:18.988684 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.988612 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3acadad2e761e85308a5de0dbfa7268b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal\" (UID: \"3acadad2e761e85308a5de0dbfa7268b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal"
Apr 20 17:48:18.988684 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.988628 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/99727bb875db005dc5ab40f5d2dc2824-config\") pod \"kube-apiserver-proxy-ip-10-0-135-49.ec2.internal\" (UID: \"99727bb875db005dc5ab40f5d2dc2824\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-49.ec2.internal"
Apr 20 17:48:18.988756 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.988678 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3acadad2e761e85308a5de0dbfa7268b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal\" (UID: \"3acadad2e761e85308a5de0dbfa7268b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal"
Apr 20 17:48:18.988756 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.988727 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/99727bb875db005dc5ab40f5d2dc2824-config\") pod \"kube-apiserver-proxy-ip-10-0-135-49.ec2.internal\" (UID: \"99727bb875db005dc5ab40f5d2dc2824\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-49.ec2.internal"
Apr 20 17:48:18.988824 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:18.988755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3acadad2e761e85308a5de0dbfa7268b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal\" (UID: \"3acadad2e761e85308a5de0dbfa7268b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal"
Apr 20 17:48:19.076024 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:19.075942 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-49.ec2.internal\" not found"
Apr 20 17:48:19.121441 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.121402 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal"
Apr 20 17:48:19.125847 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.125824 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-49.ec2.internal"
Apr 20 17:48:19.176357 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:19.176325 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-49.ec2.internal\" not found"
Apr 20 17:48:19.276816 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:19.276787 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-49.ec2.internal\" not found"
Apr 20 17:48:19.377299 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:19.377219 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-49.ec2.internal\" not found"
Apr 20 17:48:19.477710 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:19.477677 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-49.ec2.internal\" not found"
Apr 20 17:48:19.505133 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.505105 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 17:48:19.505640 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.505239 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 17:48:19.505640 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.505274 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 17:48:19.577881 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:19.577853 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-49.ec2.internal\" not found"
Apr 20 17:48:19.583166 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.583148 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 17:48:19.585542 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.585511 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 17:43:18 +0000 UTC" deadline="2028-02-06 10:16:23.747223596 +0000 UTC"
Apr 20 17:48:19.585598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.585542 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15760h28m4.161684259s"
Apr 20 17:48:19.589520 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.589502 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 17:48:19.594314 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.594296 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 17:48:19.616061 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.616038 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9cztl"
Apr 20 17:48:19.622190 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.622170 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9cztl"
Apr 20 17:48:19.652051 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:19.652025 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3acadad2e761e85308a5de0dbfa7268b.slice/crio-8ba127437e1bff1bdadc6301b441f1694086ad56d65df880a0d058a6a092b3d9 WatchSource:0}: Error finding container 8ba127437e1bff1bdadc6301b441f1694086ad56d65df880a0d058a6a092b3d9: Status 404 returned error can't find the container with id 8ba127437e1bff1bdadc6301b441f1694086ad56d65df880a0d058a6a092b3d9
Apr 20 17:48:19.652391 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:19.652374 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99727bb875db005dc5ab40f5d2dc2824.slice/crio-387b89700a8ad604c403451c1e3b03b749919f8e9282e6d267f33bf41d586d41 WatchSource:0}: Error finding container 387b89700a8ad604c403451c1e3b03b749919f8e9282e6d267f33bf41d586d41: Status 404 returned error can't find the container with id 387b89700a8ad604c403451c1e3b03b749919f8e9282e6d267f33bf41d586d41
Apr 20 17:48:19.655927 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.655913 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 17:48:19.685029 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.685010 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-49.ec2.internal"
Apr 20 17:48:19.693814 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.693770 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal" event={"ID":"3acadad2e761e85308a5de0dbfa7268b","Type":"ContainerStarted","Data":"8ba127437e1bff1bdadc6301b441f1694086ad56d65df880a0d058a6a092b3d9"}
Apr 20 17:48:19.693959 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.693941 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 17:48:19.694765 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.694745 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-49.ec2.internal" event={"ID":"99727bb875db005dc5ab40f5d2dc2824","Type":"ContainerStarted","Data":"387b89700a8ad604c403451c1e3b03b749919f8e9282e6d267f33bf41d586d41"}
Apr 20 17:48:19.695506 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.695492 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal"
Apr 20 17:48:19.704520 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:19.704502 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 17:48:20.150047 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.150017 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 17:48:20.372759 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.372718 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 17:48:20.566109 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.566011 2577 apiserver.go:52] "Watching apiserver"
Apr 20 17:48:20.574233 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.574192 2577 reflector.go:430] "Caches populated"
type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 17:48:20.574602 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.574580 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-6k2dk","openshift-image-registry/node-ca-6dlf2","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal","openshift-multus/multus-6tdx8","openshift-multus/multus-additional-cni-plugins-m2q9r","openshift-multus/network-metrics-daemon-skq27","openshift-network-diagnostics/network-check-target-9whfr","openshift-network-operator/iptables-alerter-fdrzw","kube-system/konnectivity-agent-l85wm","kube-system/kube-apiserver-proxy-ip-10-0-135-49.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj","openshift-cluster-node-tuning-operator/tuned-2r4zf","openshift-ovn-kubernetes/ovnkube-node-mxb8f"] Apr 20 17:48:20.577035 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.577012 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr" Apr 20 17:48:20.577148 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:20.577116 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9whfr" podUID="bd203605-978e-4434-8717-9734789f5af0" Apr 20 17:48:20.581377 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.581351 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6dlf2" Apr 20 17:48:20.581817 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.581753 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.584332 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.584068 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 17:48:20.584475 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.584452 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 17:48:20.585512 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.585157 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-r8fd2\"" Apr 20 17:48:20.585512 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.585178 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 17:48:20.585512 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.585202 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 17:48:20.585512 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.585369 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 17:48:20.585512 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.585383 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jmtxs\"" Apr 20 17:48:20.585512 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.585420 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 17:48:20.585512 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.585470 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 
17:48:20.587597 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.587577 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.589633 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.589593 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 17:48:20.589887 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.589786 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27" Apr 20 17:48:20.589887 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.589844 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-cnhpv\"" Apr 20 17:48:20.589887 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.589853 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 17:48:20.589887 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:20.589858 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-skq27" podUID="14ea9252-57e5-4e09-9c9e-33d96e94d56f" Apr 20 17:48:20.589887 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.589872 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-6k2dk" Apr 20 17:48:20.592093 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.592051 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 17:48:20.592197 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.592114 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 17:48:20.592197 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.592124 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fdrzw" Apr 20 17:48:20.592197 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.592155 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8fp5d\"" Apr 20 17:48:20.594266 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.594242 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 17:48:20.594345 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.594323 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 17:48:20.594409 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.594355 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-l85wm" Apr 20 17:48:20.594776 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.594758 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jd47w\"" Apr 20 17:48:20.594864 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.594832 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 17:48:20.596422 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.596399 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 17:48:20.596422 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.596413 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 17:48:20.596564 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.596430 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wt47c\"" Apr 20 17:48:20.596765 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.596745 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7b143fe4-9d02-4ed6-a139-f8b9c51e336d-serviceca\") pod \"node-ca-6dlf2\" (UID: \"7b143fe4-9d02-4ed6-a139-f8b9c51e336d\") " pod="openshift-image-registry/node-ca-6dlf2" Apr 20 17:48:20.596838 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.596785 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-579m2\" (UniqueName: \"kubernetes.io/projected/7b143fe4-9d02-4ed6-a139-f8b9c51e336d-kube-api-access-579m2\") pod \"node-ca-6dlf2\" (UID: \"7b143fe4-9d02-4ed6-a139-f8b9c51e336d\") " pod="openshift-image-registry/node-ca-6dlf2" Apr 20 17:48:20.596838 ip-10-0-135-49 
kubenswrapper[2577]: I0420 17:48:20.596812 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-system-cni-dir\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.596940 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.596836 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-os-release\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.596940 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.596862 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-host-run-netns\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.596940 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.596889 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-host-var-lib-cni-bin\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.596940 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.596916 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-system-cni-dir\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " 
pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.597149 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.596865 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.597149 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.596958 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-os-release\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.597149 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597041 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.597149 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597071 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-host-run-k8s-cni-cncf-io\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.597149 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597107 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-etc-kubernetes\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " 
pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.597149 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597136 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-cni-binary-copy\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.597149 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.597467 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597172 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b143fe4-9d02-4ed6-a139-f8b9c51e336d-host\") pod \"node-ca-6dlf2\" (UID: \"7b143fe4-9d02-4ed6-a139-f8b9c51e336d\") " pod="openshift-image-registry/node-ca-6dlf2" Apr 20 17:48:20.597467 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597199 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-cnibin\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.597467 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597221 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/774bfb19-2861-479a-a336-756a6a8d2926-multus-daemon-config\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.597467 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597311 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-cnibin\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.597467 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-multus-cni-dir\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.597467 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597389 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-multus-socket-dir-parent\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.597750 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597492 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-host-var-lib-kubelet\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.597750 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597537 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gswz\" (UniqueName: \"kubernetes.io/projected/774bfb19-2861-479a-a336-756a6a8d2926-kube-api-access-6gswz\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.597750 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597565 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cclhr\" (UniqueName: \"kubernetes.io/projected/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-kube-api-access-cclhr\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.597750 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597591 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhxj6\" (UniqueName: \"kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6\") pod \"network-check-target-9whfr\" (UID: \"bd203605-978e-4434-8717-9734789f5af0\") " pod="openshift-network-diagnostics/network-check-target-9whfr" Apr 20 17:48:20.597750 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597615 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-host-var-lib-cni-multus\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.597750 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597650 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-multus-conf-dir\") pod \"multus-6tdx8\" (UID: 
\"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.597750 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597674 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-host-run-multus-certs\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.597750 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597725 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/774bfb19-2861-479a-a336-756a6a8d2926-cni-binary-copy\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.597750 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597748 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-hostroot\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.598250 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.597770 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.598972 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.598954 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-s8pqs\"" Apr 20 17:48:20.598972 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.598960 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 17:48:20.599105 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.599063 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 17:48:20.599271 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.599256 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.599345 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.599303 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 17:48:20.601353 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.601326 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 17:48:20.601443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.601378 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dgw2g\"" Apr 20 17:48:20.601443 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.601380 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 17:48:20.601579 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.601563 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.603696 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.603673 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-77kwv\"" Apr 20 17:48:20.604158 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.604012 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 17:48:20.604158 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.604030 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 17:48:20.604158 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.604046 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 17:48:20.604158 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.604103 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 17:48:20.604158 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.604123 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 17:48:20.604506 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.604492 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 17:48:20.623379 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.623349 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 17:43:19 +0000 UTC" deadline="2028-01-11 08:10:49.409671218 +0000 UTC" Apr 20 17:48:20.623379 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.623379 2577 certificate_manager.go:431] 
"Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15134h22m28.786296182s" Apr 20 17:48:20.685841 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.685811 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 17:48:20.698018 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.697972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-multus-socket-dir-parent\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.698192 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698040 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-run-ovn-kubernetes\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.698192 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698069 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-etc-openvswitch\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.698192 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698066 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-multus-socket-dir-parent\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.698192 ip-10-0-135-49 
kubenswrapper[2577]: I0420 17:48:20.698113 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxj6\" (UniqueName: \"kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6\") pod \"network-check-target-9whfr\" (UID: \"bd203605-978e-4434-8717-9734789f5af0\") " pod="openshift-network-diagnostics/network-check-target-9whfr" Apr 20 17:48:20.698192 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698149 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-host-var-lib-cni-multus\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.698192 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698176 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-etc-sysconfig\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.698471 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698201 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-etc-sysctl-conf\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.698471 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698238 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-run-netns\") pod \"ovnkube-node-mxb8f\" (UID: 
\"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.698471 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698288 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c38c27a-adb3-46fb-9409-cec659c7a3c1-ovnkube-config\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.698471 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698360 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-hostroot\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.698471 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698368 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-host-var-lib-cni-multus\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.698471 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698406 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.698471 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698417 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-hostroot\") pod 
\"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.698471 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698436 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5439cb9-e073-43b0-a160-d1509081e674-kubelet-dir\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.698471 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698473 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b5439cb9-e073-43b0-a160-d1509081e674-device-dir\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.698861 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698493 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-sys\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.698861 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698515 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c38c27a-adb3-46fb-9409-cec659c7a3c1-env-overrides\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.698861 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698585 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-var-lib-kubelet\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.698861 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698610 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-slash\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.698861 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698639 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7b143fe4-9d02-4ed6-a139-f8b9c51e336d-serviceca\") pod \"node-ca-6dlf2\" (UID: \"7b143fe4-9d02-4ed6-a139-f8b9c51e336d\") " pod="openshift-image-registry/node-ca-6dlf2" Apr 20 17:48:20.698861 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698661 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-system-cni-dir\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.698861 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698684 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-os-release\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.698861 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698709 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-host-run-netns\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.698861 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698737 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-host-run-netns\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.698861 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698780 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-system-cni-dir\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.698861 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698791 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-host-var-lib-cni-bin\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.698861 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698820 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-system-cni-dir\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.698861 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698832 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-os-release\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.698861 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698844 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-os-release\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.698861 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698856 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-host-var-lib-cni-bin\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.699521 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698870 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-run\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.699521 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698912 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-lib-modules\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.699521 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698916 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-system-cni-dir\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.699521 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698952 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-os-release\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.699521 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.698967 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5x4p\" (UniqueName: \"kubernetes.io/projected/43e3018f-2822-49c0-af54-f203162e6017-kube-api-access-d5x4p\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.699521 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699025 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-run-openvswitch\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.699521 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699050 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c38c27a-adb3-46fb-9409-cec659c7a3c1-ovnkube-script-lib\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.699521 ip-10-0-135-49 kubenswrapper[2577]: I0420 
17:48:20.699079 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-etc-kubernetes\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.699521 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699081 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.699521 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-cni-binary-copy\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.699521 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699134 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-log-socket\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.699521 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699134 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7b143fe4-9d02-4ed6-a139-f8b9c51e336d-serviceca\") pod \"node-ca-6dlf2\" (UID: \"7b143fe4-9d02-4ed6-a139-f8b9c51e336d\") " pod="openshift-image-registry/node-ca-6dlf2" Apr 20 
17:48:20.699521 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699157 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0c2d21f8-03e3-423b-a4e7-4ab1bd770001-agent-certs\") pod \"konnectivity-agent-l85wm\" (UID: \"0c2d21f8-03e3-423b-a4e7-4ab1bd770001\") " pod="kube-system/konnectivity-agent-l85wm" Apr 20 17:48:20.699521 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699177 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-etc-kubernetes\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.699521 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699292 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b143fe4-9d02-4ed6-a139-f8b9c51e336d-host\") pod \"node-ca-6dlf2\" (UID: \"7b143fe4-9d02-4ed6-a139-f8b9c51e336d\") " pod="openshift-image-registry/node-ca-6dlf2" Apr 20 17:48:20.699521 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699315 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-cnibin\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.699521 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699336 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/774bfb19-2861-479a-a336-756a6a8d2926-multus-daemon-config\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.700203 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699364 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/43e3018f-2822-49c0-af54-f203162e6017-etc-tuned\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.700203 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699372 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b143fe4-9d02-4ed6-a139-f8b9c51e336d-host\") pod \"node-ca-6dlf2\" (UID: \"7b143fe4-9d02-4ed6-a139-f8b9c51e336d\") " pod="openshift-image-registry/node-ca-6dlf2" Apr 20 17:48:20.700203 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699391 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-cni-bin\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.700203 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699415 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bbc16f1f-e425-42a6-9352-b92e465bc2c2-tmp-dir\") pod \"node-resolver-6k2dk\" (UID: \"bbc16f1f-e425-42a6-9352-b92e465bc2c2\") " pod="openshift-dns/node-resolver-6k2dk" Apr 20 17:48:20.700203 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699439 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-cnibin\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.700203 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699436 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27f9l\" (UniqueName: \"kubernetes.io/projected/bbc16f1f-e425-42a6-9352-b92e465bc2c2-kube-api-access-27f9l\") pod \"node-resolver-6k2dk\" (UID: \"bbc16f1f-e425-42a6-9352-b92e465bc2c2\") " pod="openshift-dns/node-resolver-6k2dk" Apr 20 17:48:20.700203 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699472 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-multus-cni-dir\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.700203 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699499 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z92xn\" (UniqueName: \"kubernetes.io/projected/14ea9252-57e5-4e09-9c9e-33d96e94d56f-kube-api-access-z92xn\") pod \"network-metrics-daemon-skq27\" (UID: \"14ea9252-57e5-4e09-9c9e-33d96e94d56f\") " pod="openshift-multus/network-metrics-daemon-skq27" Apr 20 17:48:20.700203 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699539 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5f9d\" (UniqueName: \"kubernetes.io/projected/72925cbc-8caa-4bb0-8945-2fb6210c31e7-kube-api-access-k5f9d\") pod \"iptables-alerter-fdrzw\" (UID: \"72925cbc-8caa-4bb0-8945-2fb6210c31e7\") " pod="openshift-network-operator/iptables-alerter-fdrzw" Apr 20 17:48:20.700203 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699570 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-multus-cni-dir\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.700203 ip-10-0-135-49 
kubenswrapper[2577]: I0420 17:48:20.699579 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-host-var-lib-kubelet\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.700203 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gswz\" (UniqueName: \"kubernetes.io/projected/774bfb19-2861-479a-a336-756a6a8d2926-kube-api-access-6gswz\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.700203 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699631 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-host-var-lib-kubelet\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.700203 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699640 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cclhr\" (UniqueName: \"kubernetes.io/projected/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-kube-api-access-cclhr\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.700203 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699613 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-cni-binary-copy\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " 
pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.700203 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699670 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-etc-systemd\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.700203 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699699 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-kubelet\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.700923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699726 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-multus-conf-dir\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.700923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699754 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-host-run-multus-certs\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.700923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699782 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b5439cb9-e073-43b0-a160-d1509081e674-socket-dir\") pod \"aws-ebs-csi-driver-node-648rj\" 
(UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.700923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699799 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-multus-conf-dir\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.700923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699807 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-host-run-multus-certs\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.700923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699812 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b5439cb9-e073-43b0-a160-d1509081e674-etc-selinux\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.700923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699853 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-cni-netd\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.700923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699877 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/2c38c27a-adb3-46fb-9409-cec659c7a3c1-ovn-node-metrics-cert\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.700923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699902 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs\") pod \"network-metrics-daemon-skq27\" (UID: \"14ea9252-57e5-4e09-9c9e-33d96e94d56f\") " pod="openshift-multus/network-metrics-daemon-skq27" Apr 20 17:48:20.700923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699904 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/774bfb19-2861-479a-a336-756a6a8d2926-multus-daemon-config\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.700923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699925 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72925cbc-8caa-4bb0-8945-2fb6210c31e7-host-slash\") pod \"iptables-alerter-fdrzw\" (UID: \"72925cbc-8caa-4bb0-8945-2fb6210c31e7\") " pod="openshift-network-operator/iptables-alerter-fdrzw" Apr 20 17:48:20.700923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699949 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/774bfb19-2861-479a-a336-756a6a8d2926-cni-binary-copy\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.700923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.699973 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-etc-kubernetes\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.700923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700026 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-run-systemd\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.700923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700168 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv4m5\" (UniqueName: \"kubernetes.io/projected/2c38c27a-adb3-46fb-9409-cec659c7a3c1-kube-api-access-rv4m5\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.700923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700204 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0c2d21f8-03e3-423b-a4e7-4ab1bd770001-konnectivity-ca\") pod \"konnectivity-agent-l85wm\" (UID: \"0c2d21f8-03e3-423b-a4e7-4ab1bd770001\") " pod="kube-system/konnectivity-agent-l85wm" Apr 20 17:48:20.700923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700220 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-systemd-units\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.701538 ip-10-0-135-49 
kubenswrapper[2577]: I0420 17:48:20.700269 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-579m2\" (UniqueName: \"kubernetes.io/projected/7b143fe4-9d02-4ed6-a139-f8b9c51e336d-kube-api-access-579m2\") pod \"node-ca-6dlf2\" (UID: \"7b143fe4-9d02-4ed6-a139-f8b9c51e336d\") " pod="openshift-image-registry/node-ca-6dlf2" Apr 20 17:48:20.701538 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700318 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b5439cb9-e073-43b0-a160-d1509081e674-sys-fs\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.701538 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700404 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/774bfb19-2861-479a-a336-756a6a8d2926-cni-binary-copy\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.701538 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700435 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.701538 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700474 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/72925cbc-8caa-4bb0-8945-2fb6210c31e7-iptables-alerter-script\") pod \"iptables-alerter-fdrzw\" (UID: 
\"72925cbc-8caa-4bb0-8945-2fb6210c31e7\") " pod="openshift-network-operator/iptables-alerter-fdrzw" Apr 20 17:48:20.701538 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700506 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.701538 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700538 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b5439cb9-e073-43b0-a160-d1509081e674-registration-dir\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.701538 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700562 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-etc-sysctl-d\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.701538 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700583 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-host\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.701538 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700604 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-var-lib-openvswitch\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.701538 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700659 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-run-ovn\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.701538 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700690 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-node-log\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.701538 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700713 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bbc16f1f-e425-42a6-9352-b92e465bc2c2-hosts-file\") pod \"node-resolver-6k2dk\" (UID: \"bbc16f1f-e425-42a6-9352-b92e465bc2c2\") " pod="openshift-dns/node-resolver-6k2dk" Apr 20 17:48:20.701538 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700751 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-host-run-k8s-cni-cncf-io\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.701538 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700777 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.701538 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700834 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/774bfb19-2861-479a-a336-756a6a8d2926-host-run-k8s-cni-cncf-io\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.702045 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700848 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzx7q\" (UniqueName: \"kubernetes.io/projected/b5439cb9-e073-43b0-a160-d1509081e674-kube-api-access-tzx7q\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.702045 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700875 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/43e3018f-2822-49c0-af54-f203162e6017-tmp\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.702045 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700901 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-cnibin\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 
17:48:20.702045 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700929 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-etc-modprobe-d\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.702045 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700952 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.702045 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700952 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.702045 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.700997 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-cnibin\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.703332 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:20.703313 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 17:48:20.703407 ip-10-0-135-49 kubenswrapper[2577]: E0420 
17:48:20.703336 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 17:48:20.703407 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:20.703349 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rhxj6 for pod openshift-network-diagnostics/network-check-target-9whfr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:20.703475 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:20.703425 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6 podName:bd203605-978e-4434-8717-9734789f5af0 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:21.203399741 +0000 UTC m=+2.988754963 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-rhxj6" (UniqueName: "kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6") pod "network-check-target-9whfr" (UID: "bd203605-978e-4434-8717-9734789f5af0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:20.708388 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.708357 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 17:48:20.712197 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.712173 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-579m2\" (UniqueName: \"kubernetes.io/projected/7b143fe4-9d02-4ed6-a139-f8b9c51e336d-kube-api-access-579m2\") pod \"node-ca-6dlf2\" (UID: \"7b143fe4-9d02-4ed6-a139-f8b9c51e336d\") " pod="openshift-image-registry/node-ca-6dlf2" Apr 20 17:48:20.712308 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.712220 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gswz\" (UniqueName: \"kubernetes.io/projected/774bfb19-2861-479a-a336-756a6a8d2926-kube-api-access-6gswz\") pod \"multus-6tdx8\" (UID: \"774bfb19-2861-479a-a336-756a6a8d2926\") " pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.712957 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.712935 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cclhr\" (UniqueName: \"kubernetes.io/projected/cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee-kube-api-access-cclhr\") pod \"multus-additional-cni-plugins-m2q9r\" (UID: \"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee\") " pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.802072 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802036 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-run-netns\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.802072 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802084 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/2c38c27a-adb3-46fb-9409-cec659c7a3c1-ovnkube-config\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.802303 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802116 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5439cb9-e073-43b0-a160-d1509081e674-kubelet-dir\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.802303 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802142 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b5439cb9-e073-43b0-a160-d1509081e674-device-dir\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.802303 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802156 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-run-netns\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.802303 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802225 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5439cb9-e073-43b0-a160-d1509081e674-kubelet-dir\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.802303 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802247 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-sys\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.802303 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802276 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c38c27a-adb3-46fb-9409-cec659c7a3c1-env-overrides\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.802303 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802293 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b5439cb9-e073-43b0-a160-d1509081e674-device-dir\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.802303 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802305 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-var-lib-kubelet\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.802571 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802331 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-slash\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.802571 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802347 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-sys\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.802571 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802391 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-slash\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.802571 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802515 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-var-lib-kubelet\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.802571 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802544 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-run\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.802758 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-lib-modules\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.802758 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802603 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-run\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.802758 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802609 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5x4p\" (UniqueName: \"kubernetes.io/projected/43e3018f-2822-49c0-af54-f203162e6017-kube-api-access-d5x4p\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.802758 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802665 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-run-openvswitch\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.802758 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802671 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c38c27a-adb3-46fb-9409-cec659c7a3c1-ovnkube-config\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.802758 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802693 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c38c27a-adb3-46fb-9409-cec659c7a3c1-ovnkube-script-lib\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.802758 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802722 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-log-socket\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.802758 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802749 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0c2d21f8-03e3-423b-a4e7-4ab1bd770001-agent-certs\") pod \"konnectivity-agent-l85wm\" (UID: \"0c2d21f8-03e3-423b-a4e7-4ab1bd770001\") " pod="kube-system/konnectivity-agent-l85wm" Apr 20 17:48:20.802758 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802754 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c38c27a-adb3-46fb-9409-cec659c7a3c1-env-overrides\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.803089 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802771 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/43e3018f-2822-49c0-af54-f203162e6017-etc-tuned\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.803089 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802786 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-cni-bin\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.803089 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802801 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/bbc16f1f-e425-42a6-9352-b92e465bc2c2-tmp-dir\") pod \"node-resolver-6k2dk\" (UID: \"bbc16f1f-e425-42a6-9352-b92e465bc2c2\") " pod="openshift-dns/node-resolver-6k2dk" Apr 20 17:48:20.803089 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802808 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-run-openvswitch\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.803089 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802816 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27f9l\" (UniqueName: \"kubernetes.io/projected/bbc16f1f-e425-42a6-9352-b92e465bc2c2-kube-api-access-27f9l\") pod \"node-resolver-6k2dk\" (UID: \"bbc16f1f-e425-42a6-9352-b92e465bc2c2\") " pod="openshift-dns/node-resolver-6k2dk" Apr 20 17:48:20.803089 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802834 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z92xn\" (UniqueName: \"kubernetes.io/projected/14ea9252-57e5-4e09-9c9e-33d96e94d56f-kube-api-access-z92xn\") pod \"network-metrics-daemon-skq27\" (UID: \"14ea9252-57e5-4e09-9c9e-33d96e94d56f\") " pod="openshift-multus/network-metrics-daemon-skq27" Apr 20 17:48:20.803089 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802850 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5f9d\" (UniqueName: \"kubernetes.io/projected/72925cbc-8caa-4bb0-8945-2fb6210c31e7-kube-api-access-k5f9d\") pod \"iptables-alerter-fdrzw\" (UID: \"72925cbc-8caa-4bb0-8945-2fb6210c31e7\") " pod="openshift-network-operator/iptables-alerter-fdrzw" Apr 20 17:48:20.803089 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802869 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-etc-systemd\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.803089 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802891 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-kubelet\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.803089 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802918 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b5439cb9-e073-43b0-a160-d1509081e674-socket-dir\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.803089 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b5439cb9-e073-43b0-a160-d1509081e674-etc-selinux\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.803089 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802965 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-cni-netd\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.803089 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803013 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c38c27a-adb3-46fb-9409-cec659c7a3c1-ovn-node-metrics-cert\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.803089 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802850 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-cni-bin\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.803089 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803039 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs\") pod \"network-metrics-daemon-skq27\" (UID: \"14ea9252-57e5-4e09-9c9e-33d96e94d56f\") " pod="openshift-multus/network-metrics-daemon-skq27" Apr 20 17:48:20.803089 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803050 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-kubelet\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.803089 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.802814 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-lib-modules\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.803836 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:20.803141 2577 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:20.803836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b5439cb9-e073-43b0-a160-d1509081e674-socket-dir\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.803836 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:20.803220 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs podName:14ea9252-57e5-4e09-9c9e-33d96e94d56f nodeName:}" failed. No retries permitted until 2026-04-20 17:48:21.303202725 +0000 UTC m=+3.088557957 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs") pod "network-metrics-daemon-skq27" (UID: "14ea9252-57e5-4e09-9c9e-33d96e94d56f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:20.803836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803262 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b5439cb9-e073-43b0-a160-d1509081e674-etc-selinux\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.803836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803292 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72925cbc-8caa-4bb0-8945-2fb6210c31e7-host-slash\") pod \"iptables-alerter-fdrzw\" (UID: \"72925cbc-8caa-4bb0-8945-2fb6210c31e7\") " 
pod="openshift-network-operator/iptables-alerter-fdrzw" Apr 20 17:48:20.803836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803300 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-cni-netd\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.803836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-etc-kubernetes\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.803836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803357 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-run-systemd\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.803836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803383 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rv4m5\" (UniqueName: \"kubernetes.io/projected/2c38c27a-adb3-46fb-9409-cec659c7a3c1-kube-api-access-rv4m5\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.803836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803406 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c38c27a-adb3-46fb-9409-cec659c7a3c1-ovnkube-script-lib\") pod \"ovnkube-node-mxb8f\" (UID: 
\"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.803836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803408 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0c2d21f8-03e3-423b-a4e7-4ab1bd770001-konnectivity-ca\") pod \"konnectivity-agent-l85wm\" (UID: \"0c2d21f8-03e3-423b-a4e7-4ab1bd770001\") " pod="kube-system/konnectivity-agent-l85wm" Apr 20 17:48:20.803836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803438 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-etc-systemd\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.803836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803458 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-systemd-units\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.803836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803487 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b5439cb9-e073-43b0-a160-d1509081e674-sys-fs\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.803836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.803836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803541 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/72925cbc-8caa-4bb0-8945-2fb6210c31e7-iptables-alerter-script\") pod \"iptables-alerter-fdrzw\" (UID: \"72925cbc-8caa-4bb0-8945-2fb6210c31e7\") " pod="openshift-network-operator/iptables-alerter-fdrzw" Apr 20 17:48:20.803836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803609 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b5439cb9-e073-43b0-a160-d1509081e674-registration-dir\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.804614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803636 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-etc-sysctl-d\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.804614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803662 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-host\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.804614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803688 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-var-lib-openvswitch\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.804614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803714 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-run-ovn\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.804614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803733 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bbc16f1f-e425-42a6-9352-b92e465bc2c2-tmp-dir\") pod \"node-resolver-6k2dk\" (UID: \"bbc16f1f-e425-42a6-9352-b92e465bc2c2\") " pod="openshift-dns/node-resolver-6k2dk" Apr 20 17:48:20.804614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803791 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b5439cb9-e073-43b0-a160-d1509081e674-sys-fs\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.804614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803815 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-node-log\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.804614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803837 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.804614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803859 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-systemd-units\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.804614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803961 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0c2d21f8-03e3-423b-a4e7-4ab1bd770001-konnectivity-ca\") pod \"konnectivity-agent-l85wm\" (UID: \"0c2d21f8-03e3-423b-a4e7-4ab1bd770001\") " pod="kube-system/konnectivity-agent-l85wm" Apr 20 17:48:20.804614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803142 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-log-socket\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.804614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.803738 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-node-log\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.804614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804103 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bbc16f1f-e425-42a6-9352-b92e465bc2c2-hosts-file\") pod \"node-resolver-6k2dk\" (UID: \"bbc16f1f-e425-42a6-9352-b92e465bc2c2\") " pod="openshift-dns/node-resolver-6k2dk" Apr 20 17:48:20.804614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804136 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzx7q\" (UniqueName: \"kubernetes.io/projected/b5439cb9-e073-43b0-a160-d1509081e674-kube-api-access-tzx7q\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.804614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804160 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72925cbc-8caa-4bb0-8945-2fb6210c31e7-host-slash\") pod \"iptables-alerter-fdrzw\" (UID: \"72925cbc-8caa-4bb0-8945-2fb6210c31e7\") " pod="openshift-network-operator/iptables-alerter-fdrzw" Apr 20 17:48:20.804614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804163 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/43e3018f-2822-49c0-af54-f203162e6017-tmp\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.804614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804219 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-etc-modprobe-d\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.804614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804247 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-run-ovn-kubernetes\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.805610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804273 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-etc-openvswitch\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.805610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804329 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-etc-sysconfig\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.805610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804340 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/72925cbc-8caa-4bb0-8945-2fb6210c31e7-iptables-alerter-script\") pod \"iptables-alerter-fdrzw\" (UID: \"72925cbc-8caa-4bb0-8945-2fb6210c31e7\") " pod="openshift-network-operator/iptables-alerter-fdrzw" Apr 20 17:48:20.805610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804359 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-etc-sysctl-conf\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.805610 ip-10-0-135-49 kubenswrapper[2577]: 
I0420 17:48:20.804401 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b5439cb9-e073-43b0-a160-d1509081e674-registration-dir\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.805610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804516 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-etc-sysctl-d\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.805610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804517 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-etc-sysctl-conf\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.805610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804558 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-etc-kubernetes\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.805610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804569 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-host\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.805610 ip-10-0-135-49 kubenswrapper[2577]: I0420 
17:48:20.804614 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-run-ovn\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.805610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804624 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-etc-sysconfig\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.805610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804627 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-run-systemd\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.805610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804712 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/43e3018f-2822-49c0-af54-f203162e6017-etc-modprobe-d\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.805610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804759 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bbc16f1f-e425-42a6-9352-b92e465bc2c2-hosts-file\") pod \"node-resolver-6k2dk\" (UID: \"bbc16f1f-e425-42a6-9352-b92e465bc2c2\") " pod="openshift-dns/node-resolver-6k2dk" Apr 20 17:48:20.805610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804795 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-etc-openvswitch\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.805610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804831 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-var-lib-openvswitch\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.805610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.804848 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c38c27a-adb3-46fb-9409-cec659c7a3c1-host-run-ovn-kubernetes\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.805610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.805372 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/43e3018f-2822-49c0-af54-f203162e6017-etc-tuned\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.806296 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.805873 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c38c27a-adb3-46fb-9409-cec659c7a3c1-ovn-node-metrics-cert\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.806296 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.805878 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0c2d21f8-03e3-423b-a4e7-4ab1bd770001-agent-certs\") pod \"konnectivity-agent-l85wm\" (UID: \"0c2d21f8-03e3-423b-a4e7-4ab1bd770001\") " pod="kube-system/konnectivity-agent-l85wm" Apr 20 17:48:20.806646 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.806621 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/43e3018f-2822-49c0-af54-f203162e6017-tmp\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.811594 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.811564 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5x4p\" (UniqueName: \"kubernetes.io/projected/43e3018f-2822-49c0-af54-f203162e6017-kube-api-access-d5x4p\") pod \"tuned-2r4zf\" (UID: \"43e3018f-2822-49c0-af54-f203162e6017\") " pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.813495 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.813464 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z92xn\" (UniqueName: \"kubernetes.io/projected/14ea9252-57e5-4e09-9c9e-33d96e94d56f-kube-api-access-z92xn\") pod \"network-metrics-daemon-skq27\" (UID: \"14ea9252-57e5-4e09-9c9e-33d96e94d56f\") " pod="openshift-multus/network-metrics-daemon-skq27" Apr 20 17:48:20.813607 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.813467 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27f9l\" (UniqueName: \"kubernetes.io/projected/bbc16f1f-e425-42a6-9352-b92e465bc2c2-kube-api-access-27f9l\") pod \"node-resolver-6k2dk\" (UID: \"bbc16f1f-e425-42a6-9352-b92e465bc2c2\") " pod="openshift-dns/node-resolver-6k2dk" Apr 20 17:48:20.813841 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.813820 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tzx7q\" (UniqueName: \"kubernetes.io/projected/b5439cb9-e073-43b0-a160-d1509081e674-kube-api-access-tzx7q\") pod \"aws-ebs-csi-driver-node-648rj\" (UID: \"b5439cb9-e073-43b0-a160-d1509081e674\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.813922 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.813859 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv4m5\" (UniqueName: \"kubernetes.io/projected/2c38c27a-adb3-46fb-9409-cec659c7a3c1-kube-api-access-rv4m5\") pod \"ovnkube-node-mxb8f\" (UID: \"2c38c27a-adb3-46fb-9409-cec659c7a3c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.814690 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.814673 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5f9d\" (UniqueName: \"kubernetes.io/projected/72925cbc-8caa-4bb0-8945-2fb6210c31e7-kube-api-access-k5f9d\") pod \"iptables-alerter-fdrzw\" (UID: \"72925cbc-8caa-4bb0-8945-2fb6210c31e7\") " pod="openshift-network-operator/iptables-alerter-fdrzw" Apr 20 17:48:20.893830 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.893745 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6dlf2" Apr 20 17:48:20.902093 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.902072 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6tdx8" Apr 20 17:48:20.909814 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.909788 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m2q9r" Apr 20 17:48:20.915410 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.915388 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-6k2dk" Apr 20 17:48:20.922964 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.922945 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fdrzw" Apr 20 17:48:20.931592 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.931568 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-l85wm" Apr 20 17:48:20.938179 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.938153 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" Apr 20 17:48:20.944747 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.944725 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" Apr 20 17:48:20.950316 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.950295 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:48:20.993004 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:20.992966 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 17:48:21.207350 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:21.207268 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxj6\" (UniqueName: \"kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6\") pod \"network-check-target-9whfr\" (UID: \"bd203605-978e-4434-8717-9734789f5af0\") " pod="openshift-network-diagnostics/network-check-target-9whfr" Apr 20 17:48:21.207510 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:21.207398 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 17:48:21.207510 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:21.207419 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 17:48:21.207510 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:21.207434 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rhxj6 for pod openshift-network-diagnostics/network-check-target-9whfr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:21.207510 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:21.207488 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6 podName:bd203605-978e-4434-8717-9734789f5af0 nodeName:}" failed. 
No retries permitted until 2026-04-20 17:48:22.207473859 +0000 UTC m=+3.992829076 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-rhxj6" (UniqueName: "kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6") pod "network-check-target-9whfr" (UID: "bd203605-978e-4434-8717-9734789f5af0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:21.307534 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:21.307504 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs\") pod \"network-metrics-daemon-skq27\" (UID: \"14ea9252-57e5-4e09-9c9e-33d96e94d56f\") " pod="openshift-multus/network-metrics-daemon-skq27" Apr 20 17:48:21.307679 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:21.307652 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:21.307720 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:21.307710 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs podName:14ea9252-57e5-4e09-9c9e-33d96e94d56f nodeName:}" failed. No retries permitted until 2026-04-20 17:48:22.307696386 +0000 UTC m=+4.093051615 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs") pod "network-metrics-daemon-skq27" (UID: "14ea9252-57e5-4e09-9c9e-33d96e94d56f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:21.332685 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:21.332654 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf66c5a0_9dc4_4b6b_ac5f_ceb4cd1940ee.slice/crio-13ea80c6a43e331db0e98010a3ad768c8417038fb91c76a62f964fd361aa7a57 WatchSource:0}: Error finding container 13ea80c6a43e331db0e98010a3ad768c8417038fb91c76a62f964fd361aa7a57: Status 404 returned error can't find the container with id 13ea80c6a43e331db0e98010a3ad768c8417038fb91c76a62f964fd361aa7a57 Apr 20 17:48:21.333844 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:21.333817 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5439cb9_e073_43b0_a160_d1509081e674.slice/crio-f3f656e85496a0f8cc046c75cf7940ae8096117a433adfe4363372e82bc96f79 WatchSource:0}: Error finding container f3f656e85496a0f8cc046c75cf7940ae8096117a433adfe4363372e82bc96f79: Status 404 returned error can't find the container with id f3f656e85496a0f8cc046c75cf7940ae8096117a433adfe4363372e82bc96f79 Apr 20 17:48:21.334913 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:21.334886 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b143fe4_9d02_4ed6_a139_f8b9c51e336d.slice/crio-04826dea1894a06a51c38557f8fbc78d3783bca24128b25272ecaacf6ac19139 WatchSource:0}: Error finding container 04826dea1894a06a51c38557f8fbc78d3783bca24128b25272ecaacf6ac19139: Status 404 returned error can't find the container with id 04826dea1894a06a51c38557f8fbc78d3783bca24128b25272ecaacf6ac19139 Apr 20 17:48:21.336365 
ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:21.336335 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c38c27a_adb3_46fb_9409_cec659c7a3c1.slice/crio-ada9e0ef6a16c39a1fa4488f06c12c97264d3b0842eee61057f550f873012aa2 WatchSource:0}: Error finding container ada9e0ef6a16c39a1fa4488f06c12c97264d3b0842eee61057f550f873012aa2: Status 404 returned error can't find the container with id ada9e0ef6a16c39a1fa4488f06c12c97264d3b0842eee61057f550f873012aa2 Apr 20 17:48:21.339708 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:21.339665 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbc16f1f_e425_42a6_9352_b92e465bc2c2.slice/crio-90a02a880d6099b52aa5d8cc79caf636343fd9322b95cae1657245e994e73242 WatchSource:0}: Error finding container 90a02a880d6099b52aa5d8cc79caf636343fd9322b95cae1657245e994e73242: Status 404 returned error can't find the container with id 90a02a880d6099b52aa5d8cc79caf636343fd9322b95cae1657245e994e73242 Apr 20 17:48:21.340440 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:21.340417 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod774bfb19_2861_479a_a336_756a6a8d2926.slice/crio-843400138af379809f22ec8224a539a7a60bcba2e14a5c0b886435bcb77d1455 WatchSource:0}: Error finding container 843400138af379809f22ec8224a539a7a60bcba2e14a5c0b886435bcb77d1455: Status 404 returned error can't find the container with id 843400138af379809f22ec8224a539a7a60bcba2e14a5c0b886435bcb77d1455 Apr 20 17:48:21.341913 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:21.341894 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c2d21f8_03e3_423b_a4e7_4ab1bd770001.slice/crio-794afed7b60925022687dd9256f58fdd1f2d22fb497ce43483cc1f1ef80b688b WatchSource:0}: Error 
finding container 794afed7b60925022687dd9256f58fdd1f2d22fb497ce43483cc1f1ef80b688b: Status 404 returned error can't find the container with id 794afed7b60925022687dd9256f58fdd1f2d22fb497ce43483cc1f1ef80b688b Apr 20 17:48:21.343097 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:21.342886 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43e3018f_2822_49c0_af54_f203162e6017.slice/crio-690202ce89a53079b68af62029a09c2962fd5dd979b14f1aeb4fc226563f52b3 WatchSource:0}: Error finding container 690202ce89a53079b68af62029a09c2962fd5dd979b14f1aeb4fc226563f52b3: Status 404 returned error can't find the container with id 690202ce89a53079b68af62029a09c2962fd5dd979b14f1aeb4fc226563f52b3 Apr 20 17:48:21.343933 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:21.343838 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72925cbc_8caa_4bb0_8945_2fb6210c31e7.slice/crio-380ffc19bbc6fd6e6c42bb3d0026885a25417fa8bcbe3a5ad23124d7de0920e9 WatchSource:0}: Error finding container 380ffc19bbc6fd6e6c42bb3d0026885a25417fa8bcbe3a5ad23124d7de0920e9: Status 404 returned error can't find the container with id 380ffc19bbc6fd6e6c42bb3d0026885a25417fa8bcbe3a5ad23124d7de0920e9 Apr 20 17:48:21.623649 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:21.623561 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 17:43:19 +0000 UTC" deadline="2027-09-15 14:04:47.378112618 +0000 UTC" Apr 20 17:48:21.623649 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:21.623595 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12308h16m25.754520219s" Apr 20 17:48:21.690935 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:21.690851 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:21.691111 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:21.690973 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9whfr" podUID="bd203605-978e-4434-8717-9734789f5af0"
Apr 20 17:48:21.701770 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:21.701733 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-49.ec2.internal" event={"ID":"99727bb875db005dc5ab40f5d2dc2824","Type":"ContainerStarted","Data":"e446d19bcf0c14440776ca9763c36a5a14d9cd47187fc024e19729fd1e97f9dd"}
Apr 20 17:48:21.705914 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:21.705879 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" event={"ID":"43e3018f-2822-49c0-af54-f203162e6017","Type":"ContainerStarted","Data":"690202ce89a53079b68af62029a09c2962fd5dd979b14f1aeb4fc226563f52b3"}
Apr 20 17:48:21.708143 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:21.708117 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l85wm" event={"ID":"0c2d21f8-03e3-423b-a4e7-4ab1bd770001","Type":"ContainerStarted","Data":"794afed7b60925022687dd9256f58fdd1f2d22fb497ce43483cc1f1ef80b688b"}
Apr 20 17:48:21.710650 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:21.710623 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6tdx8" event={"ID":"774bfb19-2861-479a-a336-756a6a8d2926","Type":"ContainerStarted","Data":"843400138af379809f22ec8224a539a7a60bcba2e14a5c0b886435bcb77d1455"}
Apr 20 17:48:21.713768 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:21.713656 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6dlf2" event={"ID":"7b143fe4-9d02-4ed6-a139-f8b9c51e336d","Type":"ContainerStarted","Data":"04826dea1894a06a51c38557f8fbc78d3783bca24128b25272ecaacf6ac19139"}
Apr 20 17:48:21.714373 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:21.714326 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-49.ec2.internal" podStartSLOduration=2.714312409 podStartE2EDuration="2.714312409s" podCreationTimestamp="2026-04-20 17:48:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:48:21.713754463 +0000 UTC m=+3.499109714" watchObservedRunningTime="2026-04-20 17:48:21.714312409 +0000 UTC m=+3.499667649"
Apr 20 17:48:21.715530 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:21.715398 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" event={"ID":"b5439cb9-e073-43b0-a160-d1509081e674","Type":"ContainerStarted","Data":"f3f656e85496a0f8cc046c75cf7940ae8096117a433adfe4363372e82bc96f79"}
Apr 20 17:48:21.717570 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:21.717532 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fdrzw" event={"ID":"72925cbc-8caa-4bb0-8945-2fb6210c31e7","Type":"ContainerStarted","Data":"380ffc19bbc6fd6e6c42bb3d0026885a25417fa8bcbe3a5ad23124d7de0920e9"}
Apr 20 17:48:21.719296 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:21.719259 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6k2dk" event={"ID":"bbc16f1f-e425-42a6-9352-b92e465bc2c2","Type":"ContainerStarted","Data":"90a02a880d6099b52aa5d8cc79caf636343fd9322b95cae1657245e994e73242"}
Apr 20 17:48:21.720354 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:21.720330 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" event={"ID":"2c38c27a-adb3-46fb-9409-cec659c7a3c1","Type":"ContainerStarted","Data":"ada9e0ef6a16c39a1fa4488f06c12c97264d3b0842eee61057f550f873012aa2"}
Apr 20 17:48:21.721901 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:21.721873 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m2q9r" event={"ID":"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee","Type":"ContainerStarted","Data":"13ea80c6a43e331db0e98010a3ad768c8417038fb91c76a62f964fd361aa7a57"}
Apr 20 17:48:22.217493 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:22.216702 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxj6\" (UniqueName: \"kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6\") pod \"network-check-target-9whfr\" (UID: \"bd203605-978e-4434-8717-9734789f5af0\") " pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:22.217493 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:22.216911 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 17:48:22.217493 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:22.216929 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 17:48:22.217493 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:22.216942 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rhxj6 for pod openshift-network-diagnostics/network-check-target-9whfr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 17:48:22.217493 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:22.217017 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6 podName:bd203605-978e-4434-8717-9734789f5af0 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:24.216999747 +0000 UTC m=+6.002354979 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-rhxj6" (UniqueName: "kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6") pod "network-check-target-9whfr" (UID: "bd203605-978e-4434-8717-9734789f5af0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 17:48:22.319176 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:22.318512 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs\") pod \"network-metrics-daemon-skq27\" (UID: \"14ea9252-57e5-4e09-9c9e-33d96e94d56f\") " pod="openshift-multus/network-metrics-daemon-skq27"
Apr 20 17:48:22.319176 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:22.318717 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 17:48:22.319176 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:22.318814 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs podName:14ea9252-57e5-4e09-9c9e-33d96e94d56f nodeName:}" failed. No retries permitted until 2026-04-20 17:48:24.318780649 +0000 UTC m=+6.104135871 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs") pod "network-metrics-daemon-skq27" (UID: "14ea9252-57e5-4e09-9c9e-33d96e94d56f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 17:48:22.693607 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:22.693377 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27"
Apr 20 17:48:22.693607 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:22.693525 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-skq27" podUID="14ea9252-57e5-4e09-9c9e-33d96e94d56f"
Apr 20 17:48:22.737864 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:22.737742 2577 generic.go:358] "Generic (PLEG): container finished" podID="3acadad2e761e85308a5de0dbfa7268b" containerID="ee5d3edc6587a7cee1a619d1a3673e4ac76cfca70078a74e1930768779ea993a" exitCode=0
Apr 20 17:48:22.737864 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:22.737822 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal" event={"ID":"3acadad2e761e85308a5de0dbfa7268b","Type":"ContainerDied","Data":"ee5d3edc6587a7cee1a619d1a3673e4ac76cfca70078a74e1930768779ea993a"}
Apr 20 17:48:23.691229 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:23.690715 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:23.691229 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:23.690851 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9whfr" podUID="bd203605-978e-4434-8717-9734789f5af0"
Apr 20 17:48:23.754045 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:23.753499 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal" event={"ID":"3acadad2e761e85308a5de0dbfa7268b","Type":"ContainerStarted","Data":"cd274d768e5692cd189c2f0cc4a6aed91484ec03fbec4ecbd26787ac4d337655"}
Apr 20 17:48:23.776524 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:23.775361 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-49.ec2.internal" podStartSLOduration=4.775333472 podStartE2EDuration="4.775333472s" podCreationTimestamp="2026-04-20 17:48:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:48:23.775217433 +0000 UTC m=+5.560572674" watchObservedRunningTime="2026-04-20 17:48:23.775333472 +0000 UTC m=+5.560688709"
Apr 20 17:48:24.237127 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:24.237089 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxj6\" (UniqueName: \"kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6\") pod \"network-check-target-9whfr\" (UID: \"bd203605-978e-4434-8717-9734789f5af0\") " pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:24.237314 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:24.237302 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 17:48:24.237375 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:24.237324 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 17:48:24.237375 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:24.237352 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rhxj6 for pod openshift-network-diagnostics/network-check-target-9whfr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 17:48:24.237474 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:24.237428 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6 podName:bd203605-978e-4434-8717-9734789f5af0 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:28.23740604 +0000 UTC m=+10.022761265 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-rhxj6" (UniqueName: "kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6") pod "network-check-target-9whfr" (UID: "bd203605-978e-4434-8717-9734789f5af0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 17:48:24.337841 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:24.337784 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs\") pod \"network-metrics-daemon-skq27\" (UID: \"14ea9252-57e5-4e09-9c9e-33d96e94d56f\") " pod="openshift-multus/network-metrics-daemon-skq27"
Apr 20 17:48:24.338045 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:24.337940 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 17:48:24.338045 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:24.338023 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs podName:14ea9252-57e5-4e09-9c9e-33d96e94d56f nodeName:}" failed. No retries permitted until 2026-04-20 17:48:28.338004614 +0000 UTC m=+10.123359852 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs") pod "network-metrics-daemon-skq27" (UID: "14ea9252-57e5-4e09-9c9e-33d96e94d56f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 17:48:24.693761 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:24.693682 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27"
Apr 20 17:48:24.693916 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:24.693814 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-skq27" podUID="14ea9252-57e5-4e09-9c9e-33d96e94d56f"
Apr 20 17:48:25.690914 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:25.690855 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:25.691405 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:25.691076 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9whfr" podUID="bd203605-978e-4434-8717-9734789f5af0"
Apr 20 17:48:26.691610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:26.691557 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27"
Apr 20 17:48:26.692066 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:26.691709 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-skq27" podUID="14ea9252-57e5-4e09-9c9e-33d96e94d56f"
Apr 20 17:48:27.690744 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:27.690587 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:27.690744 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:27.690722 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9whfr" podUID="bd203605-978e-4434-8717-9734789f5af0"
Apr 20 17:48:28.272257 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:28.271833 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxj6\" (UniqueName: \"kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6\") pod \"network-check-target-9whfr\" (UID: \"bd203605-978e-4434-8717-9734789f5af0\") " pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:28.272257 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:28.272002 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 17:48:28.272257 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:28.272027 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 17:48:28.272257 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:28.272042 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rhxj6 for pod openshift-network-diagnostics/network-check-target-9whfr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 17:48:28.272257 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:28.272110 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6 podName:bd203605-978e-4434-8717-9734789f5af0 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:36.272089059 +0000 UTC m=+18.057444290 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-rhxj6" (UniqueName: "kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6") pod "network-check-target-9whfr" (UID: "bd203605-978e-4434-8717-9734789f5af0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 17:48:28.373052 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:28.372416 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs\") pod \"network-metrics-daemon-skq27\" (UID: \"14ea9252-57e5-4e09-9c9e-33d96e94d56f\") " pod="openshift-multus/network-metrics-daemon-skq27"
Apr 20 17:48:28.373052 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:28.372577 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 17:48:28.373052 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:28.372638 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs podName:14ea9252-57e5-4e09-9c9e-33d96e94d56f nodeName:}" failed. No retries permitted until 2026-04-20 17:48:36.372619942 +0000 UTC m=+18.157975165 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs") pod "network-metrics-daemon-skq27" (UID: "14ea9252-57e5-4e09-9c9e-33d96e94d56f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 17:48:28.695836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:28.694567 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27"
Apr 20 17:48:28.695836 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:28.694730 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-skq27" podUID="14ea9252-57e5-4e09-9c9e-33d96e94d56f"
Apr 20 17:48:29.691345 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:29.691312 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:29.691794 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:29.691437 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9whfr" podUID="bd203605-978e-4434-8717-9734789f5af0"
Apr 20 17:48:29.763560 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:29.763524 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-h7dtn"]
Apr 20 17:48:29.766847 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:29.766705 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:29.766847 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:29.766785 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h7dtn" podUID="542dd1d3-eb84-486c-a8ee-46b247e169f8"
Apr 20 17:48:29.885596 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:29.885541 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret\") pod \"global-pull-secret-syncer-h7dtn\" (UID: \"542dd1d3-eb84-486c-a8ee-46b247e169f8\") " pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:29.885596 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:29.885598 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/542dd1d3-eb84-486c-a8ee-46b247e169f8-dbus\") pod \"global-pull-secret-syncer-h7dtn\" (UID: \"542dd1d3-eb84-486c-a8ee-46b247e169f8\") " pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:29.885820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:29.885727 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/542dd1d3-eb84-486c-a8ee-46b247e169f8-kubelet-config\") pod \"global-pull-secret-syncer-h7dtn\" (UID: \"542dd1d3-eb84-486c-a8ee-46b247e169f8\") " pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:29.986254 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:29.986165 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/542dd1d3-eb84-486c-a8ee-46b247e169f8-kubelet-config\") pod \"global-pull-secret-syncer-h7dtn\" (UID: \"542dd1d3-eb84-486c-a8ee-46b247e169f8\") " pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:29.986411 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:29.986253 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret\") pod \"global-pull-secret-syncer-h7dtn\" (UID: \"542dd1d3-eb84-486c-a8ee-46b247e169f8\") " pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:29.986411 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:29.986280 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/542dd1d3-eb84-486c-a8ee-46b247e169f8-dbus\") pod \"global-pull-secret-syncer-h7dtn\" (UID: \"542dd1d3-eb84-486c-a8ee-46b247e169f8\") " pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:29.986533 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:29.986440 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/542dd1d3-eb84-486c-a8ee-46b247e169f8-kubelet-config\") pod \"global-pull-secret-syncer-h7dtn\" (UID: \"542dd1d3-eb84-486c-a8ee-46b247e169f8\") " pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:29.986533 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:29.986465 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/542dd1d3-eb84-486c-a8ee-46b247e169f8-dbus\") pod \"global-pull-secret-syncer-h7dtn\" (UID: \"542dd1d3-eb84-486c-a8ee-46b247e169f8\") " pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:29.986533 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:29.986515 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 17:48:29.986672 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:29.986604 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret podName:542dd1d3-eb84-486c-a8ee-46b247e169f8 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:30.486580159 +0000 UTC m=+12.271935390 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret") pod "global-pull-secret-syncer-h7dtn" (UID: "542dd1d3-eb84-486c-a8ee-46b247e169f8") : object "kube-system"/"original-pull-secret" not registered
Apr 20 17:48:30.488106 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:30.487950 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret\") pod \"global-pull-secret-syncer-h7dtn\" (UID: \"542dd1d3-eb84-486c-a8ee-46b247e169f8\") " pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:30.488267 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:30.488110 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 17:48:30.488267 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:30.488189 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret podName:542dd1d3-eb84-486c-a8ee-46b247e169f8 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:31.488168758 +0000 UTC m=+13.273523980 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret") pod "global-pull-secret-syncer-h7dtn" (UID: "542dd1d3-eb84-486c-a8ee-46b247e169f8") : object "kube-system"/"original-pull-secret" not registered
Apr 20 17:48:30.691407 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:30.691370 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27"
Apr 20 17:48:30.691864 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:30.691517 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-skq27" podUID="14ea9252-57e5-4e09-9c9e-33d96e94d56f"
Apr 20 17:48:31.497330 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:31.497286 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret\") pod \"global-pull-secret-syncer-h7dtn\" (UID: \"542dd1d3-eb84-486c-a8ee-46b247e169f8\") " pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:31.497524 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:31.497414 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 17:48:31.497524 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:31.497486 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret podName:542dd1d3-eb84-486c-a8ee-46b247e169f8 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:33.497470803 +0000 UTC m=+15.282826034 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret") pod "global-pull-secret-syncer-h7dtn" (UID: "542dd1d3-eb84-486c-a8ee-46b247e169f8") : object "kube-system"/"original-pull-secret" not registered
Apr 20 17:48:31.691075 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:31.691043 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:31.691292 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:31.691043 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:31.691292 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:31.691167 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9whfr" podUID="bd203605-978e-4434-8717-9734789f5af0"
Apr 20 17:48:31.691292 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:31.691241 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h7dtn" podUID="542dd1d3-eb84-486c-a8ee-46b247e169f8"
Apr 20 17:48:32.691160 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:32.691116 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27"
Apr 20 17:48:32.691603 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:32.691270 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-skq27" podUID="14ea9252-57e5-4e09-9c9e-33d96e94d56f"
Apr 20 17:48:33.515885 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:33.515847 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret\") pod \"global-pull-secret-syncer-h7dtn\" (UID: \"542dd1d3-eb84-486c-a8ee-46b247e169f8\") " pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:33.516139 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:33.516009 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 17:48:33.516139 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:33.516093 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret podName:542dd1d3-eb84-486c-a8ee-46b247e169f8 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:37.516071093 +0000 UTC m=+19.301426310 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret") pod "global-pull-secret-syncer-h7dtn" (UID: "542dd1d3-eb84-486c-a8ee-46b247e169f8") : object "kube-system"/"original-pull-secret" not registered
Apr 20 17:48:33.691493 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:33.691462 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:33.691856 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:33.691462 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:33.691856 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:33.691570 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h7dtn" podUID="542dd1d3-eb84-486c-a8ee-46b247e169f8"
Apr 20 17:48:33.691856 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:33.691665 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9whfr" podUID="bd203605-978e-4434-8717-9734789f5af0"
Apr 20 17:48:34.691020 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:34.690969 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27"
Apr 20 17:48:34.691182 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:34.691144 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-skq27" podUID="14ea9252-57e5-4e09-9c9e-33d96e94d56f"
Apr 20 17:48:35.690961 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:35.690919 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:35.691428 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:35.690920 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:35.691428 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:35.691081 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h7dtn" podUID="542dd1d3-eb84-486c-a8ee-46b247e169f8"
Apr 20 17:48:35.691428 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:35.691235 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9whfr" podUID="bd203605-978e-4434-8717-9734789f5af0"
Apr 20 17:48:36.337905 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:36.337865 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxj6\" (UniqueName: \"kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6\") pod \"network-check-target-9whfr\" (UID: \"bd203605-978e-4434-8717-9734789f5af0\") " pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:36.338107 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:36.338083 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 17:48:36.338152 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:36.338117 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 17:48:36.338152 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:36.338134 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rhxj6 for pod openshift-network-diagnostics/network-check-target-9whfr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 17:48:36.338229 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:36.338187 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6 podName:bd203605-978e-4434-8717-9734789f5af0 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:52.338173108 +0000 UTC m=+34.123528324 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "kube-api-access-rhxj6" (UniqueName: "kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6") pod "network-check-target-9whfr" (UID: "bd203605-978e-4434-8717-9734789f5af0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:36.441023 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:36.438972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs\") pod \"network-metrics-daemon-skq27\" (UID: \"14ea9252-57e5-4e09-9c9e-33d96e94d56f\") " pod="openshift-multus/network-metrics-daemon-skq27" Apr 20 17:48:36.441023 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:36.439241 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:36.441023 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:36.439313 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs podName:14ea9252-57e5-4e09-9c9e-33d96e94d56f nodeName:}" failed. No retries permitted until 2026-04-20 17:48:52.439294129 +0000 UTC m=+34.224649362 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs") pod "network-metrics-daemon-skq27" (UID: "14ea9252-57e5-4e09-9c9e-33d96e94d56f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:36.691538 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:36.691458 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27" Apr 20 17:48:36.691962 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:36.691608 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-skq27" podUID="14ea9252-57e5-4e09-9c9e-33d96e94d56f" Apr 20 17:48:37.547959 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:37.547920 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret\") pod \"global-pull-secret-syncer-h7dtn\" (UID: \"542dd1d3-eb84-486c-a8ee-46b247e169f8\") " pod="kube-system/global-pull-secret-syncer-h7dtn" Apr 20 17:48:37.548126 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:37.548093 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 17:48:37.548175 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:37.548169 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret podName:542dd1d3-eb84-486c-a8ee-46b247e169f8 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:45.548146698 +0000 UTC m=+27.333501920 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret") pod "global-pull-secret-syncer-h7dtn" (UID: "542dd1d3-eb84-486c-a8ee-46b247e169f8") : object "kube-system"/"original-pull-secret" not registered Apr 20 17:48:37.690608 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:37.690566 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr" Apr 20 17:48:37.690771 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:37.690573 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h7dtn" Apr 20 17:48:37.690771 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:37.690680 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9whfr" podUID="bd203605-978e-4434-8717-9734789f5af0" Apr 20 17:48:37.690889 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:37.690762 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h7dtn" podUID="542dd1d3-eb84-486c-a8ee-46b247e169f8" Apr 20 17:48:38.692404 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:38.692186 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27" Apr 20 17:48:38.692673 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:38.692511 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-skq27" podUID="14ea9252-57e5-4e09-9c9e-33d96e94d56f" Apr 20 17:48:38.783463 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:38.783424 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" event={"ID":"43e3018f-2822-49c0-af54-f203162e6017","Type":"ContainerStarted","Data":"efbd36a933b6fd45aea60b0aaa7b6690a04f8da1615b41fef609bdce2497ce19"} Apr 20 17:48:38.784676 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:38.784643 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l85wm" event={"ID":"0c2d21f8-03e3-423b-a4e7-4ab1bd770001","Type":"ContainerStarted","Data":"45ca68433e2d990bac503c3bd681788a4a86e32e004ad40adcb1195982a1dddb"} Apr 20 17:48:38.786234 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:38.786109 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6tdx8" event={"ID":"774bfb19-2861-479a-a336-756a6a8d2926","Type":"ContainerStarted","Data":"4ae4bd5f7361603506e201c13501a81ebabfdc78da2a3f86f6dd370f7fb37210"} Apr 20 17:48:38.788070 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:38.788043 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6dlf2" event={"ID":"7b143fe4-9d02-4ed6-a139-f8b9c51e336d","Type":"ContainerStarted","Data":"30e36da086102f9ad9eb32a77ff043180cd695f805467c35e5e2aebf147bae9e"} Apr 20 17:48:38.790105 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:38.790079 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" event={"ID":"b5439cb9-e073-43b0-a160-d1509081e674","Type":"ContainerStarted","Data":"3456ee9dd83ade38c008d5dfe5449a6ed069d2ea75d51928eed420246cb8969f"} Apr 20 17:48:38.791636 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:38.791604 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6k2dk" event={"ID":"bbc16f1f-e425-42a6-9352-b92e465bc2c2","Type":"ContainerStarted","Data":"af4223bd17525b4579986a603d2efcb941679476782115b146057b1811f8a41c"} Apr 20 17:48:38.793195 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:38.793173 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" event={"ID":"2c38c27a-adb3-46fb-9409-cec659c7a3c1","Type":"ContainerStarted","Data":"fe6ef971fea0fa02788794cc0a0f69b9395446c68cc3d8fae88b8a174925a4fd"} Apr 20 17:48:38.793279 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:38.793202 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" event={"ID":"2c38c27a-adb3-46fb-9409-cec659c7a3c1","Type":"ContainerStarted","Data":"b5611f7074d0edc55f8b282eadbc774d088d30dc92f80f71fa8ec699ed7aea36"} Apr 20 17:48:38.794496 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:38.794471 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m2q9r" event={"ID":"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee","Type":"ContainerStarted","Data":"a4a0025ef93fea5c0094835b9a0f0343d669b3ae813e3556a00cccbd2ac44a47"} Apr 20 17:48:38.833729 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:38.833688 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6dlf2" podStartSLOduration=11.945056043 podStartE2EDuration="20.833671507s" podCreationTimestamp="2026-04-20 17:48:18 +0000 UTC" firstStartedPulling="2026-04-20 17:48:21.338832594 +0000 UTC m=+3.124187827" 
lastFinishedPulling="2026-04-20 17:48:30.227448061 +0000 UTC m=+12.012803291" observedRunningTime="2026-04-20 17:48:38.833647629 +0000 UTC m=+20.619002879" watchObservedRunningTime="2026-04-20 17:48:38.833671507 +0000 UTC m=+20.619026727" Apr 20 17:48:38.834212 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:38.834187 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-2r4zf" podStartSLOduration=3.8177751730000002 podStartE2EDuration="20.834178942s" podCreationTimestamp="2026-04-20 17:48:18 +0000 UTC" firstStartedPulling="2026-04-20 17:48:21.344789604 +0000 UTC m=+3.130144821" lastFinishedPulling="2026-04-20 17:48:38.361193371 +0000 UTC m=+20.146548590" observedRunningTime="2026-04-20 17:48:38.812684979 +0000 UTC m=+20.598040334" watchObservedRunningTime="2026-04-20 17:48:38.834178942 +0000 UTC m=+20.619534182" Apr 20 17:48:38.863926 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:38.863882 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6tdx8" podStartSLOduration=3.826840168 podStartE2EDuration="20.863869419s" podCreationTimestamp="2026-04-20 17:48:18 +0000 UTC" firstStartedPulling="2026-04-20 17:48:21.342335619 +0000 UTC m=+3.127690836" lastFinishedPulling="2026-04-20 17:48:38.379364858 +0000 UTC m=+20.164720087" observedRunningTime="2026-04-20 17:48:38.86351491 +0000 UTC m=+20.648870150" watchObservedRunningTime="2026-04-20 17:48:38.863869419 +0000 UTC m=+20.649224657" Apr 20 17:48:38.902631 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:38.902581 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6k2dk" podStartSLOduration=3.927476656 podStartE2EDuration="20.902567478s" podCreationTimestamp="2026-04-20 17:48:18 +0000 UTC" firstStartedPulling="2026-04-20 17:48:21.341333602 +0000 UTC m=+3.126688827" lastFinishedPulling="2026-04-20 17:48:38.316424433 +0000 UTC m=+20.101779649" 
observedRunningTime="2026-04-20 17:48:38.902013565 +0000 UTC m=+20.687368800" watchObservedRunningTime="2026-04-20 17:48:38.902567478 +0000 UTC m=+20.687922716" Apr 20 17:48:38.920186 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:38.920136 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-l85wm" podStartSLOduration=3.905576035 podStartE2EDuration="20.920120088s" podCreationTimestamp="2026-04-20 17:48:18 +0000 UTC" firstStartedPulling="2026-04-20 17:48:21.344842269 +0000 UTC m=+3.130197501" lastFinishedPulling="2026-04-20 17:48:38.359386327 +0000 UTC m=+20.144741554" observedRunningTime="2026-04-20 17:48:38.919380632 +0000 UTC m=+20.704735871" watchObservedRunningTime="2026-04-20 17:48:38.920120088 +0000 UTC m=+20.705475326" Apr 20 17:48:39.691129 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:39.690932 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr" Apr 20 17:48:39.691276 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:39.690967 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h7dtn" Apr 20 17:48:39.691276 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:39.691196 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9whfr" podUID="bd203605-978e-4434-8717-9734789f5af0" Apr 20 17:48:39.691341 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:39.691278 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h7dtn" podUID="542dd1d3-eb84-486c-a8ee-46b247e169f8" Apr 20 17:48:39.797614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:39.797577 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fdrzw" event={"ID":"72925cbc-8caa-4bb0-8945-2fb6210c31e7","Type":"ContainerStarted","Data":"95a8888124a804f8aa273f3d3c5cd5d8983476fc5c7bcb1b415d3c310489ab77"} Apr 20 17:48:39.800183 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:39.800160 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/ovn-acl-logging/0.log" Apr 20 17:48:39.800489 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:39.800461 2577 generic.go:358] "Generic (PLEG): container finished" podID="2c38c27a-adb3-46fb-9409-cec659c7a3c1" containerID="fe6ef971fea0fa02788794cc0a0f69b9395446c68cc3d8fae88b8a174925a4fd" exitCode=1 Apr 20 17:48:39.800568 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:39.800534 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" event={"ID":"2c38c27a-adb3-46fb-9409-cec659c7a3c1","Type":"ContainerDied","Data":"fe6ef971fea0fa02788794cc0a0f69b9395446c68cc3d8fae88b8a174925a4fd"} Apr 20 17:48:39.800623 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:39.800567 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" 
event={"ID":"2c38c27a-adb3-46fb-9409-cec659c7a3c1","Type":"ContainerStarted","Data":"d8723917fff4eca09b8f43760f97bcd49f35b4554fa6757546c5fdf45b3cb0ec"} Apr 20 17:48:39.800623 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:39.800581 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" event={"ID":"2c38c27a-adb3-46fb-9409-cec659c7a3c1","Type":"ContainerStarted","Data":"8419b9b9f5a37646cc5356ad50ee2be55531aacd772c51b895126cec396d037f"} Apr 20 17:48:39.800623 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:39.800596 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" event={"ID":"2c38c27a-adb3-46fb-9409-cec659c7a3c1","Type":"ContainerStarted","Data":"8f6fd648fda08c63561d4ff9dde9079d6ad8b64e860444b56528bd73954d0c28"} Apr 20 17:48:39.800623 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:39.800608 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" event={"ID":"2c38c27a-adb3-46fb-9409-cec659c7a3c1","Type":"ContainerStarted","Data":"cdcf86aee4fa2d139ae87768a13f75dbee9a9f4fbb9a28015a0ad129ff22a5e6"} Apr 20 17:48:39.801825 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:39.801802 2577 generic.go:358] "Generic (PLEG): container finished" podID="cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee" containerID="a4a0025ef93fea5c0094835b9a0f0343d669b3ae813e3556a00cccbd2ac44a47" exitCode=0 Apr 20 17:48:39.801917 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:39.801837 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m2q9r" event={"ID":"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee","Type":"ContainerDied","Data":"a4a0025ef93fea5c0094835b9a0f0343d669b3ae813e3556a00cccbd2ac44a47"} Apr 20 17:48:39.812245 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:39.812208 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-fdrzw" 
podStartSLOduration=4.798182507 podStartE2EDuration="21.812195765s" podCreationTimestamp="2026-04-20 17:48:18 +0000 UTC" firstStartedPulling="2026-04-20 17:48:21.345683054 +0000 UTC m=+3.131038271" lastFinishedPulling="2026-04-20 17:48:38.359696308 +0000 UTC m=+20.145051529" observedRunningTime="2026-04-20 17:48:39.811503041 +0000 UTC m=+21.596858279" watchObservedRunningTime="2026-04-20 17:48:39.812195765 +0000 UTC m=+21.597551004" Apr 20 17:48:39.950511 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:39.950468 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 17:48:40.664651 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:40.664559 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T17:48:39.950488344Z","UUID":"d92d8ab0-8a8b-48d8-a72e-e558e7137e33","Handler":null,"Name":"","Endpoint":""} Apr 20 17:48:40.668092 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:40.668062 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 17:48:40.668092 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:40.668095 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 17:48:40.691540 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:40.691510 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27" Apr 20 17:48:40.691688 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:40.691641 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-skq27" podUID="14ea9252-57e5-4e09-9c9e-33d96e94d56f" Apr 20 17:48:40.805766 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:40.805726 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" event={"ID":"b5439cb9-e073-43b0-a160-d1509081e674","Type":"ContainerStarted","Data":"e00d187ae4374aa50572422001464f77dc385c03d78fe036f42df4dc4ecc5736"} Apr 20 17:48:41.398810 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:41.398780 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-l85wm" Apr 20 17:48:41.399488 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:41.399470 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-l85wm" Apr 20 17:48:41.690766 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:41.690679 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr" Apr 20 17:48:41.690766 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:41.690679 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-h7dtn" Apr 20 17:48:41.690974 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:41.690805 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9whfr" podUID="bd203605-978e-4434-8717-9734789f5af0" Apr 20 17:48:41.690974 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:41.690923 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h7dtn" podUID="542dd1d3-eb84-486c-a8ee-46b247e169f8" Apr 20 17:48:41.809188 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:41.809150 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" event={"ID":"b5439cb9-e073-43b0-a160-d1509081e674","Type":"ContainerStarted","Data":"04d98a531e4c1a67cd00032d5eda2f4e7b0bf47cae528b7086d426f4ef794dd9"} Apr 20 17:48:41.812406 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:41.812388 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/ovn-acl-logging/0.log" Apr 20 17:48:41.812820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:41.812787 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" event={"ID":"2c38c27a-adb3-46fb-9409-cec659c7a3c1","Type":"ContainerStarted","Data":"ded17233ec1af977500be426d8419af3bd692c8a223e517584a41110e494794f"} Apr 20 
17:48:41.813023 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:41.812978 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-l85wm" Apr 20 17:48:41.813454 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:41.813437 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-l85wm" Apr 20 17:48:41.826704 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:41.826639 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648rj" podStartSLOduration=3.901294515 podStartE2EDuration="23.826628649s" podCreationTimestamp="2026-04-20 17:48:18 +0000 UTC" firstStartedPulling="2026-04-20 17:48:21.33839444 +0000 UTC m=+3.123749656" lastFinishedPulling="2026-04-20 17:48:41.263728556 +0000 UTC m=+23.049083790" observedRunningTime="2026-04-20 17:48:41.826461484 +0000 UTC m=+23.611816725" watchObservedRunningTime="2026-04-20 17:48:41.826628649 +0000 UTC m=+23.611983888" Apr 20 17:48:42.690832 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:42.690804 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27" Apr 20 17:48:42.691076 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:42.690938 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-skq27" podUID="14ea9252-57e5-4e09-9c9e-33d96e94d56f" Apr 20 17:48:43.691198 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:43.691124 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-h7dtn" Apr 20 17:48:43.691718 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:43.691124 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr" Apr 20 17:48:43.691718 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:43.691220 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h7dtn" podUID="542dd1d3-eb84-486c-a8ee-46b247e169f8" Apr 20 17:48:43.691718 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:43.691307 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9whfr" podUID="bd203605-978e-4434-8717-9734789f5af0" Apr 20 17:48:44.691347 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:44.691146 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27" Apr 20 17:48:44.691939 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:44.691446 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-skq27" podUID="14ea9252-57e5-4e09-9c9e-33d96e94d56f"
Apr 20 17:48:44.820691 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:44.820660 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/ovn-acl-logging/0.log"
Apr 20 17:48:44.821066 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:44.821029 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" event={"ID":"2c38c27a-adb3-46fb-9409-cec659c7a3c1","Type":"ContainerStarted","Data":"1704c0618d3a2f6d987247160c589f1a0feadf6c89285adc0196c02635b7e8ed"}
Apr 20 17:48:44.821460 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:44.821431 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f"
Apr 20 17:48:44.821460 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:44.821461 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f"
Apr 20 17:48:44.821611 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:44.821577 2577 scope.go:117] "RemoveContainer" containerID="fe6ef971fea0fa02788794cc0a0f69b9395446c68cc3d8fae88b8a174925a4fd"
Apr 20 17:48:44.822667 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:44.822646 2577 generic.go:358] "Generic (PLEG): container finished" podID="cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee" containerID="ec9276b69d34442bad3df41c1ddf4413ad32b1eb9ed2c1b12363d31f9b7cfed9" exitCode=0
Apr 20 17:48:44.822734 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:44.822676 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m2q9r" event={"ID":"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee","Type":"ContainerDied","Data":"ec9276b69d34442bad3df41c1ddf4413ad32b1eb9ed2c1b12363d31f9b7cfed9"}
Apr 20 17:48:44.836450 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:44.836407 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f"
Apr 20 17:48:45.611011 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:45.610966 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret\") pod \"global-pull-secret-syncer-h7dtn\" (UID: \"542dd1d3-eb84-486c-a8ee-46b247e169f8\") " pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:45.611133 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:45.611102 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 17:48:45.611206 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:45.611185 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret podName:542dd1d3-eb84-486c-a8ee-46b247e169f8 nodeName:}" failed. No retries permitted until 2026-04-20 17:49:01.611163331 +0000 UTC m=+43.396518563 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret") pod "global-pull-secret-syncer-h7dtn" (UID: "542dd1d3-eb84-486c-a8ee-46b247e169f8") : object "kube-system"/"original-pull-secret" not registered
Apr 20 17:48:45.691571 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:45.691533 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:45.692022 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:45.691533 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:45.692022 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:45.691665 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h7dtn" podUID="542dd1d3-eb84-486c-a8ee-46b247e169f8"
Apr 20 17:48:45.692022 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:45.691702 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9whfr" podUID="bd203605-978e-4434-8717-9734789f5af0"
Apr 20 17:48:45.824444 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:45.824356 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-h7dtn"]
Apr 20 17:48:45.827446 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:45.827412 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-skq27"]
Apr 20 17:48:45.827578 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:45.827525 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27"
Apr 20 17:48:45.827639 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:45.827619 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-skq27" podUID="14ea9252-57e5-4e09-9c9e-33d96e94d56f"
Apr 20 17:48:45.828861 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:45.828841 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/ovn-acl-logging/0.log"
Apr 20 17:48:45.829260 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:45.829227 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" event={"ID":"2c38c27a-adb3-46fb-9409-cec659c7a3c1","Type":"ContainerStarted","Data":"e612d93b44eccd55dca7d7a643ee0fae96fd4d0f016bdc4833dae168473875cd"}
Apr 20 17:48:45.829470 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:45.829454 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f"
Apr 20 17:48:45.831323 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:45.831302 2577 generic.go:358] "Generic (PLEG): container finished" podID="cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee" containerID="a3233a675fa6f6ae6b49c5e47099242fd9b4d0dd1f620162407d76251e9bbbb1" exitCode=0
Apr 20 17:48:45.831437 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:45.831361 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:45.831437 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:45.831370 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m2q9r" event={"ID":"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee","Type":"ContainerDied","Data":"a3233a675fa6f6ae6b49c5e47099242fd9b4d0dd1f620162407d76251e9bbbb1"}
Apr 20 17:48:45.831543 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:45.831515 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h7dtn" podUID="542dd1d3-eb84-486c-a8ee-46b247e169f8"
Apr 20 17:48:45.844170 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:45.844152 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f"
Apr 20 17:48:45.849645 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:45.849614 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9whfr"]
Apr 20 17:48:45.849757 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:45.849703 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:45.849823 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:45.849801 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9whfr" podUID="bd203605-978e-4434-8717-9734789f5af0"
Apr 20 17:48:45.907677 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:45.907632 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" podStartSLOduration=10.838806934 podStartE2EDuration="27.907615326s" podCreationTimestamp="2026-04-20 17:48:18 +0000 UTC" firstStartedPulling="2026-04-20 17:48:21.339009814 +0000 UTC m=+3.124365031" lastFinishedPulling="2026-04-20 17:48:38.407818194 +0000 UTC m=+20.193173423" observedRunningTime="2026-04-20 17:48:45.87800802 +0000 UTC m=+27.663363260" watchObservedRunningTime="2026-04-20 17:48:45.907615326 +0000 UTC m=+27.692970604"
Apr 20 17:48:46.835264 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:46.835155 2577 generic.go:358] "Generic (PLEG): container finished" podID="cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee" containerID="1caa51d4dd31cb370ace99bb500a9b71e8b55b676c5035ea02bd24e6075f4e38" exitCode=0
Apr 20 17:48:46.835264 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:46.835245 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m2q9r" event={"ID":"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee","Type":"ContainerDied","Data":"1caa51d4dd31cb370ace99bb500a9b71e8b55b676c5035ea02bd24e6075f4e38"}
Apr 20 17:48:47.691501 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:47.691435 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:47.691501 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:47.691487 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27"
Apr 20 17:48:47.691681 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:47.691604 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9whfr" podUID="bd203605-978e-4434-8717-9734789f5af0"
Apr 20 17:48:47.691732 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:47.691688 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-skq27" podUID="14ea9252-57e5-4e09-9c9e-33d96e94d56f"
Apr 20 17:48:47.691790 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:47.691756 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:47.691867 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:47.691838 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h7dtn" podUID="542dd1d3-eb84-486c-a8ee-46b247e169f8"
Apr 20 17:48:49.690919 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:49.690730 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:49.691359 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:49.690756 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:49.691359 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:49.691050 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h7dtn" podUID="542dd1d3-eb84-486c-a8ee-46b247e169f8"
Apr 20 17:48:49.691359 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:49.691129 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9whfr" podUID="bd203605-978e-4434-8717-9734789f5af0"
Apr 20 17:48:49.691359 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:49.690764 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27"
Apr 20 17:48:49.691359 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:49.691235 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-skq27" podUID="14ea9252-57e5-4e09-9c9e-33d96e94d56f"
Apr 20 17:48:51.613518 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.613490 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-49.ec2.internal" event="NodeReady"
Apr 20 17:48:51.614025 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.613632 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 17:48:51.657187 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.657152 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cxr2b"]
Apr 20 17:48:51.661931 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.661905 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hxnms"]
Apr 20 17:48:51.662118 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.662093 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:48:51.665633 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.665612 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hxnms"
Apr 20 17:48:51.666996 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.666959 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 17:48:51.667281 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.667266 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-82cw8\""
Apr 20 17:48:51.672419 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.672291 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 17:48:51.672561 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.672544 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 17:48:51.673538 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.673515 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cxr2b"]
Apr 20 17:48:51.675720 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.675690 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hxnms"]
Apr 20 17:48:51.675815 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.675747 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 17:48:51.678201 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.678178 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 17:48:51.678437 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.678420 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gkt2t\""
Apr 20 17:48:51.690633 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.690609 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h7dtn"
Apr 20 17:48:51.690753 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.690638 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27"
Apr 20 17:48:51.690753 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.690611 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:51.696409 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.696385 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 17:48:51.697856 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.697004 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 17:48:51.697856 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.697037 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ddm2v\""
Apr 20 17:48:51.697856 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.697285 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 17:48:51.702981 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.702960 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-swr7p\""
Apr 20 17:48:51.716048 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.716030 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 17:48:51.758873 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.758842 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-config-volume\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:48:51.759070 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.758884 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h64wb\" (UniqueName: \"kubernetes.io/projected/07e93993-3b0e-409c-9665-f091ef7a8e5a-kube-api-access-h64wb\") pod \"ingress-canary-hxnms\" (UID: \"07e93993-3b0e-409c-9665-f091ef7a8e5a\") " pod="openshift-ingress-canary/ingress-canary-hxnms"
Apr 20 17:48:51.759070 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.758978 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-tmp-dir\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:48:51.759070 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.759031 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert\") pod \"ingress-canary-hxnms\" (UID: \"07e93993-3b0e-409c-9665-f091ef7a8e5a\") " pod="openshift-ingress-canary/ingress-canary-hxnms"
Apr 20 17:48:51.759237 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.759118 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2b6d\" (UniqueName: \"kubernetes.io/projected/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-kube-api-access-h2b6d\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:48:51.759237 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.759154 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:48:51.860113 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.860082 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h64wb\" (UniqueName: \"kubernetes.io/projected/07e93993-3b0e-409c-9665-f091ef7a8e5a-kube-api-access-h64wb\") pod \"ingress-canary-hxnms\" (UID: \"07e93993-3b0e-409c-9665-f091ef7a8e5a\") " pod="openshift-ingress-canary/ingress-canary-hxnms"
Apr 20 17:48:51.860312 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.860134 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-tmp-dir\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:48:51.860312 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.860173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert\") pod \"ingress-canary-hxnms\" (UID: \"07e93993-3b0e-409c-9665-f091ef7a8e5a\") " pod="openshift-ingress-canary/ingress-canary-hxnms"
Apr 20 17:48:51.860312 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.860233 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2b6d\" (UniqueName: \"kubernetes.io/projected/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-kube-api-access-h2b6d\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:48:51.860479 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:51.860334 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 17:48:51.860479 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:51.860402 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert podName:07e93993-3b0e-409c-9665-f091ef7a8e5a nodeName:}" failed. No retries permitted until 2026-04-20 17:48:52.360382002 +0000 UTC m=+34.145737243 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert") pod "ingress-canary-hxnms" (UID: "07e93993-3b0e-409c-9665-f091ef7a8e5a") : secret "canary-serving-cert" not found
Apr 20 17:48:51.860479 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.860454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:48:51.860635 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.860507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-config-volume\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:48:51.860635 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.860526 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-tmp-dir\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:48:51.860635 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:51.860591 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 17:48:51.860758 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:51.860702 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls podName:bbbb6523-e9f1-4c90-a6e2-5288b46c08ad nodeName:}" failed. No retries permitted until 2026-04-20 17:48:52.360683744 +0000 UTC m=+34.146038978 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls") pod "dns-default-cxr2b" (UID: "bbbb6523-e9f1-4c90-a6e2-5288b46c08ad") : secret "dns-default-metrics-tls" not found
Apr 20 17:48:51.861020 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.861001 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-config-volume\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:48:51.871534 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.871478 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2b6d\" (UniqueName: \"kubernetes.io/projected/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-kube-api-access-h2b6d\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:48:51.871731 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:51.871704 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h64wb\" (UniqueName: \"kubernetes.io/projected/07e93993-3b0e-409c-9665-f091ef7a8e5a-kube-api-access-h64wb\") pod \"ingress-canary-hxnms\" (UID: \"07e93993-3b0e-409c-9665-f091ef7a8e5a\") " pod="openshift-ingress-canary/ingress-canary-hxnms"
Apr 20 17:48:52.365953 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:52.365911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert\") pod \"ingress-canary-hxnms\" (UID: \"07e93993-3b0e-409c-9665-f091ef7a8e5a\") " pod="openshift-ingress-canary/ingress-canary-hxnms"
Apr 20 17:48:52.365953 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:52.365960 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxj6\" (UniqueName: \"kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6\") pod \"network-check-target-9whfr\" (UID: \"bd203605-978e-4434-8717-9734789f5af0\") " pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:52.366198 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:52.366014 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:48:52.366198 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:52.366092 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 17:48:52.366198 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:52.366120 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 17:48:52.366198 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:52.366162 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert podName:07e93993-3b0e-409c-9665-f091ef7a8e5a nodeName:}" failed. No retries permitted until 2026-04-20 17:48:53.366145609 +0000 UTC m=+35.151500844 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert") pod "ingress-canary-hxnms" (UID: "07e93993-3b0e-409c-9665-f091ef7a8e5a") : secret "canary-serving-cert" not found
Apr 20 17:48:52.366198 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:52.366176 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls podName:bbbb6523-e9f1-4c90-a6e2-5288b46c08ad nodeName:}" failed. No retries permitted until 2026-04-20 17:48:53.366170272 +0000 UTC m=+35.151525488 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls") pod "dns-default-cxr2b" (UID: "bbbb6523-e9f1-4c90-a6e2-5288b46c08ad") : secret "dns-default-metrics-tls" not found
Apr 20 17:48:52.368484 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:52.368460 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhxj6\" (UniqueName: \"kubernetes.io/projected/bd203605-978e-4434-8717-9734789f5af0-kube-api-access-rhxj6\") pod \"network-check-target-9whfr\" (UID: \"bd203605-978e-4434-8717-9734789f5af0\") " pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:52.467112 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:52.467071 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs\") pod \"network-metrics-daemon-skq27\" (UID: \"14ea9252-57e5-4e09-9c9e-33d96e94d56f\") " pod="openshift-multus/network-metrics-daemon-skq27"
Apr 20 17:48:52.467291 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:52.467256 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 17:48:52.467360 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:52.467338 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs podName:14ea9252-57e5-4e09-9c9e-33d96e94d56f nodeName:}" failed. No retries permitted until 2026-04-20 17:49:24.467317598 +0000 UTC m=+66.252672831 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs") pod "network-metrics-daemon-skq27" (UID: "14ea9252-57e5-4e09-9c9e-33d96e94d56f") : secret "metrics-daemon-secret" not found
Apr 20 17:48:52.615272 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:52.615204 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:52.903981 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:52.903769 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9whfr"]
Apr 20 17:48:52.914911 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:48:52.914878 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd203605_978e_4434_8717_9734789f5af0.slice/crio-777d7d8042cf890934a2f5e9fafebd2d12c13cc60a789f63d3856aafe11e710f WatchSource:0}: Error finding container 777d7d8042cf890934a2f5e9fafebd2d12c13cc60a789f63d3856aafe11e710f: Status 404 returned error can't find the container with id 777d7d8042cf890934a2f5e9fafebd2d12c13cc60a789f63d3856aafe11e710f
Apr 20 17:48:53.373761 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:53.373717 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:48:53.373921 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:53.373790 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert\") pod \"ingress-canary-hxnms\" (UID: \"07e93993-3b0e-409c-9665-f091ef7a8e5a\") " pod="openshift-ingress-canary/ingress-canary-hxnms"
Apr 20 17:48:53.373921 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:53.373833 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 17:48:53.373921 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:53.373877 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 17:48:53.373921 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:53.373904 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls podName:bbbb6523-e9f1-4c90-a6e2-5288b46c08ad nodeName:}" failed. No retries permitted until 2026-04-20 17:48:55.37388369 +0000 UTC m=+37.159238911 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls") pod "dns-default-cxr2b" (UID: "bbbb6523-e9f1-4c90-a6e2-5288b46c08ad") : secret "dns-default-metrics-tls" not found
Apr 20 17:48:53.374084 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:53.373925 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert podName:07e93993-3b0e-409c-9665-f091ef7a8e5a nodeName:}" failed. No retries permitted until 2026-04-20 17:48:55.373914331 +0000 UTC m=+37.159269573 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert") pod "ingress-canary-hxnms" (UID: "07e93993-3b0e-409c-9665-f091ef7a8e5a") : secret "canary-serving-cert" not found
Apr 20 17:48:53.855634 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:53.855596 2577 generic.go:358] "Generic (PLEG): container finished" podID="cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee" containerID="26aeb76433d9f5009f1ec1fae8bf23bb8cd863ae79cc4c84b1a52fc523c20b03" exitCode=0
Apr 20 17:48:53.856156 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:53.855684 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m2q9r" event={"ID":"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee","Type":"ContainerDied","Data":"26aeb76433d9f5009f1ec1fae8bf23bb8cd863ae79cc4c84b1a52fc523c20b03"}
Apr 20 17:48:53.857097 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:53.857046 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9whfr" event={"ID":"bd203605-978e-4434-8717-9734789f5af0","Type":"ContainerStarted","Data":"777d7d8042cf890934a2f5e9fafebd2d12c13cc60a789f63d3856aafe11e710f"}
Apr 20 17:48:54.862696 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:54.862652 2577 generic.go:358] "Generic (PLEG): container finished" podID="cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee" containerID="565648cef7c0ebf14f7948bf579d52ac93dec9fbbd6b3e8c9123d208c45f0226" exitCode=0
Apr 20 17:48:54.862696 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:54.862697 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m2q9r" event={"ID":"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee","Type":"ContainerDied","Data":"565648cef7c0ebf14f7948bf579d52ac93dec9fbbd6b3e8c9123d208c45f0226"}
Apr 20 17:48:55.392496 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:55.392448 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert\") pod \"ingress-canary-hxnms\" (UID: \"07e93993-3b0e-409c-9665-f091ef7a8e5a\") " pod="openshift-ingress-canary/ingress-canary-hxnms"
Apr 20 17:48:55.392678 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:55.392535 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:48:55.392678 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:55.392625 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 17:48:55.392678 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:55.392657 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 17:48:55.392851 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:55.392706 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert podName:07e93993-3b0e-409c-9665-f091ef7a8e5a nodeName:}" failed. No retries permitted until 2026-04-20 17:48:59.392684921 +0000 UTC m=+41.178040138 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert") pod "ingress-canary-hxnms" (UID: "07e93993-3b0e-409c-9665-f091ef7a8e5a") : secret "canary-serving-cert" not found
Apr 20 17:48:55.392851 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:55.392725 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls podName:bbbb6523-e9f1-4c90-a6e2-5288b46c08ad nodeName:}" failed. No retries permitted until 2026-04-20 17:48:59.392715962 +0000 UTC m=+41.178071184 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls") pod "dns-default-cxr2b" (UID: "bbbb6523-e9f1-4c90-a6e2-5288b46c08ad") : secret "dns-default-metrics-tls" not found
Apr 20 17:48:55.867608 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:55.867419 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m2q9r" event={"ID":"cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee","Type":"ContainerStarted","Data":"4110d098441910f93be75cd53943203cf39b84de9b2ca17147afbf086d2bff79"}
Apr 20 17:48:55.868737 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:55.868716 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9whfr" event={"ID":"bd203605-978e-4434-8717-9734789f5af0","Type":"ContainerStarted","Data":"be0ed63caf1e96d75cbed459cc6df9db5f9f4a0a8d733f8d36c17a66ea2b5efa"}
Apr 20 17:48:55.868835 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:55.868825 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9whfr"
Apr 20 17:48:55.893992 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:55.893919 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-m2q9r" podStartSLOduration=6.465547077 podStartE2EDuration="37.893904788s" podCreationTimestamp="2026-04-20 17:48:18 +0000 UTC" firstStartedPulling="2026-04-20 17:48:21.334362037 +0000 UTC m=+3.119717261" lastFinishedPulling="2026-04-20 17:48:52.762719752 +0000 UTC m=+34.548074972" observedRunningTime="2026-04-20 17:48:55.892436879 +0000 UTC m=+37.677792118" watchObservedRunningTime="2026-04-20 17:48:55.893904788 +0000 UTC m=+37.679260027"
Apr 20 17:48:55.908159 ip-10-0-135-49 kubenswrapper[2577]:
I0420 17:48:55.908116 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9whfr" podStartSLOduration=35.085807066 podStartE2EDuration="37.908101631s" podCreationTimestamp="2026-04-20 17:48:18 +0000 UTC" firstStartedPulling="2026-04-20 17:48:52.917097475 +0000 UTC m=+34.702452697" lastFinishedPulling="2026-04-20 17:48:55.739392045 +0000 UTC m=+37.524747262" observedRunningTime="2026-04-20 17:48:55.907792125 +0000 UTC m=+37.693147377" watchObservedRunningTime="2026-04-20 17:48:55.908101631 +0000 UTC m=+37.693456872" Apr 20 17:48:59.419644 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:59.419603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b" Apr 20 17:48:59.419644 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:48:59.419667 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert\") pod \"ingress-canary-hxnms\" (UID: \"07e93993-3b0e-409c-9665-f091ef7a8e5a\") " pod="openshift-ingress-canary/ingress-canary-hxnms" Apr 20 17:48:59.420186 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:59.419750 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:48:59.420186 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:59.419752 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:48:59.420186 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:59.419801 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert 
podName:07e93993-3b0e-409c-9665-f091ef7a8e5a nodeName:}" failed. No retries permitted until 2026-04-20 17:49:07.419788141 +0000 UTC m=+49.205143357 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert") pod "ingress-canary-hxnms" (UID: "07e93993-3b0e-409c-9665-f091ef7a8e5a") : secret "canary-serving-cert" not found Apr 20 17:48:59.420186 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:48:59.419813 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls podName:bbbb6523-e9f1-4c90-a6e2-5288b46c08ad nodeName:}" failed. No retries permitted until 2026-04-20 17:49:07.419807577 +0000 UTC m=+49.205162793 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls") pod "dns-default-cxr2b" (UID: "bbbb6523-e9f1-4c90-a6e2-5288b46c08ad") : secret "dns-default-metrics-tls" not found Apr 20 17:49:01.634640 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:49:01.634593 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret\") pod \"global-pull-secret-syncer-h7dtn\" (UID: \"542dd1d3-eb84-486c-a8ee-46b247e169f8\") " pod="kube-system/global-pull-secret-syncer-h7dtn" Apr 20 17:49:01.638550 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:49:01.638514 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/542dd1d3-eb84-486c-a8ee-46b247e169f8-original-pull-secret\") pod \"global-pull-secret-syncer-h7dtn\" (UID: \"542dd1d3-eb84-486c-a8ee-46b247e169f8\") " pod="kube-system/global-pull-secret-syncer-h7dtn" Apr 20 17:49:01.901141 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:49:01.901056 2577 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h7dtn" Apr 20 17:49:02.014912 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:49:02.014882 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-h7dtn"] Apr 20 17:49:02.018322 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:49:02.018287 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod542dd1d3_eb84_486c_a8ee_46b247e169f8.slice/crio-6b883c5fa18465791c2fe5fbd0d0417355e4b45b15d78bf7e4bd331dcb6ffdfb WatchSource:0}: Error finding container 6b883c5fa18465791c2fe5fbd0d0417355e4b45b15d78bf7e4bd331dcb6ffdfb: Status 404 returned error can't find the container with id 6b883c5fa18465791c2fe5fbd0d0417355e4b45b15d78bf7e4bd331dcb6ffdfb Apr 20 17:49:02.883454 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:49:02.883411 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-h7dtn" event={"ID":"542dd1d3-eb84-486c-a8ee-46b247e169f8","Type":"ContainerStarted","Data":"6b883c5fa18465791c2fe5fbd0d0417355e4b45b15d78bf7e4bd331dcb6ffdfb"} Apr 20 17:49:06.893244 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:49:06.893211 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-h7dtn" event={"ID":"542dd1d3-eb84-486c-a8ee-46b247e169f8","Type":"ContainerStarted","Data":"7e228fed10a388d8019a43796dcc5cfdbe65d88153244e494e874268cea79b61"} Apr 20 17:49:06.908629 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:49:06.908585 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-h7dtn" podStartSLOduration=34.033553715 podStartE2EDuration="37.908570984s" podCreationTimestamp="2026-04-20 17:48:29 +0000 UTC" firstStartedPulling="2026-04-20 17:49:02.020030455 +0000 UTC m=+43.805385673" lastFinishedPulling="2026-04-20 17:49:05.895047715 
+0000 UTC m=+47.680402942" observedRunningTime="2026-04-20 17:49:06.908040448 +0000 UTC m=+48.693395700" watchObservedRunningTime="2026-04-20 17:49:06.908570984 +0000 UTC m=+48.693926222" Apr 20 17:49:07.475699 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:49:07.475662 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b" Apr 20 17:49:07.476011 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:49:07.475715 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert\") pod \"ingress-canary-hxnms\" (UID: \"07e93993-3b0e-409c-9665-f091ef7a8e5a\") " pod="openshift-ingress-canary/ingress-canary-hxnms" Apr 20 17:49:07.476011 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:49:07.475834 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:49:07.476011 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:49:07.475877 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:49:07.476011 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:49:07.475921 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls podName:bbbb6523-e9f1-4c90-a6e2-5288b46c08ad nodeName:}" failed. No retries permitted until 2026-04-20 17:49:23.475898176 +0000 UTC m=+65.261253410 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls") pod "dns-default-cxr2b" (UID: "bbbb6523-e9f1-4c90-a6e2-5288b46c08ad") : secret "dns-default-metrics-tls" not found Apr 20 17:49:07.476011 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:49:07.475943 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert podName:07e93993-3b0e-409c-9665-f091ef7a8e5a nodeName:}" failed. No retries permitted until 2026-04-20 17:49:23.47593222 +0000 UTC m=+65.261287456 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert") pod "ingress-canary-hxnms" (UID: "07e93993-3b0e-409c-9665-f091ef7a8e5a") : secret "canary-serving-cert" not found Apr 20 17:49:17.849137 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:49:17.849110 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mxb8f" Apr 20 17:49:23.488926 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:49:23.488874 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert\") pod \"ingress-canary-hxnms\" (UID: \"07e93993-3b0e-409c-9665-f091ef7a8e5a\") " pod="openshift-ingress-canary/ingress-canary-hxnms" Apr 20 17:49:23.488926 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:49:23.488943 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b" Apr 20 17:49:23.489504 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:49:23.489055 2577 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:49:23.489504 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:49:23.489065 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:49:23.489504 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:49:23.489115 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls podName:bbbb6523-e9f1-4c90-a6e2-5288b46c08ad nodeName:}" failed. No retries permitted until 2026-04-20 17:49:55.489101687 +0000 UTC m=+97.274456904 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls") pod "dns-default-cxr2b" (UID: "bbbb6523-e9f1-4c90-a6e2-5288b46c08ad") : secret "dns-default-metrics-tls" not found Apr 20 17:49:23.489504 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:49:23.489155 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert podName:07e93993-3b0e-409c-9665-f091ef7a8e5a nodeName:}" failed. No retries permitted until 2026-04-20 17:49:55.489134852 +0000 UTC m=+97.274490086 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert") pod "ingress-canary-hxnms" (UID: "07e93993-3b0e-409c-9665-f091ef7a8e5a") : secret "canary-serving-cert" not found Apr 20 17:49:24.494567 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:49:24.494507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs\") pod \"network-metrics-daemon-skq27\" (UID: \"14ea9252-57e5-4e09-9c9e-33d96e94d56f\") " pod="openshift-multus/network-metrics-daemon-skq27" Apr 20 17:49:24.494963 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:49:24.494654 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 17:49:24.494963 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:49:24.494731 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs podName:14ea9252-57e5-4e09-9c9e-33d96e94d56f nodeName:}" failed. No retries permitted until 2026-04-20 17:50:28.494715322 +0000 UTC m=+130.280070539 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs") pod "network-metrics-daemon-skq27" (UID: "14ea9252-57e5-4e09-9c9e-33d96e94d56f") : secret "metrics-daemon-secret" not found Apr 20 17:49:26.874039 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:49:26.873955 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9whfr" Apr 20 17:49:55.508497 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:49:55.508461 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert\") pod \"ingress-canary-hxnms\" (UID: \"07e93993-3b0e-409c-9665-f091ef7a8e5a\") " pod="openshift-ingress-canary/ingress-canary-hxnms" Apr 20 17:49:55.508871 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:49:55.508514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b" Apr 20 17:49:55.508871 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:49:55.508607 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:49:55.508871 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:49:55.508610 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:49:55.508871 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:49:55.508677 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert podName:07e93993-3b0e-409c-9665-f091ef7a8e5a nodeName:}" failed. 
No retries permitted until 2026-04-20 17:50:59.508662516 +0000 UTC m=+161.294017738 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert") pod "ingress-canary-hxnms" (UID: "07e93993-3b0e-409c-9665-f091ef7a8e5a") : secret "canary-serving-cert" not found Apr 20 17:49:55.508871 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:49:55.508691 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls podName:bbbb6523-e9f1-4c90-a6e2-5288b46c08ad nodeName:}" failed. No retries permitted until 2026-04-20 17:50:59.508685432 +0000 UTC m=+161.294040649 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls") pod "dns-default-cxr2b" (UID: "bbbb6523-e9f1-4c90-a6e2-5288b46c08ad") : secret "dns-default-metrics-tls" not found Apr 20 17:50:28.531785 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:28.531728 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs\") pod \"network-metrics-daemon-skq27\" (UID: \"14ea9252-57e5-4e09-9c9e-33d96e94d56f\") " pod="openshift-multus/network-metrics-daemon-skq27" Apr 20 17:50:28.532325 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:28.531874 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 17:50:28.532325 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:28.531949 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs podName:14ea9252-57e5-4e09-9c9e-33d96e94d56f nodeName:}" failed. 
No retries permitted until 2026-04-20 17:52:30.531933272 +0000 UTC m=+252.317288489 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs") pod "network-metrics-daemon-skq27" (UID: "14ea9252-57e5-4e09-9c9e-33d96e94d56f") : secret "metrics-daemon-secret" not found Apr 20 17:50:39.080363 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.080327 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s"] Apr 20 17:50:39.083178 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.083160 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.085490 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.085469 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 17:50:39.085669 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.085651 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 17:50:39.085736 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.085722 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-gt955\"" Apr 20 17:50:39.089577 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.089552 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 17:50:39.091703 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.091679 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 17:50:39.092105 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.092087 2577 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l"] Apr 20 17:50:39.094929 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.094915 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l" Apr 20 17:50:39.097432 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.097417 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 20 17:50:39.097524 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.097507 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-fvsvp\"" Apr 20 17:50:39.097564 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.097546 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 20 17:50:39.097599 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.097563 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 20 17:50:39.106043 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.106021 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s"] Apr 20 17:50:39.121234 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.121213 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l"] Apr 20 17:50:39.171353 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.171326 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-92b5k"] Apr 20 17:50:39.174152 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.174135 2577 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-92b5k" Apr 20 17:50:39.178758 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.178735 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds"] Apr 20 17:50:39.178870 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.178778 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-hgk8v\"" Apr 20 17:50:39.181382 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.181363 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8"] Apr 20 17:50:39.181500 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.181485 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds" Apr 20 17:50:39.184365 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.184342 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-5m2k9"] Apr 20 17:50:39.184478 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.184457 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8" Apr 20 17:50:39.184857 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.184839 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 20 17:50:39.184956 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.184859 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 20 17:50:39.184956 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.184915 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 20 17:50:39.185085 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.185051 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-pdmjl\"" Apr 20 17:50:39.185085 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.185081 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 20 17:50:39.187210 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.187192 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.188602 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.188185 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-92b5k"] Apr 20 17:50:39.188602 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.188205 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 17:50:39.188602 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.188273 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 20 17:50:39.188602 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.188329 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 17:50:39.189749 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.189732 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 20 17:50:39.190712 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.190694 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-t4nrf\"" Apr 20 17:50:39.190865 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.190770 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 20 17:50:39.190865 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.190795 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 17:50:39.190865 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.190849 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-insights\"/\"operator-dockercfg-pjdzw\"" Apr 20 17:50:39.191137 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.191119 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 20 17:50:39.191641 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.191622 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 17:50:39.196240 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.196219 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 20 17:50:39.199713 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.199692 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8"] Apr 20 17:50:39.202002 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.201963 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds"] Apr 20 17:50:39.202362 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.202342 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fef058a6-f6d9-40db-91c2-06542c24608b-trusted-ca\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.202487 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.202463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlgh8\" (UniqueName: \"kubernetes.io/projected/3bdd8663-80e7-4491-bb84-1d576b9b56cc-kube-api-access-qlgh8\") pod \"cluster-samples-operator-6dc5bdb6b4-hr49l\" (UID: \"3bdd8663-80e7-4491-bb84-1d576b9b56cc\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l" Apr 20 17:50:39.202580 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.202503 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.202580 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.202529 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fef058a6-f6d9-40db-91c2-06542c24608b-registry-certificates\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.202580 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.202574 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bdd8663-80e7-4491-bb84-1d576b9b56cc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-hr49l\" (UID: \"3bdd8663-80e7-4491-bb84-1d576b9b56cc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l" Apr 20 17:50:39.202737 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.202600 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fef058a6-f6d9-40db-91c2-06542c24608b-ca-trust-extracted\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.202737 ip-10-0-135-49 kubenswrapper[2577]: I0420 
17:50:39.202673 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fef058a6-f6d9-40db-91c2-06542c24608b-installation-pull-secrets\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.202737 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.202700 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pt2v\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-kube-api-access-2pt2v\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.202884 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.202739 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fef058a6-f6d9-40db-91c2-06542c24608b-image-registry-private-configuration\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.202884 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.202767 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-bound-sa-token\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.202884 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.202871 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-5m2k9"] Apr 
20 17:50:39.303945 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.303919 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2230508-9373-4dc2-b94b-53c90c805046-serving-cert\") pod \"insights-operator-585dfdc468-5m2k9\" (UID: \"f2230508-9373-4dc2-b94b-53c90c805046\") " pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.304076 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.303954 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fef058a6-f6d9-40db-91c2-06542c24608b-registry-certificates\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.304076 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.303978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bdd8663-80e7-4491-bb84-1d576b9b56cc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-hr49l\" (UID: \"3bdd8663-80e7-4491-bb84-1d576b9b56cc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l" Apr 20 17:50:39.304076 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304022 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fef058a6-f6d9-40db-91c2-06542c24608b-installation-pull-secrets\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.304250 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:39.304106 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" 
not found Apr 20 17:50:39.304250 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304156 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pt2v\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-kube-api-access-2pt2v\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.304250 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:39.304183 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bdd8663-80e7-4491-bb84-1d576b9b56cc-samples-operator-tls podName:3bdd8663-80e7-4491-bb84-1d576b9b56cc nodeName:}" failed. No retries permitted until 2026-04-20 17:50:39.80416164 +0000 UTC m=+141.589516876 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bdd8663-80e7-4491-bb84-1d576b9b56cc-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-hr49l" (UID: "3bdd8663-80e7-4491-bb84-1d576b9b56cc") : secret "samples-operator-tls" not found Apr 20 17:50:39.304250 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304223 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c75hb\" (UniqueName: \"kubernetes.io/projected/f2230508-9373-4dc2-b94b-53c90c805046-kube-api-access-c75hb\") pod \"insights-operator-585dfdc468-5m2k9\" (UID: \"f2230508-9373-4dc2-b94b-53c90c805046\") " pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.304460 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304257 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/50c85d10-1859-45c6-8b15-0c1dfc8e482e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-wnkt8\" (UID: 
\"50c85d10-1859-45c6-8b15-0c1dfc8e482e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8" Apr 20 17:50:39.304460 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304289 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fef058a6-f6d9-40db-91c2-06542c24608b-image-registry-private-configuration\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.304460 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304329 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wnkt8\" (UID: \"50c85d10-1859-45c6-8b15-0c1dfc8e482e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8" Apr 20 17:50:39.304460 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304367 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qlgh8\" (UniqueName: \"kubernetes.io/projected/3bdd8663-80e7-4491-bb84-1d576b9b56cc-kube-api-access-qlgh8\") pod \"cluster-samples-operator-6dc5bdb6b4-hr49l\" (UID: \"3bdd8663-80e7-4491-bb84-1d576b9b56cc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l" Apr 20 17:50:39.304460 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304397 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qfxb\" (UniqueName: \"kubernetes.io/projected/cffda971-4978-4e03-9ca8-a40379cc6cf6-kube-api-access-2qfxb\") pod \"network-check-source-8894fc9bd-92b5k\" (UID: \"cffda971-4978-4e03-9ca8-a40379cc6cf6\") " 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-92b5k" Apr 20 17:50:39.304460 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304429 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b09c964e-a54b-444a-bc53-320e8e6cabe7-config\") pod \"service-ca-operator-d6fc45fc5-kpmds\" (UID: \"b09c964e-a54b-444a-bc53-320e8e6cabe7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds" Apr 20 17:50:39.304793 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304459 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.304793 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304504 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8jqz\" (UniqueName: \"kubernetes.io/projected/50c85d10-1859-45c6-8b15-0c1dfc8e482e-kube-api-access-t8jqz\") pod \"cluster-monitoring-operator-75587bd455-wnkt8\" (UID: \"50c85d10-1859-45c6-8b15-0c1dfc8e482e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8" Apr 20 17:50:39.304793 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304534 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9s9q\" (UniqueName: \"kubernetes.io/projected/b09c964e-a54b-444a-bc53-320e8e6cabe7-kube-api-access-m9s9q\") pod \"service-ca-operator-d6fc45fc5-kpmds\" (UID: \"b09c964e-a54b-444a-bc53-320e8e6cabe7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds" Apr 20 17:50:39.304793 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:39.304545 2577 
projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 17:50:39.304793 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304547 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fef058a6-f6d9-40db-91c2-06542c24608b-registry-certificates\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.304793 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:39.304559 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s: secret "image-registry-tls" not found Apr 20 17:50:39.304793 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304565 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f2230508-9373-4dc2-b94b-53c90c805046-snapshots\") pod \"insights-operator-585dfdc468-5m2k9\" (UID: \"f2230508-9373-4dc2-b94b-53c90c805046\") " pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.304793 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:39.304600 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls podName:fef058a6-f6d9-40db-91c2-06542c24608b nodeName:}" failed. No retries permitted until 2026-04-20 17:50:39.804585761 +0000 UTC m=+141.589940986 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls") pod "image-registry-6bdc4cb5c9-tnj8s" (UID: "fef058a6-f6d9-40db-91c2-06542c24608b") : secret "image-registry-tls" not found Apr 20 17:50:39.304793 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304623 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fef058a6-f6d9-40db-91c2-06542c24608b-ca-trust-extracted\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.304793 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304679 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-bound-sa-token\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.304793 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304708 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2230508-9373-4dc2-b94b-53c90c805046-tmp\") pod \"insights-operator-585dfdc468-5m2k9\" (UID: \"f2230508-9373-4dc2-b94b-53c90c805046\") " pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.304793 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304752 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b09c964e-a54b-444a-bc53-320e8e6cabe7-serving-cert\") pod \"service-ca-operator-d6fc45fc5-kpmds\" (UID: \"b09c964e-a54b-444a-bc53-320e8e6cabe7\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds" Apr 20 17:50:39.305245 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304850 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fef058a6-f6d9-40db-91c2-06542c24608b-trusted-ca\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.305245 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304887 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2230508-9373-4dc2-b94b-53c90c805046-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-5m2k9\" (UID: \"f2230508-9373-4dc2-b94b-53c90c805046\") " pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.305245 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304920 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2230508-9373-4dc2-b94b-53c90c805046-service-ca-bundle\") pod \"insights-operator-585dfdc468-5m2k9\" (UID: \"f2230508-9373-4dc2-b94b-53c90c805046\") " pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.305245 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.304968 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fef058a6-f6d9-40db-91c2-06542c24608b-ca-trust-extracted\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.305631 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.305614 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/fef058a6-f6d9-40db-91c2-06542c24608b-trusted-ca\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.306406 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.306387 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fef058a6-f6d9-40db-91c2-06542c24608b-installation-pull-secrets\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.306501 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.306483 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fef058a6-f6d9-40db-91c2-06542c24608b-image-registry-private-configuration\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.316676 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.316650 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-bound-sa-token\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" Apr 20 17:50:39.324670 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.324651 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pt2v\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-kube-api-access-2pt2v\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" 
Apr 20 17:50:39.324964 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.324947 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlgh8\" (UniqueName: \"kubernetes.io/projected/3bdd8663-80e7-4491-bb84-1d576b9b56cc-kube-api-access-qlgh8\") pod \"cluster-samples-operator-6dc5bdb6b4-hr49l\" (UID: \"3bdd8663-80e7-4491-bb84-1d576b9b56cc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l" Apr 20 17:50:39.405950 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.405893 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c75hb\" (UniqueName: \"kubernetes.io/projected/f2230508-9373-4dc2-b94b-53c90c805046-kube-api-access-c75hb\") pod \"insights-operator-585dfdc468-5m2k9\" (UID: \"f2230508-9373-4dc2-b94b-53c90c805046\") " pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.405950 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.405920 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/50c85d10-1859-45c6-8b15-0c1dfc8e482e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-wnkt8\" (UID: \"50c85d10-1859-45c6-8b15-0c1dfc8e482e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8" Apr 20 17:50:39.405950 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.405941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wnkt8\" (UID: \"50c85d10-1859-45c6-8b15-0c1dfc8e482e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8" Apr 20 17:50:39.406172 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.405961 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-2qfxb\" (UniqueName: \"kubernetes.io/projected/cffda971-4978-4e03-9ca8-a40379cc6cf6-kube-api-access-2qfxb\") pod \"network-check-source-8894fc9bd-92b5k\" (UID: \"cffda971-4978-4e03-9ca8-a40379cc6cf6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-92b5k" Apr 20 17:50:39.406172 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.405978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b09c964e-a54b-444a-bc53-320e8e6cabe7-config\") pod \"service-ca-operator-d6fc45fc5-kpmds\" (UID: \"b09c964e-a54b-444a-bc53-320e8e6cabe7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds" Apr 20 17:50:39.406172 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.406028 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8jqz\" (UniqueName: \"kubernetes.io/projected/50c85d10-1859-45c6-8b15-0c1dfc8e482e-kube-api-access-t8jqz\") pod \"cluster-monitoring-operator-75587bd455-wnkt8\" (UID: \"50c85d10-1859-45c6-8b15-0c1dfc8e482e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8" Apr 20 17:50:39.406172 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.406047 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9s9q\" (UniqueName: \"kubernetes.io/projected/b09c964e-a54b-444a-bc53-320e8e6cabe7-kube-api-access-m9s9q\") pod \"service-ca-operator-d6fc45fc5-kpmds\" (UID: \"b09c964e-a54b-444a-bc53-320e8e6cabe7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds" Apr 20 17:50:39.406172 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.406064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f2230508-9373-4dc2-b94b-53c90c805046-snapshots\") pod \"insights-operator-585dfdc468-5m2k9\" (UID: 
\"f2230508-9373-4dc2-b94b-53c90c805046\") " pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.406172 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:39.406075 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 17:50:39.406172 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.406113 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2230508-9373-4dc2-b94b-53c90c805046-tmp\") pod \"insights-operator-585dfdc468-5m2k9\" (UID: \"f2230508-9373-4dc2-b94b-53c90c805046\") " pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.406172 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:39.406162 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls podName:50c85d10-1859-45c6-8b15-0c1dfc8e482e nodeName:}" failed. No retries permitted until 2026-04-20 17:50:39.906141909 +0000 UTC m=+141.691497126 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wnkt8" (UID: "50c85d10-1859-45c6-8b15-0c1dfc8e482e") : secret "cluster-monitoring-operator-tls" not found Apr 20 17:50:39.406610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.406202 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b09c964e-a54b-444a-bc53-320e8e6cabe7-serving-cert\") pod \"service-ca-operator-d6fc45fc5-kpmds\" (UID: \"b09c964e-a54b-444a-bc53-320e8e6cabe7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds" Apr 20 17:50:39.406610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.406350 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2230508-9373-4dc2-b94b-53c90c805046-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-5m2k9\" (UID: \"f2230508-9373-4dc2-b94b-53c90c805046\") " pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.406610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.406389 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2230508-9373-4dc2-b94b-53c90c805046-service-ca-bundle\") pod \"insights-operator-585dfdc468-5m2k9\" (UID: \"f2230508-9373-4dc2-b94b-53c90c805046\") " pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.406610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.406428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2230508-9373-4dc2-b94b-53c90c805046-serving-cert\") pod \"insights-operator-585dfdc468-5m2k9\" (UID: \"f2230508-9373-4dc2-b94b-53c90c805046\") " 
pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.406819 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.406725 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2230508-9373-4dc2-b94b-53c90c805046-tmp\") pod \"insights-operator-585dfdc468-5m2k9\" (UID: \"f2230508-9373-4dc2-b94b-53c90c805046\") " pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.406819 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.406747 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b09c964e-a54b-444a-bc53-320e8e6cabe7-config\") pod \"service-ca-operator-d6fc45fc5-kpmds\" (UID: \"b09c964e-a54b-444a-bc53-320e8e6cabe7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds" Apr 20 17:50:39.406819 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.406773 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f2230508-9373-4dc2-b94b-53c90c805046-snapshots\") pod \"insights-operator-585dfdc468-5m2k9\" (UID: \"f2230508-9373-4dc2-b94b-53c90c805046\") " pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.407069 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.407045 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/50c85d10-1859-45c6-8b15-0c1dfc8e482e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-wnkt8\" (UID: \"50c85d10-1859-45c6-8b15-0c1dfc8e482e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8" Apr 20 17:50:39.407182 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.407163 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f2230508-9373-4dc2-b94b-53c90c805046-service-ca-bundle\") pod \"insights-operator-585dfdc468-5m2k9\" (UID: \"f2230508-9373-4dc2-b94b-53c90c805046\") " pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.407331 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.407314 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2230508-9373-4dc2-b94b-53c90c805046-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-5m2k9\" (UID: \"f2230508-9373-4dc2-b94b-53c90c805046\") " pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.408421 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.408402 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b09c964e-a54b-444a-bc53-320e8e6cabe7-serving-cert\") pod \"service-ca-operator-d6fc45fc5-kpmds\" (UID: \"b09c964e-a54b-444a-bc53-320e8e6cabe7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds" Apr 20 17:50:39.408773 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.408751 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2230508-9373-4dc2-b94b-53c90c805046-serving-cert\") pod \"insights-operator-585dfdc468-5m2k9\" (UID: \"f2230508-9373-4dc2-b94b-53c90c805046\") " pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.418326 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.418306 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9s9q\" (UniqueName: \"kubernetes.io/projected/b09c964e-a54b-444a-bc53-320e8e6cabe7-kube-api-access-m9s9q\") pod \"service-ca-operator-d6fc45fc5-kpmds\" (UID: \"b09c964e-a54b-444a-bc53-320e8e6cabe7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds" Apr 20 17:50:39.418401 ip-10-0-135-49 
kubenswrapper[2577]: I0420 17:50:39.418348 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c75hb\" (UniqueName: \"kubernetes.io/projected/f2230508-9373-4dc2-b94b-53c90c805046-kube-api-access-c75hb\") pod \"insights-operator-585dfdc468-5m2k9\" (UID: \"f2230508-9373-4dc2-b94b-53c90c805046\") " pod="openshift-insights/insights-operator-585dfdc468-5m2k9" Apr 20 17:50:39.418709 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.418686 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qfxb\" (UniqueName: \"kubernetes.io/projected/cffda971-4978-4e03-9ca8-a40379cc6cf6-kube-api-access-2qfxb\") pod \"network-check-source-8894fc9bd-92b5k\" (UID: \"cffda971-4978-4e03-9ca8-a40379cc6cf6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-92b5k" Apr 20 17:50:39.419177 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.419161 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8jqz\" (UniqueName: \"kubernetes.io/projected/50c85d10-1859-45c6-8b15-0c1dfc8e482e-kube-api-access-t8jqz\") pod \"cluster-monitoring-operator-75587bd455-wnkt8\" (UID: \"50c85d10-1859-45c6-8b15-0c1dfc8e482e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8" Apr 20 17:50:39.483991 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.483970 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-92b5k" Apr 20 17:50:39.491611 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.491595 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds" Apr 20 17:50:39.505291 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.505270 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-5m2k9"
Apr 20 17:50:39.625901 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.625871 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-92b5k"]
Apr 20 17:50:39.628842 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:50:39.628815 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcffda971_4978_4e03_9ca8_a40379cc6cf6.slice/crio-792744e8085381c45c61d93f69623c4a6a5bc34884a5a275cd709ce5fd3683e9 WatchSource:0}: Error finding container 792744e8085381c45c61d93f69623c4a6a5bc34884a5a275cd709ce5fd3683e9: Status 404 returned error can't find the container with id 792744e8085381c45c61d93f69623c4a6a5bc34884a5a275cd709ce5fd3683e9
Apr 20 17:50:39.809405 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.809367 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bdd8663-80e7-4491-bb84-1d576b9b56cc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-hr49l\" (UID: \"3bdd8663-80e7-4491-bb84-1d576b9b56cc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l"
Apr 20 17:50:39.809576 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.809426 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s"
Apr 20 17:50:39.809576 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:39.809501 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 17:50:39.809576 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:39.809523 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 17:50:39.809576 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:39.809533 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s: secret "image-registry-tls" not found
Apr 20 17:50:39.809576 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:39.809578 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls podName:fef058a6-f6d9-40db-91c2-06542c24608b nodeName:}" failed. No retries permitted until 2026-04-20 17:50:40.809564159 +0000 UTC m=+142.594919376 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls") pod "image-registry-6bdc4cb5c9-tnj8s" (UID: "fef058a6-f6d9-40db-91c2-06542c24608b") : secret "image-registry-tls" not found
Apr 20 17:50:39.809752 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:39.809590 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bdd8663-80e7-4491-bb84-1d576b9b56cc-samples-operator-tls podName:3bdd8663-80e7-4491-bb84-1d576b9b56cc nodeName:}" failed. No retries permitted until 2026-04-20 17:50:40.809584107 +0000 UTC m=+142.594939324 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bdd8663-80e7-4491-bb84-1d576b9b56cc-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-hr49l" (UID: "3bdd8663-80e7-4491-bb84-1d576b9b56cc") : secret "samples-operator-tls" not found
Apr 20 17:50:39.844122 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.844097 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds"]
Apr 20 17:50:39.847014 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:50:39.846958 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb09c964e_a54b_444a_bc53_320e8e6cabe7.slice/crio-76f9cbfe00276dfbefc7c2f0f12a1ba50abc8c00ab5e1140da8c805e0429fe6f WatchSource:0}: Error finding container 76f9cbfe00276dfbefc7c2f0f12a1ba50abc8c00ab5e1140da8c805e0429fe6f: Status 404 returned error can't find the container with id 76f9cbfe00276dfbefc7c2f0f12a1ba50abc8c00ab5e1140da8c805e0429fe6f
Apr 20 17:50:39.847859 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.847804 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-5m2k9"]
Apr 20 17:50:39.851882 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:50:39.851859 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2230508_9373_4dc2_b94b_53c90c805046.slice/crio-cfe0bf3739ff008f339b95646171b439115cd58e9b8a4a1296d00c2a3cb849d9 WatchSource:0}: Error finding container cfe0bf3739ff008f339b95646171b439115cd58e9b8a4a1296d00c2a3cb849d9: Status 404 returned error can't find the container with id cfe0bf3739ff008f339b95646171b439115cd58e9b8a4a1296d00c2a3cb849d9
Apr 20 17:50:39.909769 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:39.909706 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wnkt8\" (UID: \"50c85d10-1859-45c6-8b15-0c1dfc8e482e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8"
Apr 20 17:50:39.909866 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:39.909836 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 17:50:39.909915 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:39.909888 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls podName:50c85d10-1859-45c6-8b15-0c1dfc8e482e nodeName:}" failed. No retries permitted until 2026-04-20 17:50:40.909873881 +0000 UTC m=+142.695229098 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wnkt8" (UID: "50c85d10-1859-45c6-8b15-0c1dfc8e482e") : secret "cluster-monitoring-operator-tls" not found
Apr 20 17:50:40.070706 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:40.070671 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5m2k9" event={"ID":"f2230508-9373-4dc2-b94b-53c90c805046","Type":"ContainerStarted","Data":"cfe0bf3739ff008f339b95646171b439115cd58e9b8a4a1296d00c2a3cb849d9"}
Apr 20 17:50:40.071713 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:40.071683 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds" event={"ID":"b09c964e-a54b-444a-bc53-320e8e6cabe7","Type":"ContainerStarted","Data":"76f9cbfe00276dfbefc7c2f0f12a1ba50abc8c00ab5e1140da8c805e0429fe6f"}
Apr 20 17:50:40.072895 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:40.072873 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-92b5k" event={"ID":"cffda971-4978-4e03-9ca8-a40379cc6cf6","Type":"ContainerStarted","Data":"1d7a024541d9aba50bc037735cc77a20bfd6fb8907773d83dc59b59c9fafcbb2"}
Apr 20 17:50:40.072999 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:40.072902 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-92b5k" event={"ID":"cffda971-4978-4e03-9ca8-a40379cc6cf6","Type":"ContainerStarted","Data":"792744e8085381c45c61d93f69623c4a6a5bc34884a5a275cd709ce5fd3683e9"}
Apr 20 17:50:40.102536 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:40.102499 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-92b5k" podStartSLOduration=1.102486892 podStartE2EDuration="1.102486892s" podCreationTimestamp="2026-04-20 17:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:50:40.102349882 +0000 UTC m=+141.887705132" watchObservedRunningTime="2026-04-20 17:50:40.102486892 +0000 UTC m=+141.887842131"
Apr 20 17:50:40.815467 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:40.815428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bdd8663-80e7-4491-bb84-1d576b9b56cc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-hr49l\" (UID: \"3bdd8663-80e7-4491-bb84-1d576b9b56cc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l"
Apr 20 17:50:40.815672 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:40.815508 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s"
Apr 20 17:50:40.815672 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:40.815648 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 17:50:40.815672 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:40.815655 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 17:50:40.815835 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:40.815740 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bdd8663-80e7-4491-bb84-1d576b9b56cc-samples-operator-tls podName:3bdd8663-80e7-4491-bb84-1d576b9b56cc nodeName:}" failed. No retries permitted until 2026-04-20 17:50:42.815718963 +0000 UTC m=+144.601074193 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bdd8663-80e7-4491-bb84-1d576b9b56cc-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-hr49l" (UID: "3bdd8663-80e7-4491-bb84-1d576b9b56cc") : secret "samples-operator-tls" not found
Apr 20 17:50:40.815835 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:40.815663 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s: secret "image-registry-tls" not found
Apr 20 17:50:40.815937 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:40.815841 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls podName:fef058a6-f6d9-40db-91c2-06542c24608b nodeName:}" failed. No retries permitted until 2026-04-20 17:50:42.815824884 +0000 UTC m=+144.601180105 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls") pod "image-registry-6bdc4cb5c9-tnj8s" (UID: "fef058a6-f6d9-40db-91c2-06542c24608b") : secret "image-registry-tls" not found
Apr 20 17:50:40.916482 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:40.916337 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wnkt8\" (UID: \"50c85d10-1859-45c6-8b15-0c1dfc8e482e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8"
Apr 20 17:50:40.916482 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:40.916414 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 17:50:40.916482 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:40.916488 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls podName:50c85d10-1859-45c6-8b15-0c1dfc8e482e nodeName:}" failed. No retries permitted until 2026-04-20 17:50:42.916468639 +0000 UTC m=+144.701823863 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wnkt8" (UID: "50c85d10-1859-45c6-8b15-0c1dfc8e482e") : secret "cluster-monitoring-operator-tls" not found
Apr 20 17:50:42.831318 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:42.831278 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s"
Apr 20 17:50:42.831743 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:42.831437 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 17:50:42.831743 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:42.831461 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s: secret "image-registry-tls" not found
Apr 20 17:50:42.831743 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:42.831492 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bdd8663-80e7-4491-bb84-1d576b9b56cc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-hr49l\" (UID: \"3bdd8663-80e7-4491-bb84-1d576b9b56cc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l"
Apr 20 17:50:42.831743 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:42.831519 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls podName:fef058a6-f6d9-40db-91c2-06542c24608b nodeName:}" failed. No retries permitted until 2026-04-20 17:50:46.831500711 +0000 UTC m=+148.616855930 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls") pod "image-registry-6bdc4cb5c9-tnj8s" (UID: "fef058a6-f6d9-40db-91c2-06542c24608b") : secret "image-registry-tls" not found
Apr 20 17:50:42.831743 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:42.831572 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 17:50:42.831743 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:42.831615 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bdd8663-80e7-4491-bb84-1d576b9b56cc-samples-operator-tls podName:3bdd8663-80e7-4491-bb84-1d576b9b56cc nodeName:}" failed. No retries permitted until 2026-04-20 17:50:46.831605555 +0000 UTC m=+148.616960776 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bdd8663-80e7-4491-bb84-1d576b9b56cc-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-hr49l" (UID: "3bdd8663-80e7-4491-bb84-1d576b9b56cc") : secret "samples-operator-tls" not found
Apr 20 17:50:42.932626 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:42.932557 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wnkt8\" (UID: \"50c85d10-1859-45c6-8b15-0c1dfc8e482e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8"
Apr 20 17:50:42.932750 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:42.932685 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 17:50:42.932750 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:42.932740 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls podName:50c85d10-1859-45c6-8b15-0c1dfc8e482e nodeName:}" failed. No retries permitted until 2026-04-20 17:50:46.932725111 +0000 UTC m=+148.718080327 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wnkt8" (UID: "50c85d10-1859-45c6-8b15-0c1dfc8e482e") : secret "cluster-monitoring-operator-tls" not found
Apr 20 17:50:43.081962 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:43.081923 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5m2k9" event={"ID":"f2230508-9373-4dc2-b94b-53c90c805046","Type":"ContainerStarted","Data":"f41c6d0d08e92cf91ad48c6c6f5762889aa1149f801b31c2bae2e83525937da7"}
Apr 20 17:50:43.083333 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:43.083304 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds" event={"ID":"b09c964e-a54b-444a-bc53-320e8e6cabe7","Type":"ContainerStarted","Data":"e6fbf7ccf46668d267644de3b02769c869678947a4d3ed6984f994f5a1c06264"}
Apr 20 17:50:43.102332 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:43.102284 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-5m2k9" podStartSLOduration=1.295174254 podStartE2EDuration="4.10226918s" podCreationTimestamp="2026-04-20 17:50:39 +0000 UTC" firstStartedPulling="2026-04-20 17:50:39.85348345 +0000 UTC m=+141.638838667" lastFinishedPulling="2026-04-20 17:50:42.660578372 +0000 UTC m=+144.445933593" observedRunningTime="2026-04-20 17:50:43.100390024 +0000 UTC m=+144.885745264" watchObservedRunningTime="2026-04-20 17:50:43.10226918 +0000 UTC m=+144.887624419"
Apr 20 17:50:46.170452 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:46.170423 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6k2dk_bbc16f1f-e425-42a6-9352-b92e465bc2c2/dns-node-resolver/0.log"
Apr 20 17:50:46.863314 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:46.863280 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bdd8663-80e7-4491-bb84-1d576b9b56cc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-hr49l\" (UID: \"3bdd8663-80e7-4491-bb84-1d576b9b56cc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l"
Apr 20 17:50:46.863484 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:46.863347 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s"
Apr 20 17:50:46.863484 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:46.863428 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 17:50:46.863555 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:46.863492 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bdd8663-80e7-4491-bb84-1d576b9b56cc-samples-operator-tls podName:3bdd8663-80e7-4491-bb84-1d576b9b56cc nodeName:}" failed. No retries permitted until 2026-04-20 17:50:54.863478119 +0000 UTC m=+156.648833342 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bdd8663-80e7-4491-bb84-1d576b9b56cc-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-hr49l" (UID: "3bdd8663-80e7-4491-bb84-1d576b9b56cc") : secret "samples-operator-tls" not found
Apr 20 17:50:46.863555 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:46.863430 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 17:50:46.863555 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:46.863533 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s: secret "image-registry-tls" not found
Apr 20 17:50:46.863654 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:46.863587 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls podName:fef058a6-f6d9-40db-91c2-06542c24608b nodeName:}" failed. No retries permitted until 2026-04-20 17:50:54.86357337 +0000 UTC m=+156.648928603 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls") pod "image-registry-6bdc4cb5c9-tnj8s" (UID: "fef058a6-f6d9-40db-91c2-06542c24608b") : secret "image-registry-tls" not found
Apr 20 17:50:46.963995 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:46.963956 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wnkt8\" (UID: \"50c85d10-1859-45c6-8b15-0c1dfc8e482e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8"
Apr 20 17:50:46.964148 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:46.964097 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 17:50:46.964189 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:46.964158 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls podName:50c85d10-1859-45c6-8b15-0c1dfc8e482e nodeName:}" failed. No retries permitted until 2026-04-20 17:50:54.964144004 +0000 UTC m=+156.749499221 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wnkt8" (UID: "50c85d10-1859-45c6-8b15-0c1dfc8e482e") : secret "cluster-monitoring-operator-tls" not found
Apr 20 17:50:46.971625 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:46.971605 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6dlf2_7b143fe4-9d02-4ed6-a139-f8b9c51e336d/node-ca/0.log"
Apr 20 17:50:54.674655 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:54.674608 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-cxr2b" podUID="bbbb6523-e9f1-4c90-a6e2-5288b46c08ad"
Apr 20 17:50:54.682750 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:54.682708 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-hxnms" podUID="07e93993-3b0e-409c-9665-f091ef7a8e5a"
Apr 20 17:50:54.708837 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:54.708806 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-skq27" podUID="14ea9252-57e5-4e09-9c9e-33d96e94d56f"
Apr 20 17:50:54.921628 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:54.921595 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bdd8663-80e7-4491-bb84-1d576b9b56cc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-hr49l\" (UID: \"3bdd8663-80e7-4491-bb84-1d576b9b56cc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l"
Apr 20 17:50:54.921791 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:54.921654 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s"
Apr 20 17:50:54.924014 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:54.923960 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bdd8663-80e7-4491-bb84-1d576b9b56cc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-hr49l\" (UID: \"3bdd8663-80e7-4491-bb84-1d576b9b56cc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l"
Apr 20 17:50:54.924014 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:54.923970 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls\") pod \"image-registry-6bdc4cb5c9-tnj8s\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") " pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s"
Apr 20 17:50:54.992947 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:54.992914 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s"
Apr 20 17:50:55.002722 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:55.002699 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l"
Apr 20 17:50:55.022825 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:55.022793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wnkt8\" (UID: \"50c85d10-1859-45c6-8b15-0c1dfc8e482e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8"
Apr 20 17:50:55.023038 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:55.023015 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 17:50:55.023118 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:50:55.023108 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls podName:50c85d10-1859-45c6-8b15-0c1dfc8e482e nodeName:}" failed. No retries permitted until 2026-04-20 17:51:11.023084417 +0000 UTC m=+172.808439641 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wnkt8" (UID: "50c85d10-1859-45c6-8b15-0c1dfc8e482e") : secret "cluster-monitoring-operator-tls" not found
Apr 20 17:50:55.112667 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:55.112646 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:50:55.112813 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:55.112646 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hxnms"
Apr 20 17:50:55.117170 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:55.116791 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds" podStartSLOduration=13.307447889 podStartE2EDuration="16.116774816s" podCreationTimestamp="2026-04-20 17:50:39 +0000 UTC" firstStartedPulling="2026-04-20 17:50:39.849392123 +0000 UTC m=+141.634747345" lastFinishedPulling="2026-04-20 17:50:42.658719053 +0000 UTC m=+144.444074272" observedRunningTime="2026-04-20 17:50:43.130472097 +0000 UTC m=+144.915827334" watchObservedRunningTime="2026-04-20 17:50:55.116774816 +0000 UTC m=+156.902130057"
Apr 20 17:50:55.117445 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:55.117424 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s"]
Apr 20 17:50:55.121487 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:50:55.121464 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfef058a6_f6d9_40db_91c2_06542c24608b.slice/crio-eb2c979530d5f9cadb7d3f80bfc819d3dc920aab81b512d533453a812ced1007 WatchSource:0}: Error finding container eb2c979530d5f9cadb7d3f80bfc819d3dc920aab81b512d533453a812ced1007: Status 404 returned error can't find the container with id eb2c979530d5f9cadb7d3f80bfc819d3dc920aab81b512d533453a812ced1007
Apr 20 17:50:55.128735 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:55.128714 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l"]
Apr 20 17:50:56.115971 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:56.115936 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l" event={"ID":"3bdd8663-80e7-4491-bb84-1d576b9b56cc","Type":"ContainerStarted","Data":"5e81fec095b0844c2f24d202c072122761d3bb2e30e5a97369e514f459f001a6"}
Apr 20 17:50:56.117177 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:56.117151 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" event={"ID":"fef058a6-f6d9-40db-91c2-06542c24608b","Type":"ContainerStarted","Data":"6c5585d3eea7da1b7a67510dd6b44e9c2af312a2b7ced2fa671b9d404c2b94f9"}
Apr 20 17:50:56.117273 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:56.117186 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" event={"ID":"fef058a6-f6d9-40db-91c2-06542c24608b","Type":"ContainerStarted","Data":"eb2c979530d5f9cadb7d3f80bfc819d3dc920aab81b512d533453a812ced1007"}
Apr 20 17:50:56.117326 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:56.117312 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s"
Apr 20 17:50:56.138429 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:56.138394 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" podStartSLOduration=17.138381001 podStartE2EDuration="17.138381001s" podCreationTimestamp="2026-04-20 17:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:50:56.1370787 +0000 UTC m=+157.922433935" watchObservedRunningTime="2026-04-20 17:50:56.138381001 +0000 UTC m=+157.923736256"
Apr 20 17:50:57.121366 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:57.121326 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l" event={"ID":"3bdd8663-80e7-4491-bb84-1d576b9b56cc","Type":"ContainerStarted","Data":"a276ee547de16a65177935c681d826ec263cbf50ac63276b8ebb5babfe66d2c2"}
Apr 20 17:50:57.121743 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:57.121407 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l" event={"ID":"3bdd8663-80e7-4491-bb84-1d576b9b56cc","Type":"ContainerStarted","Data":"18d645d5dc8395467e352a32c798619a8e495b1fce7124f6a939db53238d9891"}
Apr 20 17:50:57.137156 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:57.137098 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-hr49l" podStartSLOduration=16.348366333 podStartE2EDuration="18.137081534s" podCreationTimestamp="2026-04-20 17:50:39 +0000 UTC" firstStartedPulling="2026-04-20 17:50:55.171044117 +0000 UTC m=+156.956399334" lastFinishedPulling="2026-04-20 17:50:56.959759315 +0000 UTC m=+158.745114535" observedRunningTime="2026-04-20 17:50:57.136083298 +0000 UTC m=+158.921438562" watchObservedRunningTime="2026-04-20 17:50:57.137081534 +0000 UTC m=+158.922436776"
Apr 20 17:50:59.560870 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:59.560834 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert\") pod \"ingress-canary-hxnms\" (UID: \"07e93993-3b0e-409c-9665-f091ef7a8e5a\") " pod="openshift-ingress-canary/ingress-canary-hxnms"
Apr 20 17:50:59.561282 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:59.560888 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:50:59.563240 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:59.563217 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbbb6523-e9f1-4c90-a6e2-5288b46c08ad-metrics-tls\") pod \"dns-default-cxr2b\" (UID: \"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad\") " pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:50:59.563299 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:59.563278 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07e93993-3b0e-409c-9665-f091ef7a8e5a-cert\") pod \"ingress-canary-hxnms\" (UID: \"07e93993-3b0e-409c-9665-f091ef7a8e5a\") " pod="openshift-ingress-canary/ingress-canary-hxnms"
Apr 20 17:50:59.615921 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:59.615888 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gkt2t\""
Apr 20 17:50:59.616631 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:59.616614 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-82cw8\""
Apr 20 17:50:59.623412 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:59.623388 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hxnms"
Apr 20 17:50:59.623412 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:59.623406 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cxr2b"
Apr 20 17:50:59.750647 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:59.750610 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hxnms"]
Apr 20 17:50:59.754493 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:50:59.754468 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07e93993_3b0e_409c_9665_f091ef7a8e5a.slice/crio-54dc06d1737184e400d86ffa577e07eb972cd184916d02d2154ad384f36eb9ee WatchSource:0}: Error finding container 54dc06d1737184e400d86ffa577e07eb972cd184916d02d2154ad384f36eb9ee: Status 404 returned error can't find the container with id 54dc06d1737184e400d86ffa577e07eb972cd184916d02d2154ad384f36eb9ee
Apr 20 17:50:59.763657 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:50:59.763637 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cxr2b"]
Apr 20 17:50:59.766375 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:50:59.766344 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbbb6523_e9f1_4c90_a6e2_5288b46c08ad.slice/crio-10b20189c473825fc507fe77e7536897e8755447fab27bfa2f1a8c024c364081 WatchSource:0}: Error finding container 10b20189c473825fc507fe77e7536897e8755447fab27bfa2f1a8c024c364081: Status 404 returned error can't find the container with id 10b20189c473825fc507fe77e7536897e8755447fab27bfa2f1a8c024c364081
Apr 20 17:51:00.133024 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:00.132969 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hxnms" event={"ID":"07e93993-3b0e-409c-9665-f091ef7a8e5a","Type":"ContainerStarted","Data":"54dc06d1737184e400d86ffa577e07eb972cd184916d02d2154ad384f36eb9ee"}
Apr 20 17:51:00.133844 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:00.133821 2577 kubelet.go:2569] "SyncLoop
(PLEG): event for pod" pod="openshift-dns/dns-default-cxr2b" event={"ID":"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad","Type":"ContainerStarted","Data":"10b20189c473825fc507fe77e7536897e8755447fab27bfa2f1a8c024c364081"} Apr 20 17:51:02.140757 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:02.140659 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cxr2b" event={"ID":"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad","Type":"ContainerStarted","Data":"5ccc9468d4fecbea240cf5c6df14e8c0e97493b2aa59e34af7a9426999a6e634"} Apr 20 17:51:02.140757 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:02.140704 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cxr2b" event={"ID":"bbbb6523-e9f1-4c90-a6e2-5288b46c08ad","Type":"ContainerStarted","Data":"9e9763e49316489b90a3e9b44a4d38f17637a20f9a5b0a644fee6b6108c8f83a"} Apr 20 17:51:02.141187 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:02.140777 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-cxr2b" Apr 20 17:51:02.142042 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:02.142022 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hxnms" event={"ID":"07e93993-3b0e-409c-9665-f091ef7a8e5a","Type":"ContainerStarted","Data":"c9fec4b2c3e26be9002f2d3451e588ab9f676a84b4dae03b447193c97ea5bee6"} Apr 20 17:51:02.157841 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:02.157802 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cxr2b" podStartSLOduration=129.685003167 podStartE2EDuration="2m11.15779008s" podCreationTimestamp="2026-04-20 17:48:51 +0000 UTC" firstStartedPulling="2026-04-20 17:50:59.768094893 +0000 UTC m=+161.553450114" lastFinishedPulling="2026-04-20 17:51:01.240881807 +0000 UTC m=+163.026237027" observedRunningTime="2026-04-20 17:51:02.156609227 +0000 UTC m=+163.941964476" watchObservedRunningTime="2026-04-20 
17:51:02.15779008 +0000 UTC m=+163.943145319" Apr 20 17:51:02.172201 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:02.172161 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hxnms" podStartSLOduration=129.106308323 podStartE2EDuration="2m11.172148103s" podCreationTimestamp="2026-04-20 17:48:51 +0000 UTC" firstStartedPulling="2026-04-20 17:50:59.756552201 +0000 UTC m=+161.541907421" lastFinishedPulling="2026-04-20 17:51:01.822391977 +0000 UTC m=+163.607747201" observedRunningTime="2026-04-20 17:51:02.171904077 +0000 UTC m=+163.957259337" watchObservedRunningTime="2026-04-20 17:51:02.172148103 +0000 UTC m=+163.957503344" Apr 20 17:51:05.691336 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:05.691297 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27" Apr 20 17:51:07.088462 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.088423 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s"] Apr 20 17:51:07.137419 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.137386 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2z57x"] Apr 20 17:51:07.140716 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.140696 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2z57x" Apr 20 17:51:07.143165 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.143144 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-lvbdx\"" Apr 20 17:51:07.143283 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.143262 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 17:51:07.143365 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.143348 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 17:51:07.155767 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.155741 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2z57x"] Apr 20 17:51:07.219844 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.219810 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7ac46c7c-57b5-4d37-a219-aade31435133-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2z57x\" (UID: \"7ac46c7c-57b5-4d37-a219-aade31435133\") " pod="openshift-insights/insights-runtime-extractor-2z57x" Apr 20 17:51:07.219844 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.219847 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ac46c7c-57b5-4d37-a219-aade31435133-data-volume\") pod \"insights-runtime-extractor-2z57x\" (UID: \"7ac46c7c-57b5-4d37-a219-aade31435133\") " pod="openshift-insights/insights-runtime-extractor-2z57x" Apr 20 17:51:07.220109 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.219875 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7ac46c7c-57b5-4d37-a219-aade31435133-crio-socket\") pod \"insights-runtime-extractor-2z57x\" (UID: \"7ac46c7c-57b5-4d37-a219-aade31435133\") " pod="openshift-insights/insights-runtime-extractor-2z57x" Apr 20 17:51:07.220109 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.219941 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc52h\" (UniqueName: \"kubernetes.io/projected/7ac46c7c-57b5-4d37-a219-aade31435133-kube-api-access-fc52h\") pod \"insights-runtime-extractor-2z57x\" (UID: \"7ac46c7c-57b5-4d37-a219-aade31435133\") " pod="openshift-insights/insights-runtime-extractor-2z57x" Apr 20 17:51:07.220109 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.219972 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7ac46c7c-57b5-4d37-a219-aade31435133-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2z57x\" (UID: \"7ac46c7c-57b5-4d37-a219-aade31435133\") " pod="openshift-insights/insights-runtime-extractor-2z57x" Apr 20 17:51:07.320930 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.320893 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7ac46c7c-57b5-4d37-a219-aade31435133-crio-socket\") pod \"insights-runtime-extractor-2z57x\" (UID: \"7ac46c7c-57b5-4d37-a219-aade31435133\") " pod="openshift-insights/insights-runtime-extractor-2z57x" Apr 20 17:51:07.321134 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.320948 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fc52h\" (UniqueName: \"kubernetes.io/projected/7ac46c7c-57b5-4d37-a219-aade31435133-kube-api-access-fc52h\") pod \"insights-runtime-extractor-2z57x\" (UID: \"7ac46c7c-57b5-4d37-a219-aade31435133\") " 
pod="openshift-insights/insights-runtime-extractor-2z57x" Apr 20 17:51:07.321134 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.320974 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7ac46c7c-57b5-4d37-a219-aade31435133-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2z57x\" (UID: \"7ac46c7c-57b5-4d37-a219-aade31435133\") " pod="openshift-insights/insights-runtime-extractor-2z57x" Apr 20 17:51:07.321134 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.321050 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7ac46c7c-57b5-4d37-a219-aade31435133-crio-socket\") pod \"insights-runtime-extractor-2z57x\" (UID: \"7ac46c7c-57b5-4d37-a219-aade31435133\") " pod="openshift-insights/insights-runtime-extractor-2z57x" Apr 20 17:51:07.321134 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.321062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7ac46c7c-57b5-4d37-a219-aade31435133-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2z57x\" (UID: \"7ac46c7c-57b5-4d37-a219-aade31435133\") " pod="openshift-insights/insights-runtime-extractor-2z57x" Apr 20 17:51:07.321134 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.321084 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ac46c7c-57b5-4d37-a219-aade31435133-data-volume\") pod \"insights-runtime-extractor-2z57x\" (UID: \"7ac46c7c-57b5-4d37-a219-aade31435133\") " pod="openshift-insights/insights-runtime-extractor-2z57x" Apr 20 17:51:07.321412 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.321395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/7ac46c7c-57b5-4d37-a219-aade31435133-data-volume\") pod \"insights-runtime-extractor-2z57x\" (UID: \"7ac46c7c-57b5-4d37-a219-aade31435133\") " pod="openshift-insights/insights-runtime-extractor-2z57x" Apr 20 17:51:07.321662 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.321640 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7ac46c7c-57b5-4d37-a219-aade31435133-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2z57x\" (UID: \"7ac46c7c-57b5-4d37-a219-aade31435133\") " pod="openshift-insights/insights-runtime-extractor-2z57x" Apr 20 17:51:07.323316 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.323299 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7ac46c7c-57b5-4d37-a219-aade31435133-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2z57x\" (UID: \"7ac46c7c-57b5-4d37-a219-aade31435133\") " pod="openshift-insights/insights-runtime-extractor-2z57x" Apr 20 17:51:07.329373 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.329343 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc52h\" (UniqueName: \"kubernetes.io/projected/7ac46c7c-57b5-4d37-a219-aade31435133-kube-api-access-fc52h\") pod \"insights-runtime-extractor-2z57x\" (UID: \"7ac46c7c-57b5-4d37-a219-aade31435133\") " pod="openshift-insights/insights-runtime-extractor-2z57x" Apr 20 17:51:07.450800 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.450772 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2z57x" Apr 20 17:51:07.583840 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:07.583809 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2z57x"] Apr 20 17:51:07.587198 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:51:07.587173 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ac46c7c_57b5_4d37_a219_aade31435133.slice/crio-27d57728d1ede559985edceab0e3635d9d99bb46121dd0ede0f9a649553fa584 WatchSource:0}: Error finding container 27d57728d1ede559985edceab0e3635d9d99bb46121dd0ede0f9a649553fa584: Status 404 returned error can't find the container with id 27d57728d1ede559985edceab0e3635d9d99bb46121dd0ede0f9a649553fa584 Apr 20 17:51:08.158053 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:08.158016 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2z57x" event={"ID":"7ac46c7c-57b5-4d37-a219-aade31435133","Type":"ContainerStarted","Data":"4029e75c851308274c0be3009d2391e285dd9e8c83bc1e471b500c7f217c1201"} Apr 20 17:51:08.158053 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:08.158052 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2z57x" event={"ID":"7ac46c7c-57b5-4d37-a219-aade31435133","Type":"ContainerStarted","Data":"27d57728d1ede559985edceab0e3635d9d99bb46121dd0ede0f9a649553fa584"} Apr 20 17:51:09.162209 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:09.162172 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2z57x" event={"ID":"7ac46c7c-57b5-4d37-a219-aade31435133","Type":"ContainerStarted","Data":"4a1d5f5799fd1a4d48c9d417795ca9c6fe08e2e4410edbeee4df173a387c14f1"} Apr 20 17:51:10.166561 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:10.166481 2577 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-insights/insights-runtime-extractor-2z57x" event={"ID":"7ac46c7c-57b5-4d37-a219-aade31435133","Type":"ContainerStarted","Data":"0146d6e6775f2109b53153cb03d09ffad1538258c1ef337a3fbd67e5cf7cb6b0"} Apr 20 17:51:10.184430 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:10.184382 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2z57x" podStartSLOduration=0.966971737 podStartE2EDuration="3.184366773s" podCreationTimestamp="2026-04-20 17:51:07 +0000 UTC" firstStartedPulling="2026-04-20 17:51:07.638724184 +0000 UTC m=+169.424079401" lastFinishedPulling="2026-04-20 17:51:09.856119217 +0000 UTC m=+171.641474437" observedRunningTime="2026-04-20 17:51:10.182929812 +0000 UTC m=+171.968285064" watchObservedRunningTime="2026-04-20 17:51:10.184366773 +0000 UTC m=+171.969722008" Apr 20 17:51:11.053819 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:11.053778 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wnkt8\" (UID: \"50c85d10-1859-45c6-8b15-0c1dfc8e482e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8" Apr 20 17:51:11.056266 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:11.056241 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/50c85d10-1859-45c6-8b15-0c1dfc8e482e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wnkt8\" (UID: \"50c85d10-1859-45c6-8b15-0c1dfc8e482e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8" Apr 20 17:51:11.299495 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:11.299450 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8" Apr 20 17:51:11.412226 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:11.412193 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8"] Apr 20 17:51:11.415317 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:51:11.415291 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50c85d10_1859_45c6_8b15_0c1dfc8e482e.slice/crio-b23c6330e3224de3c1a2ec40280dfc4808bc1f616876c09f92cd4e870fdf28d0 WatchSource:0}: Error finding container b23c6330e3224de3c1a2ec40280dfc4808bc1f616876c09f92cd4e870fdf28d0: Status 404 returned error can't find the container with id b23c6330e3224de3c1a2ec40280dfc4808bc1f616876c09f92cd4e870fdf28d0 Apr 20 17:51:12.147104 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:12.147079 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cxr2b" Apr 20 17:51:12.173426 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:12.173379 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8" event={"ID":"50c85d10-1859-45c6-8b15-0c1dfc8e482e","Type":"ContainerStarted","Data":"b23c6330e3224de3c1a2ec40280dfc4808bc1f616876c09f92cd4e870fdf28d0"} Apr 20 17:51:13.764708 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:13.764673 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7pr62"] Apr 20 17:51:13.767687 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:13.767670 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7pr62" Apr 20 17:51:13.769927 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:13.769905 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 20 17:51:13.770127 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:13.770105 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-b97bt\"" Apr 20 17:51:13.777692 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:13.777671 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7pr62"] Apr 20 17:51:13.878231 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:13.878196 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/89e5f4b0-2504-4bc5-bac9-06cc9892666b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-7pr62\" (UID: \"89e5f4b0-2504-4bc5-bac9-06cc9892666b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7pr62" Apr 20 17:51:13.979542 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:13.979507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/89e5f4b0-2504-4bc5-bac9-06cc9892666b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-7pr62\" (UID: \"89e5f4b0-2504-4bc5-bac9-06cc9892666b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7pr62" Apr 20 17:51:13.979678 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:51:13.979624 2577 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" 
not found Apr 20 17:51:13.979678 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:51:13.979678 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89e5f4b0-2504-4bc5-bac9-06cc9892666b-tls-certificates podName:89e5f4b0-2504-4bc5-bac9-06cc9892666b nodeName:}" failed. No retries permitted until 2026-04-20 17:51:14.479664435 +0000 UTC m=+176.265019653 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/89e5f4b0-2504-4bc5-bac9-06cc9892666b-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-7pr62" (UID: "89e5f4b0-2504-4bc5-bac9-06cc9892666b") : secret "prometheus-operator-admission-webhook-tls" not found Apr 20 17:51:14.179530 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:14.179444 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8" event={"ID":"50c85d10-1859-45c6-8b15-0c1dfc8e482e","Type":"ContainerStarted","Data":"3e3aa05106458af147524f84302fe49d44446a9ee4aa12f523baa9ecec47ae74"} Apr 20 17:51:14.194526 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:14.194479 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wnkt8" podStartSLOduration=33.317827416 podStartE2EDuration="35.194463808s" podCreationTimestamp="2026-04-20 17:50:39 +0000 UTC" firstStartedPulling="2026-04-20 17:51:11.41712496 +0000 UTC m=+173.202480177" lastFinishedPulling="2026-04-20 17:51:13.293761352 +0000 UTC m=+175.079116569" observedRunningTime="2026-04-20 17:51:14.193346444 +0000 UTC m=+175.978701683" watchObservedRunningTime="2026-04-20 17:51:14.194463808 +0000 UTC m=+175.979819046" Apr 20 17:51:14.482706 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:14.482675 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/89e5f4b0-2504-4bc5-bac9-06cc9892666b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-7pr62\" (UID: \"89e5f4b0-2504-4bc5-bac9-06cc9892666b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7pr62" Apr 20 17:51:14.485225 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:14.485201 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/89e5f4b0-2504-4bc5-bac9-06cc9892666b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-7pr62\" (UID: \"89e5f4b0-2504-4bc5-bac9-06cc9892666b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7pr62" Apr 20 17:51:14.676051 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:14.676005 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7pr62" Apr 20 17:51:14.789798 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:14.789637 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7pr62"] Apr 20 17:51:14.792157 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:51:14.792122 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89e5f4b0_2504_4bc5_bac9_06cc9892666b.slice/crio-81ee8f7f76036ae5d443b95446a55cfb46e19cb03dc771d6680d4ffb86538732 WatchSource:0}: Error finding container 81ee8f7f76036ae5d443b95446a55cfb46e19cb03dc771d6680d4ffb86538732: Status 404 returned error can't find the container with id 81ee8f7f76036ae5d443b95446a55cfb46e19cb03dc771d6680d4ffb86538732 Apr 20 17:51:15.183060 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:15.182958 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7pr62" 
event={"ID":"89e5f4b0-2504-4bc5-bac9-06cc9892666b","Type":"ContainerStarted","Data":"81ee8f7f76036ae5d443b95446a55cfb46e19cb03dc771d6680d4ffb86538732"} Apr 20 17:51:16.186798 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:16.186765 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7pr62" event={"ID":"89e5f4b0-2504-4bc5-bac9-06cc9892666b","Type":"ContainerStarted","Data":"41782b53eb1d52e06fed527b28d68e0ebeb309f7ae38cdfece1f8ddce3445603"} Apr 20 17:51:16.187297 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:16.187015 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7pr62" Apr 20 17:51:16.191874 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:16.191851 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7pr62" Apr 20 17:51:16.202311 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:16.202267 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7pr62" podStartSLOduration=2.025957684 podStartE2EDuration="3.202252149s" podCreationTimestamp="2026-04-20 17:51:13 +0000 UTC" firstStartedPulling="2026-04-20 17:51:14.793910765 +0000 UTC m=+176.579265981" lastFinishedPulling="2026-04-20 17:51:15.970205209 +0000 UTC m=+177.755560446" observedRunningTime="2026-04-20 17:51:16.200525893 +0000 UTC m=+177.985881131" watchObservedRunningTime="2026-04-20 17:51:16.202252149 +0000 UTC m=+177.987607387" Apr 20 17:51:16.828450 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:16.828419 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ch9xb"] Apr 20 17:51:16.851047 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:16.851017 2577 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ch9xb"]
Apr 20 17:51:16.851218 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:16.851159 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-ch9xb"
Apr 20 17:51:16.853358 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:16.853322 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 20 17:51:16.853358 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:16.853343 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-hnmqh\""
Apr 20 17:51:16.853540 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:16.853327 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 20 17:51:16.854270 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:16.854253 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 17:51:17.002006 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:17.001957 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/574c60f9-5fff-4eb3-9e05-52b18b8d24b5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ch9xb\" (UID: \"574c60f9-5fff-4eb3-9e05-52b18b8d24b5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ch9xb"
Apr 20 17:51:17.002201 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:17.002020 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/574c60f9-5fff-4eb3-9e05-52b18b8d24b5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ch9xb\" (UID: \"574c60f9-5fff-4eb3-9e05-52b18b8d24b5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ch9xb"
Apr 20 17:51:17.002201 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:17.002088 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/574c60f9-5fff-4eb3-9e05-52b18b8d24b5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ch9xb\" (UID: \"574c60f9-5fff-4eb3-9e05-52b18b8d24b5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ch9xb"
Apr 20 17:51:17.002201 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:17.002125 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdbpz\" (UniqueName: \"kubernetes.io/projected/574c60f9-5fff-4eb3-9e05-52b18b8d24b5-kube-api-access-qdbpz\") pod \"prometheus-operator-5676c8c784-ch9xb\" (UID: \"574c60f9-5fff-4eb3-9e05-52b18b8d24b5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ch9xb"
Apr 20 17:51:17.093650 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:17.093572 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s"
Apr 20 17:51:17.103113 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:17.103086 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdbpz\" (UniqueName: \"kubernetes.io/projected/574c60f9-5fff-4eb3-9e05-52b18b8d24b5-kube-api-access-qdbpz\") pod \"prometheus-operator-5676c8c784-ch9xb\" (UID: \"574c60f9-5fff-4eb3-9e05-52b18b8d24b5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ch9xb"
Apr 20 17:51:17.103214 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:17.103130 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/574c60f9-5fff-4eb3-9e05-52b18b8d24b5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ch9xb\" (UID: \"574c60f9-5fff-4eb3-9e05-52b18b8d24b5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ch9xb"
Apr 20 17:51:17.103285 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:17.103261 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/574c60f9-5fff-4eb3-9e05-52b18b8d24b5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ch9xb\" (UID: \"574c60f9-5fff-4eb3-9e05-52b18b8d24b5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ch9xb"
Apr 20 17:51:17.103372 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:17.103358 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/574c60f9-5fff-4eb3-9e05-52b18b8d24b5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ch9xb\" (UID: \"574c60f9-5fff-4eb3-9e05-52b18b8d24b5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ch9xb"
Apr 20 17:51:17.103903 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:17.103881 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/574c60f9-5fff-4eb3-9e05-52b18b8d24b5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ch9xb\" (UID: \"574c60f9-5fff-4eb3-9e05-52b18b8d24b5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ch9xb"
Apr 20 17:51:17.105526 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:17.105497 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/574c60f9-5fff-4eb3-9e05-52b18b8d24b5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ch9xb\" (UID: \"574c60f9-5fff-4eb3-9e05-52b18b8d24b5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ch9xb"
Apr 20 17:51:17.105616 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:17.105530 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/574c60f9-5fff-4eb3-9e05-52b18b8d24b5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ch9xb\" (UID: \"574c60f9-5fff-4eb3-9e05-52b18b8d24b5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ch9xb"
Apr 20 17:51:17.131961 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:17.131931 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdbpz\" (UniqueName: \"kubernetes.io/projected/574c60f9-5fff-4eb3-9e05-52b18b8d24b5-kube-api-access-qdbpz\") pod \"prometheus-operator-5676c8c784-ch9xb\" (UID: \"574c60f9-5fff-4eb3-9e05-52b18b8d24b5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ch9xb"
Apr 20 17:51:17.159896 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:17.159860 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-ch9xb"
Apr 20 17:51:17.296818 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:17.296786 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ch9xb"]
Apr 20 17:51:17.300059 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:51:17.300035 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod574c60f9_5fff_4eb3_9e05_52b18b8d24b5.slice/crio-4ec78781aab1a9747dd0dc6f30c9e813d2d48bc8ce672172091983b2089dba9f WatchSource:0}: Error finding container 4ec78781aab1a9747dd0dc6f30c9e813d2d48bc8ce672172091983b2089dba9f: Status 404 returned error can't find the container with id 4ec78781aab1a9747dd0dc6f30c9e813d2d48bc8ce672172091983b2089dba9f
Apr 20 17:51:18.193508 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:18.193476 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ch9xb" event={"ID":"574c60f9-5fff-4eb3-9e05-52b18b8d24b5","Type":"ContainerStarted","Data":"4ec78781aab1a9747dd0dc6f30c9e813d2d48bc8ce672172091983b2089dba9f"}
Apr 20 17:51:19.198419 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:19.198379 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ch9xb" event={"ID":"574c60f9-5fff-4eb3-9e05-52b18b8d24b5","Type":"ContainerStarted","Data":"38395560518e43879989c0950ec2c3dca281adef00da15957f723a9ab7f27880"}
Apr 20 17:51:19.198851 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:19.198433 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ch9xb" event={"ID":"574c60f9-5fff-4eb3-9e05-52b18b8d24b5","Type":"ContainerStarted","Data":"c3da3f243560741d3d924c9e67431c6f116689e91422f663c0012c84d47497c2"}
Apr 20 17:51:20.216114 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:20.216064 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-ch9xb" podStartSLOduration=2.573233771 podStartE2EDuration="4.216049568s" podCreationTimestamp="2026-04-20 17:51:16 +0000 UTC" firstStartedPulling="2026-04-20 17:51:17.302021265 +0000 UTC m=+179.087376495" lastFinishedPulling="2026-04-20 17:51:18.944837072 +0000 UTC m=+180.730192292" observedRunningTime="2026-04-20 17:51:20.215238322 +0000 UTC m=+182.000593584" watchObservedRunningTime="2026-04-20 17:51:20.216049568 +0000 UTC m=+182.001404807"
Apr 20 17:51:22.211205 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.211171 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq"]
Apr 20 17:51:22.214762 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.214743 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq"
Apr 20 17:51:22.217087 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.217063 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-nx6sf\""
Apr 20 17:51:22.218072 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.218055 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 20 17:51:22.218123 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.218080 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 20 17:51:22.226732 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.226708 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq"]
Apr 20 17:51:22.245324 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.245294 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-lbj72"]
Apr 20 17:51:22.248769 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.248748 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.251395 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.251376 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 17:51:22.251690 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.251671 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 17:51:22.251772 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.251727 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-75n4m\""
Apr 20 17:51:22.252004 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.251970 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 17:51:22.344142 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.344108 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2f24fb25-fcc9-42ab-a59b-fc3368b09772-metrics-client-ca\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.344142 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.344146 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2f24fb25-fcc9-42ab-a59b-fc3368b09772-root\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.344371 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.344170 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-textfile\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.344371 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.344190 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-tls\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.344371 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.344235 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-accelerators-collector-config\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.344371 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.344274 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.344371 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.344334 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84ds7\" (UniqueName: \"kubernetes.io/projected/2f24fb25-fcc9-42ab-a59b-fc3368b09772-kube-api-access-84ds7\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.344539 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.344379 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ef3feee-bbc8-4c86-8037-031d564a48f4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8kwzq\" (UID: \"1ef3feee-bbc8-4c86-8037-031d564a48f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq"
Apr 20 17:51:22.344539 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.344407 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ef3feee-bbc8-4c86-8037-031d564a48f4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8kwzq\" (UID: \"1ef3feee-bbc8-4c86-8037-031d564a48f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq"
Apr 20 17:51:22.344539 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.344433 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2f24fb25-fcc9-42ab-a59b-fc3368b09772-sys\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.344539 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.344463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-485xw\" (UniqueName: \"kubernetes.io/projected/1ef3feee-bbc8-4c86-8037-031d564a48f4-kube-api-access-485xw\") pod \"openshift-state-metrics-9d44df66c-8kwzq\" (UID: \"1ef3feee-bbc8-4c86-8037-031d564a48f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq"
Apr 20 17:51:22.344539 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.344512 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-wtmp\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.344704 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.344552 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ef3feee-bbc8-4c86-8037-031d564a48f4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8kwzq\" (UID: \"1ef3feee-bbc8-4c86-8037-031d564a48f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq"
Apr 20 17:51:22.445377 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.445332 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2f24fb25-fcc9-42ab-a59b-fc3368b09772-metrics-client-ca\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.445377 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.445378 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2f24fb25-fcc9-42ab-a59b-fc3368b09772-root\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.445593 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.445408 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-textfile\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.445593 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.445470 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2f24fb25-fcc9-42ab-a59b-fc3368b09772-root\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.445593 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.445519 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-tls\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.445593 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.445554 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-accelerators-collector-config\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.445593 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.445577 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.445832 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.445604 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84ds7\" (UniqueName: \"kubernetes.io/projected/2f24fb25-fcc9-42ab-a59b-fc3368b09772-kube-api-access-84ds7\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.445832 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.445649 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ef3feee-bbc8-4c86-8037-031d564a48f4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8kwzq\" (UID: \"1ef3feee-bbc8-4c86-8037-031d564a48f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq"
Apr 20 17:51:22.445832 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.445669 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ef3feee-bbc8-4c86-8037-031d564a48f4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8kwzq\" (UID: \"1ef3feee-bbc8-4c86-8037-031d564a48f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq"
Apr 20 17:51:22.445832 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:51:22.445677 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 17:51:22.445832 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.445725 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2f24fb25-fcc9-42ab-a59b-fc3368b09772-sys\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.445832 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:51:22.445764 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-tls podName:2f24fb25-fcc9-42ab-a59b-fc3368b09772 nodeName:}" failed. No retries permitted until 2026-04-20 17:51:22.945725852 +0000 UTC m=+184.731081084 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-tls") pod "node-exporter-lbj72" (UID: "2f24fb25-fcc9-42ab-a59b-fc3368b09772") : secret "node-exporter-tls" not found
Apr 20 17:51:22.445832 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.445778 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2f24fb25-fcc9-42ab-a59b-fc3368b09772-sys\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.445832 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.445789 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-textfile\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.446274 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.445842 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-485xw\" (UniqueName: \"kubernetes.io/projected/1ef3feee-bbc8-4c86-8037-031d564a48f4-kube-api-access-485xw\") pod \"openshift-state-metrics-9d44df66c-8kwzq\" (UID: \"1ef3feee-bbc8-4c86-8037-031d564a48f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq"
Apr 20 17:51:22.446274 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.445885 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-wtmp\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.446274 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.445917 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ef3feee-bbc8-4c86-8037-031d564a48f4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8kwzq\" (UID: \"1ef3feee-bbc8-4c86-8037-031d564a48f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq"
Apr 20 17:51:22.446274 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.446061 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2f24fb25-fcc9-42ab-a59b-fc3368b09772-metrics-client-ca\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.446274 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.446200 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-wtmp\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.446517 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.446294 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-accelerators-collector-config\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.446572 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.446558 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ef3feee-bbc8-4c86-8037-031d564a48f4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8kwzq\" (UID: \"1ef3feee-bbc8-4c86-8037-031d564a48f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq"
Apr 20 17:51:22.448199 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.448173 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.448341 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.448217 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ef3feee-bbc8-4c86-8037-031d564a48f4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8kwzq\" (UID: \"1ef3feee-bbc8-4c86-8037-031d564a48f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq"
Apr 20 17:51:22.448444 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.448424 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ef3feee-bbc8-4c86-8037-031d564a48f4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8kwzq\" (UID: \"1ef3feee-bbc8-4c86-8037-031d564a48f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq"
Apr 20 17:51:22.455618 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.455598 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-qc6vc"]
Apr 20 17:51:22.464168 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.464109 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-qc6vc"
Apr 20 17:51:22.466968 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.466951 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 20 17:51:22.467150 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.467134 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-h5zt8\""
Apr 20 17:51:22.467275 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.467254 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 20 17:51:22.469249 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.469230 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84ds7\" (UniqueName: \"kubernetes.io/projected/2f24fb25-fcc9-42ab-a59b-fc3368b09772-kube-api-access-84ds7\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.473054 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.473031 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-qc6vc"]
Apr 20 17:51:22.475959 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.475936 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-485xw\" (UniqueName: \"kubernetes.io/projected/1ef3feee-bbc8-4c86-8037-031d564a48f4-kube-api-access-485xw\") pod \"openshift-state-metrics-9d44df66c-8kwzq\" (UID: \"1ef3feee-bbc8-4c86-8037-031d564a48f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq"
Apr 20 17:51:22.524138 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.524112 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq"
Apr 20 17:51:22.547192 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.547163 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk6gj\" (UniqueName: \"kubernetes.io/projected/6e1318cc-1767-485b-b8cc-a2fbce6fcf9a-kube-api-access-xk6gj\") pod \"downloads-6bcc868b7-qc6vc\" (UID: \"6e1318cc-1767-485b-b8cc-a2fbce6fcf9a\") " pod="openshift-console/downloads-6bcc868b7-qc6vc"
Apr 20 17:51:22.641388 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.641357 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq"]
Apr 20 17:51:22.644459 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:51:22.644433 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ef3feee_bbc8_4c86_8037_031d564a48f4.slice/crio-bbc271688ceb7873b94e69400a39343b1354a0dce1e9896246f0a5574b866017 WatchSource:0}: Error finding container bbc271688ceb7873b94e69400a39343b1354a0dce1e9896246f0a5574b866017: Status 404 returned error can't find the container with id bbc271688ceb7873b94e69400a39343b1354a0dce1e9896246f0a5574b866017
Apr 20 17:51:22.648478 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.648455 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xk6gj\" (UniqueName: \"kubernetes.io/projected/6e1318cc-1767-485b-b8cc-a2fbce6fcf9a-kube-api-access-xk6gj\") pod \"downloads-6bcc868b7-qc6vc\" (UID: \"6e1318cc-1767-485b-b8cc-a2fbce6fcf9a\") " pod="openshift-console/downloads-6bcc868b7-qc6vc"
Apr 20 17:51:22.659613 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.659593 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk6gj\" (UniqueName: \"kubernetes.io/projected/6e1318cc-1767-485b-b8cc-a2fbce6fcf9a-kube-api-access-xk6gj\") pod \"downloads-6bcc868b7-qc6vc\" (UID: \"6e1318cc-1767-485b-b8cc-a2fbce6fcf9a\") " pod="openshift-console/downloads-6bcc868b7-qc6vc"
Apr 20 17:51:22.784626 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.784592 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-qc6vc"
Apr 20 17:51:22.903576 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.903547 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-qc6vc"]
Apr 20 17:51:22.906661 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:51:22.906631 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e1318cc_1767_485b_b8cc_a2fbce6fcf9a.slice/crio-579af6c2618998d5837decd1ccc12d93111382dab387aa203cc178eaba8d88db WatchSource:0}: Error finding container 579af6c2618998d5837decd1ccc12d93111382dab387aa203cc178eaba8d88db: Status 404 returned error can't find the container with id 579af6c2618998d5837decd1ccc12d93111382dab387aa203cc178eaba8d88db
Apr 20 17:51:22.951541 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:22.951507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-tls\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72"
Apr 20 17:51:22.951667 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:51:22.951643 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 17:51:22.951729 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:51:22.951711 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-tls podName:2f24fb25-fcc9-42ab-a59b-fc3368b09772 nodeName:}" failed. No retries permitted until 2026-04-20 17:51:23.951691521 +0000 UTC m=+185.737046741 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-tls") pod "node-exporter-lbj72" (UID: "2f24fb25-fcc9-42ab-a59b-fc3368b09772") : secret "node-exporter-tls" not found
Apr 20 17:51:23.210958 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.210921 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq" event={"ID":"1ef3feee-bbc8-4c86-8037-031d564a48f4","Type":"ContainerStarted","Data":"1b4135bfce637ff9490d8579fd3be90b11b970f3534912ffe42cd2b0e599f185"}
Apr 20 17:51:23.210958 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.210966 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq" event={"ID":"1ef3feee-bbc8-4c86-8037-031d564a48f4","Type":"ContainerStarted","Data":"29c3a7dae5ad193dba33e0f8b83a8b1808b0c7fbece2acc5c53e618207424ed4"}
Apr 20 17:51:23.211582 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.210980 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq" event={"ID":"1ef3feee-bbc8-4c86-8037-031d564a48f4","Type":"ContainerStarted","Data":"bbc271688ceb7873b94e69400a39343b1354a0dce1e9896246f0a5574b866017"}
Apr 20 17:51:23.211946 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.211921 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-qc6vc" event={"ID":"6e1318cc-1767-485b-b8cc-a2fbce6fcf9a","Type":"ContainerStarted","Data":"579af6c2618998d5837decd1ccc12d93111382dab387aa203cc178eaba8d88db"}
Apr 20 17:51:23.255152 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.255116 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 17:51:23.260435 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.260413 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:51:23.262787 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.262765 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 20 17:51:23.262896 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.262765 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 20 17:51:23.263063 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.263043 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 20 17:51:23.263195 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.263102 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 20 17:51:23.263195 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.263101 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 20 17:51:23.263195 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.263168 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 20 17:51:23.263355 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.263217 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-qvclr\""
Apr 20 17:51:23.263355 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.263244 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 20 17:51:23.263355 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.263304 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 20 17:51:23.263355 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.263315 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 20 17:51:23.277046 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.276345 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 17:51:23.355269 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.355188 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg6h6\" (UniqueName: \"kubernetes.io/projected/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-kube-api-access-tg6h6\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:51:23.355269 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.355234 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-config-volume\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:51:23.355489 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.355277 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:51:23.355489 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.355314 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.355489 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.355353 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-web-config\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.355489 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.355435 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.355489 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.355484 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.355784 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.355514 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.355784 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.355560 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.355784 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.355601 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.355784 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.355649 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.355784 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.355711 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.355784 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.355758 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-config-out\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.456887 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.456848 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.457059 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.456905 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.457059 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.456954 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.457059 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.457019 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.457059 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.457049 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-config-out\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.457294 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.457078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tg6h6\" (UniqueName: \"kubernetes.io/projected/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-kube-api-access-tg6h6\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.457294 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.457108 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-config-volume\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.457294 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.457140 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.457294 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.457178 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.457294 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.457218 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-web-config\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.457294 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.457244 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.457294 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.457272 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.457646 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.457300 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.458112 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.457689 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.459314 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.459287 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.461168 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.461095 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.461168 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.461127 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-web-config\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.461587 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.461561 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.465226 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.465170 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.466061 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.466017 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.466665 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.466577 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-config-volume\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.466761 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.466666 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.467289 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.467264 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.467850 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.467828 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.468412 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.468389 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg6h6\" (UniqueName: \"kubernetes.io/projected/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-kube-api-access-tg6h6\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.469125 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.469069 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-config-out\") pod \"alertmanager-main-0\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.570418 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.570379 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:51:23.951641 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.951603 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 17:51:23.956239 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:51:23.956195 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd5c0166_30aa_4c85_84ad_28d9a7223bf1.slice/crio-eccc7b9785ff358da15f9ad9a2595b400fd846194755eb34dae54af1f8caf3f4 WatchSource:0}: Error finding container eccc7b9785ff358da15f9ad9a2595b400fd846194755eb34dae54af1f8caf3f4: Status 404 returned error can't find the container with id eccc7b9785ff358da15f9ad9a2595b400fd846194755eb34dae54af1f8caf3f4 Apr 20 17:51:23.963292 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.962923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-tls\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72" Apr 20 17:51:23.966790 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:23.966742 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2f24fb25-fcc9-42ab-a59b-fc3368b09772-node-exporter-tls\") pod \"node-exporter-lbj72\" (UID: \"2f24fb25-fcc9-42ab-a59b-fc3368b09772\") " pod="openshift-monitoring/node-exporter-lbj72" Apr 20 17:51:24.057319 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:24.057240 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-lbj72" Apr 20 17:51:24.068427 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:51:24.068394 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f24fb25_fcc9_42ab_a59b_fc3368b09772.slice/crio-69ed35c31486eaea6b9a23be08071230926fee5cd0a811b009c4869dd38836a8 WatchSource:0}: Error finding container 69ed35c31486eaea6b9a23be08071230926fee5cd0a811b009c4869dd38836a8: Status 404 returned error can't find the container with id 69ed35c31486eaea6b9a23be08071230926fee5cd0a811b009c4869dd38836a8 Apr 20 17:51:24.217701 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:24.217652 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq" event={"ID":"1ef3feee-bbc8-4c86-8037-031d564a48f4","Type":"ContainerStarted","Data":"f0b812420ba11dcb73a1cfbf558bcc961a6609da0edb5a844216c5a91bee22ef"} Apr 20 17:51:24.219065 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:24.219013 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd5c0166-30aa-4c85-84ad-28d9a7223bf1","Type":"ContainerStarted","Data":"eccc7b9785ff358da15f9ad9a2595b400fd846194755eb34dae54af1f8caf3f4"} Apr 20 17:51:24.220344 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:24.220306 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lbj72" event={"ID":"2f24fb25-fcc9-42ab-a59b-fc3368b09772","Type":"ContainerStarted","Data":"69ed35c31486eaea6b9a23be08071230926fee5cd0a811b009c4869dd38836a8"} Apr 20 17:51:24.237075 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:24.237029 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kwzq" podStartSLOduration=1.15279922 podStartE2EDuration="2.237015344s" podCreationTimestamp="2026-04-20 17:51:22 +0000 UTC" 
firstStartedPulling="2026-04-20 17:51:22.756470807 +0000 UTC m=+184.541826029" lastFinishedPulling="2026-04-20 17:51:23.840686922 +0000 UTC m=+185.626042153" observedRunningTime="2026-04-20 17:51:24.235622137 +0000 UTC m=+186.020977388" watchObservedRunningTime="2026-04-20 17:51:24.237015344 +0000 UTC m=+186.022370582" Apr 20 17:51:26.228968 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.228931 2577 generic.go:358] "Generic (PLEG): container finished" podID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerID="5e7d081f6473c372549f7e10ca5d9b5d8d9dd723708d04c524e73b202ba09746" exitCode=0 Apr 20 17:51:26.229563 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.229029 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd5c0166-30aa-4c85-84ad-28d9a7223bf1","Type":"ContainerDied","Data":"5e7d081f6473c372549f7e10ca5d9b5d8d9dd723708d04c524e73b202ba09746"} Apr 20 17:51:26.231101 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.231036 2577 generic.go:358] "Generic (PLEG): container finished" podID="2f24fb25-fcc9-42ab-a59b-fc3368b09772" containerID="b791f786afc863e37f4a1e662763563d15d186ca80ef27a5760af0bd25173e60" exitCode=0 Apr 20 17:51:26.231284 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.231105 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lbj72" event={"ID":"2f24fb25-fcc9-42ab-a59b-fc3368b09772","Type":"ContainerDied","Data":"b791f786afc863e37f4a1e662763563d15d186ca80ef27a5760af0bd25173e60"} Apr 20 17:51:26.604397 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.604313 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-59bcd769fc-52sq2"] Apr 20 17:51:26.607740 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.607711 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2" Apr 20 17:51:26.610095 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.610069 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 20 17:51:26.610226 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.610126 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 20 17:51:26.610226 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.610165 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 20 17:51:26.610226 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.610204 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-5mw9m\"" Apr 20 17:51:26.611137 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.611119 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-70na9hvpfglh7\"" Apr 20 17:51:26.611190 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.611135 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 17:51:26.615752 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.615730 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-59bcd769fc-52sq2"] Apr 20 17:51:26.689056 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.689022 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a74d5e0-5721-4316-a286-11a620733ba1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: 
\"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2" Apr 20 17:51:26.689249 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.689108 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46g5m\" (UniqueName: \"kubernetes.io/projected/8a74d5e0-5721-4316-a286-11a620733ba1-kube-api-access-46g5m\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2" Apr 20 17:51:26.689249 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.689140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8a74d5e0-5721-4316-a286-11a620733ba1-metrics-server-audit-profiles\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2" Apr 20 17:51:26.689249 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.689176 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8a74d5e0-5721-4316-a286-11a620733ba1-secret-metrics-server-tls\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2" Apr 20 17:51:26.689249 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.689200 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a74d5e0-5721-4316-a286-11a620733ba1-client-ca-bundle\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2" Apr 20 17:51:26.689442 ip-10-0-135-49 kubenswrapper[2577]: 
I0420 17:51:26.689254 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8a74d5e0-5721-4316-a286-11a620733ba1-audit-log\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2"
Apr 20 17:51:26.689442 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.689292 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8a74d5e0-5721-4316-a286-11a620733ba1-secret-metrics-server-client-certs\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2"
Apr 20 17:51:26.790209 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.790177 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8a74d5e0-5721-4316-a286-11a620733ba1-metrics-server-audit-profiles\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2"
Apr 20 17:51:26.790385 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.790229 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8a74d5e0-5721-4316-a286-11a620733ba1-secret-metrics-server-tls\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2"
Apr 20 17:51:26.790385 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.790257 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a74d5e0-5721-4316-a286-11a620733ba1-client-ca-bundle\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2"
Apr 20 17:51:26.790385 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.790307 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8a74d5e0-5721-4316-a286-11a620733ba1-audit-log\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2"
Apr 20 17:51:26.790385 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.790337 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8a74d5e0-5721-4316-a286-11a620733ba1-secret-metrics-server-client-certs\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2"
Apr 20 17:51:26.790557 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.790384 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a74d5e0-5721-4316-a286-11a620733ba1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2"
Apr 20 17:51:26.790557 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.790451 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46g5m\" (UniqueName: \"kubernetes.io/projected/8a74d5e0-5721-4316-a286-11a620733ba1-kube-api-access-46g5m\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2"
Apr 20 17:51:26.790797 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.790761 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8a74d5e0-5721-4316-a286-11a620733ba1-audit-log\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2"
Apr 20 17:51:26.791395 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.791359 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8a74d5e0-5721-4316-a286-11a620733ba1-metrics-server-audit-profiles\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2"
Apr 20 17:51:26.791815 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.791792 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a74d5e0-5721-4316-a286-11a620733ba1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2"
Apr 20 17:51:26.793365 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.793325 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8a74d5e0-5721-4316-a286-11a620733ba1-secret-metrics-server-client-certs\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2"
Apr 20 17:51:26.794078 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.794056 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a74d5e0-5721-4316-a286-11a620733ba1-client-ca-bundle\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2"
Apr 20 17:51:26.794508 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.794485 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8a74d5e0-5721-4316-a286-11a620733ba1-secret-metrics-server-tls\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2"
Apr 20 17:51:26.799898 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.799856 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46g5m\" (UniqueName: \"kubernetes.io/projected/8a74d5e0-5721-4316-a286-11a620733ba1-kube-api-access-46g5m\") pod \"metrics-server-59bcd769fc-52sq2\" (UID: \"8a74d5e0-5721-4316-a286-11a620733ba1\") " pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2"
Apr 20 17:51:26.918331 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:26.917874 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2"
Apr 20 17:51:27.076826 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:27.076722 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-59bcd769fc-52sq2"]
Apr 20 17:51:27.079548 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:51:27.079508 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a74d5e0_5721_4316_a286_11a620733ba1.slice/crio-aa9e04390f76ae14e38dce27fde9ad97f20a882bf8696cbcc10d7eb40a16166f WatchSource:0}: Error finding container aa9e04390f76ae14e38dce27fde9ad97f20a882bf8696cbcc10d7eb40a16166f: Status 404 returned error can't find the container with id aa9e04390f76ae14e38dce27fde9ad97f20a882bf8696cbcc10d7eb40a16166f
Apr 20 17:51:27.237607 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:27.237475 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lbj72" event={"ID":"2f24fb25-fcc9-42ab-a59b-fc3368b09772","Type":"ContainerStarted","Data":"44c4e5cb96feda6d9a39d2e6eec2506e64f426bf68a63079695ab7b2948a54bf"}
Apr 20 17:51:27.237607 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:27.237528 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lbj72" event={"ID":"2f24fb25-fcc9-42ab-a59b-fc3368b09772","Type":"ContainerStarted","Data":"2b7e7bbbf9656542267545c4e09ba9b3a4132f944a7935f443d13018b8717712"}
Apr 20 17:51:27.239284 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:27.239214 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2" event={"ID":"8a74d5e0-5721-4316-a286-11a620733ba1","Type":"ContainerStarted","Data":"aa9e04390f76ae14e38dce27fde9ad97f20a882bf8696cbcc10d7eb40a16166f"}
Apr 20 17:51:27.257279 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:27.257215 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-lbj72" podStartSLOduration=4.155869667 podStartE2EDuration="5.257197242s" podCreationTimestamp="2026-04-20 17:51:22 +0000 UTC" firstStartedPulling="2026-04-20 17:51:24.070714529 +0000 UTC m=+185.856069761" lastFinishedPulling="2026-04-20 17:51:25.172042106 +0000 UTC m=+186.957397336" observedRunningTime="2026-04-20 17:51:27.255533339 +0000 UTC m=+189.040888590" watchObservedRunningTime="2026-04-20 17:51:27.257197242 +0000 UTC m=+189.042552491"
Apr 20 17:51:28.246965 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:28.246919 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd5c0166-30aa-4c85-84ad-28d9a7223bf1","Type":"ContainerStarted","Data":"5d2c2d2ae8c6b60aced4bb0154e81a06dbb0b8cae35c4c440ff099cb5905ba5d"}
Apr 20 17:51:28.247427 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:28.246973 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd5c0166-30aa-4c85-84ad-28d9a7223bf1","Type":"ContainerStarted","Data":"7874aff10a607af660bc5f6e50b16ad13abbf2990b2e2cc5dbdf07e95c74cbd5"}
Apr 20 17:51:28.247427 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:28.247005 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd5c0166-30aa-4c85-84ad-28d9a7223bf1","Type":"ContainerStarted","Data":"bc2cb2f181dc769c9de5dd9e1dc51a73eb37fe24796f167953e33bd9d63083ab"}
Apr 20 17:51:28.247427 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:28.247023 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd5c0166-30aa-4c85-84ad-28d9a7223bf1","Type":"ContainerStarted","Data":"1ffc34fbed5864615a124516be84f7b302edc78a20f7015d87eb3b291470e0ae"}
Apr 20 17:51:28.247427 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:28.247036 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd5c0166-30aa-4c85-84ad-28d9a7223bf1","Type":"ContainerStarted","Data":"b3710f6d59a642baed1b632fd960de63bcdb5d666ea65457ca73a0ae36a9fd82"}
Apr 20 17:51:29.252671 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:29.252636 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2" event={"ID":"8a74d5e0-5721-4316-a286-11a620733ba1","Type":"ContainerStarted","Data":"19dedafd0f3e6a909af9341b1179f9ac9f2eaf845442cefef4d1872102057caf"}
Apr 20 17:51:29.274491 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:29.274444 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2" podStartSLOduration=1.275279792 podStartE2EDuration="3.27442895s" podCreationTimestamp="2026-04-20 17:51:26 +0000 UTC" firstStartedPulling="2026-04-20 17:51:27.082285365 +0000 UTC m=+188.867640583" lastFinishedPulling="2026-04-20 17:51:29.081434508 +0000 UTC m=+190.866789741" observedRunningTime="2026-04-20 17:51:29.272751492 +0000 UTC m=+191.058106744" watchObservedRunningTime="2026-04-20 17:51:29.27442895 +0000 UTC m=+191.059784238"
Apr 20 17:51:30.259593 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:30.259555 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd5c0166-30aa-4c85-84ad-28d9a7223bf1","Type":"ContainerStarted","Data":"dccd87944f467ddb5529cfcedb8edff647919465ae664588de2062994aa283cb"}
Apr 20 17:51:30.287431 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:30.287378 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.776209219 podStartE2EDuration="7.287363718s" podCreationTimestamp="2026-04-20 17:51:23 +0000 UTC" firstStartedPulling="2026-04-20 17:51:23.958206092 +0000 UTC m=+185.743561309" lastFinishedPulling="2026-04-20 17:51:29.469360589 +0000 UTC m=+191.254715808" observedRunningTime="2026-04-20 17:51:30.2853277 +0000 UTC m=+192.070682942" watchObservedRunningTime="2026-04-20 17:51:30.287363718 +0000 UTC m=+192.072718957"
Apr 20 17:51:32.107055 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.106979 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" podUID="fef058a6-f6d9-40db-91c2-06542c24608b" containerName="registry" containerID="cri-o://6c5585d3eea7da1b7a67510dd6b44e9c2af312a2b7ced2fa671b9d404c2b94f9" gracePeriod=30
Apr 20 17:51:32.171034 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.170963 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66cbfb5c77-pz5kk"]
Apr 20 17:51:32.174478 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.174446 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.178436 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.178406 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 20 17:51:32.178436 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.178429 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 20 17:51:32.178584 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.178452 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-vttkl\""
Apr 20 17:51:32.178584 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.178460 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 20 17:51:32.178584 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.178535 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 20 17:51:32.178818 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.178801 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 20 17:51:32.184820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.184798 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66cbfb5c77-pz5kk"]
Apr 20 17:51:32.269274 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.269230 2577 generic.go:358] "Generic (PLEG): container finished" podID="fef058a6-f6d9-40db-91c2-06542c24608b" containerID="6c5585d3eea7da1b7a67510dd6b44e9c2af312a2b7ced2fa671b9d404c2b94f9" exitCode=0
Apr 20 17:51:32.269445 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.269300 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" event={"ID":"fef058a6-f6d9-40db-91c2-06542c24608b","Type":"ContainerDied","Data":"6c5585d3eea7da1b7a67510dd6b44e9c2af312a2b7ced2fa671b9d404c2b94f9"}
Apr 20 17:51:32.347590 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.347554 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5070e42-5591-4136-8a46-32414bbab297-service-ca\") pod \"console-66cbfb5c77-pz5kk\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.347801 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.347658 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e5070e42-5591-4136-8a46-32414bbab297-oauth-serving-cert\") pod \"console-66cbfb5c77-pz5kk\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.347801 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.347701 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e5070e42-5591-4136-8a46-32414bbab297-console-config\") pod \"console-66cbfb5c77-pz5kk\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.347976 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.347802 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e5070e42-5591-4136-8a46-32414bbab297-console-oauth-config\") pod \"console-66cbfb5c77-pz5kk\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.347976 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.347856 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqjnz\" (UniqueName: \"kubernetes.io/projected/e5070e42-5591-4136-8a46-32414bbab297-kube-api-access-dqjnz\") pod \"console-66cbfb5c77-pz5kk\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.347976 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.347905 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5070e42-5591-4136-8a46-32414bbab297-console-serving-cert\") pod \"console-66cbfb5c77-pz5kk\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.369832 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.369773 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s"
Apr 20 17:51:32.449151 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.449117 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e5070e42-5591-4136-8a46-32414bbab297-oauth-serving-cert\") pod \"console-66cbfb5c77-pz5kk\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.449347 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.449157 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e5070e42-5591-4136-8a46-32414bbab297-console-config\") pod \"console-66cbfb5c77-pz5kk\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.449347 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.449216 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e5070e42-5591-4136-8a46-32414bbab297-console-oauth-config\") pod \"console-66cbfb5c77-pz5kk\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.449347 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.449260 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqjnz\" (UniqueName: \"kubernetes.io/projected/e5070e42-5591-4136-8a46-32414bbab297-kube-api-access-dqjnz\") pod \"console-66cbfb5c77-pz5kk\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.449347 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.449295 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5070e42-5591-4136-8a46-32414bbab297-console-serving-cert\") pod \"console-66cbfb5c77-pz5kk\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.449479 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.449371 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5070e42-5591-4136-8a46-32414bbab297-service-ca\") pod \"console-66cbfb5c77-pz5kk\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.449952 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.449921 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e5070e42-5591-4136-8a46-32414bbab297-oauth-serving-cert\") pod \"console-66cbfb5c77-pz5kk\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.450085 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.449968 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e5070e42-5591-4136-8a46-32414bbab297-console-config\") pod \"console-66cbfb5c77-pz5kk\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.450085 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.450007 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5070e42-5591-4136-8a46-32414bbab297-service-ca\") pod \"console-66cbfb5c77-pz5kk\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.452212 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.452186 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5070e42-5591-4136-8a46-32414bbab297-console-serving-cert\") pod \"console-66cbfb5c77-pz5kk\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.452328 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.452312 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e5070e42-5591-4136-8a46-32414bbab297-console-oauth-config\") pod \"console-66cbfb5c77-pz5kk\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.457426 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.457398 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqjnz\" (UniqueName: \"kubernetes.io/projected/e5070e42-5591-4136-8a46-32414bbab297-kube-api-access-dqjnz\") pod \"console-66cbfb5c77-pz5kk\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.488351 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.488318 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66cbfb5c77-pz5kk"
Apr 20 17:51:32.550429 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.550362 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fef058a6-f6d9-40db-91c2-06542c24608b-installation-pull-secrets\") pod \"fef058a6-f6d9-40db-91c2-06542c24608b\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") "
Apr 20 17:51:32.550429 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.550426 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fef058a6-f6d9-40db-91c2-06542c24608b-registry-certificates\") pod \"fef058a6-f6d9-40db-91c2-06542c24608b\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") "
Apr 20 17:51:32.550649 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.550490 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fef058a6-f6d9-40db-91c2-06542c24608b-image-registry-private-configuration\") pod \"fef058a6-f6d9-40db-91c2-06542c24608b\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") "
Apr 20 17:51:32.550649 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.550529 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pt2v\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-kube-api-access-2pt2v\") pod \"fef058a6-f6d9-40db-91c2-06542c24608b\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") "
Apr 20 17:51:32.550649 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.550557 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-bound-sa-token\") pod \"fef058a6-f6d9-40db-91c2-06542c24608b\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") "
Apr 20 17:51:32.550649 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.550587 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fef058a6-f6d9-40db-91c2-06542c24608b-trusted-ca\") pod \"fef058a6-f6d9-40db-91c2-06542c24608b\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") "
Apr 20 17:51:32.550649 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.550613 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls\") pod \"fef058a6-f6d9-40db-91c2-06542c24608b\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") "
Apr 20 17:51:32.550873 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.550669 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fef058a6-f6d9-40db-91c2-06542c24608b-ca-trust-extracted\") pod \"fef058a6-f6d9-40db-91c2-06542c24608b\" (UID: \"fef058a6-f6d9-40db-91c2-06542c24608b\") "
Apr 20 17:51:32.551270 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.551211 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fef058a6-f6d9-40db-91c2-06542c24608b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fef058a6-f6d9-40db-91c2-06542c24608b" (UID: "fef058a6-f6d9-40db-91c2-06542c24608b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 17:51:32.551270 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.551239 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fef058a6-f6d9-40db-91c2-06542c24608b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fef058a6-f6d9-40db-91c2-06542c24608b" (UID: "fef058a6-f6d9-40db-91c2-06542c24608b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 17:51:32.554215 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.554184 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fef058a6-f6d9-40db-91c2-06542c24608b-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "fef058a6-f6d9-40db-91c2-06542c24608b" (UID: "fef058a6-f6d9-40db-91c2-06542c24608b"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 17:51:32.556283 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.556243 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fef058a6-f6d9-40db-91c2-06542c24608b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fef058a6-f6d9-40db-91c2-06542c24608b" (UID: "fef058a6-f6d9-40db-91c2-06542c24608b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 17:51:32.556513 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.556465 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fef058a6-f6d9-40db-91c2-06542c24608b" (UID: "fef058a6-f6d9-40db-91c2-06542c24608b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 17:51:32.556513 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.556475 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fef058a6-f6d9-40db-91c2-06542c24608b" (UID: "fef058a6-f6d9-40db-91c2-06542c24608b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 17:51:32.559457 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.559414 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-kube-api-access-2pt2v" (OuterVolumeSpecName: "kube-api-access-2pt2v") pod "fef058a6-f6d9-40db-91c2-06542c24608b" (UID: "fef058a6-f6d9-40db-91c2-06542c24608b"). InnerVolumeSpecName "kube-api-access-2pt2v". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 17:51:32.563192 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.563145 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fef058a6-f6d9-40db-91c2-06542c24608b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fef058a6-f6d9-40db-91c2-06542c24608b" (UID: "fef058a6-f6d9-40db-91c2-06542c24608b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 17:51:32.634540 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.634461 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66cbfb5c77-pz5kk"]
Apr 20 17:51:32.637779 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:51:32.637747 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5070e42_5591_4136_8a46_32414bbab297.slice/crio-1cfbdba3ae2dff80e92cfa599b4d4b97b8ac86e577a8e9a5c345e594a70bd425 WatchSource:0}: Error finding container 1cfbdba3ae2dff80e92cfa599b4d4b97b8ac86e577a8e9a5c345e594a70bd425: Status 404 returned error can't find the container with id 1cfbdba3ae2dff80e92cfa599b4d4b97b8ac86e577a8e9a5c345e594a70bd425
Apr 20 17:51:32.651385 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.651362 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fef058a6-f6d9-40db-91c2-06542c24608b-installation-pull-secrets\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:51:32.651578 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.651389 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fef058a6-f6d9-40db-91c2-06542c24608b-registry-certificates\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:51:32.651578 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.651407 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fef058a6-f6d9-40db-91c2-06542c24608b-image-registry-private-configuration\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:51:32.651578 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.651422 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2pt2v\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-kube-api-access-2pt2v\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:51:32.651578 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.651435 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-bound-sa-token\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:51:32.651578 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.651448 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fef058a6-f6d9-40db-91c2-06542c24608b-trusted-ca\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:51:32.651578 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.651460 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fef058a6-f6d9-40db-91c2-06542c24608b-registry-tls\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:51:32.651578 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:32.651473 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fef058a6-f6d9-40db-91c2-06542c24608b-ca-trust-extracted\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:51:33.274780 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:33.274738 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66cbfb5c77-pz5kk" event={"ID":"e5070e42-5591-4136-8a46-32414bbab297","Type":"ContainerStarted","Data":"1cfbdba3ae2dff80e92cfa599b4d4b97b8ac86e577a8e9a5c345e594a70bd425"}
Apr 20 17:51:33.276126 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:33.276096 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s" event={"ID":"fef058a6-f6d9-40db-91c2-06542c24608b","Type":"ContainerDied","Data":"eb2c979530d5f9cadb7d3f80bfc819d3dc920aab81b512d533453a812ced1007"}
Apr 20 17:51:33.276425 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:33.276144 2577 scope.go:117] "RemoveContainer" containerID="6c5585d3eea7da1b7a67510dd6b44e9c2af312a2b7ced2fa671b9d404c2b94f9"
Apr 20 17:51:33.276425 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:33.276295 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s"
Apr 20 17:51:33.293751 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:33.293727 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s"]
Apr 20 17:51:33.297392 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:33.297367 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6bdc4cb5c9-tnj8s"]
Apr 20 17:51:34.695111 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:34.695059 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fef058a6-f6d9-40db-91c2-06542c24608b" path="/var/lib/kubelet/pods/fef058a6-f6d9-40db-91c2-06542c24608b/volumes"
Apr 20 17:51:35.761422 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.761390 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-f66cdd75b-lwp8g"]
Apr 20 17:51:35.761821 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.761755 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fef058a6-f6d9-40db-91c2-06542c24608b" containerName="registry"
Apr 20 17:51:35.761821 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.761782 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="fef058a6-f6d9-40db-91c2-06542c24608b" containerName="registry"
Apr 20 17:51:35.761913 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.761884 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="fef058a6-f6d9-40db-91c2-06542c24608b" containerName="registry"
Apr 20 17:51:35.766212 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.766186 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f66cdd75b-lwp8g"
Apr 20 17:51:35.774028 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.774002 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 20 17:51:35.775070 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.774735 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f66cdd75b-lwp8g"]
Apr 20 17:51:35.885374 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.885337 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/acb369d6-c0db-4b6a-bb34-17bc911f2932-console-serving-cert\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g"
Apr 20 17:51:35.885374 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.885375 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-service-ca\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g"
Apr 20 17:51:35.885611 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.885567 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-console-config\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g"
Apr 20 17:51:35.885611 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.885607 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-trusted-ca-bundle\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g"
Apr 20 17:51:35.885719 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.885661 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-oauth-serving-cert\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g"
Apr 20 17:51:35.885813 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.885790 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snsnw\" (UniqueName: \"kubernetes.io/projected/acb369d6-c0db-4b6a-bb34-17bc911f2932-kube-api-access-snsnw\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g"
Apr 20 17:51:35.885878 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.885831 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/acb369d6-c0db-4b6a-bb34-17bc911f2932-console-oauth-config\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g"
Apr 20 17:51:35.986303 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.986268 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snsnw\" (UniqueName: \"kubernetes.io/projected/acb369d6-c0db-4b6a-bb34-17bc911f2932-kube-api-access-snsnw\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g"
Apr 20 17:51:35.986303 ip-10-0-135-49 kubenswrapper[2577]: 
I0420 17:51:35.986300 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/acb369d6-c0db-4b6a-bb34-17bc911f2932-console-oauth-config\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g" Apr 20 17:51:35.986516 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.986447 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/acb369d6-c0db-4b6a-bb34-17bc911f2932-console-serving-cert\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g" Apr 20 17:51:35.986516 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.986496 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-service-ca\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g" Apr 20 17:51:35.986630 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.986585 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-console-config\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g" Apr 20 17:51:35.986630 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.986615 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-trusted-ca-bundle\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g" Apr 20 
17:51:35.986732 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.986655 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-oauth-serving-cert\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g" Apr 20 17:51:35.987363 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.987314 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-service-ca\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g" Apr 20 17:51:35.987471 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.987409 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-console-config\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g" Apr 20 17:51:35.987587 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.987563 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-oauth-serving-cert\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g" Apr 20 17:51:35.987587 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.987578 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-trusted-ca-bundle\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g" 
Apr 20 17:51:35.989289 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.989269 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/acb369d6-c0db-4b6a-bb34-17bc911f2932-console-oauth-config\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g" Apr 20 17:51:35.989441 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.989411 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/acb369d6-c0db-4b6a-bb34-17bc911f2932-console-serving-cert\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g" Apr 20 17:51:35.994425 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:35.994399 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snsnw\" (UniqueName: \"kubernetes.io/projected/acb369d6-c0db-4b6a-bb34-17bc911f2932-kube-api-access-snsnw\") pod \"console-f66cdd75b-lwp8g\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") " pod="openshift-console/console-f66cdd75b-lwp8g" Apr 20 17:51:36.078344 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:36.078260 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f66cdd75b-lwp8g" Apr 20 17:51:39.931676 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:39.931646 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f66cdd75b-lwp8g"] Apr 20 17:51:39.935101 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:51:39.935073 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacb369d6_c0db_4b6a_bb34_17bc911f2932.slice/crio-df036202315bac546577017d6e044241a11b663e893d46b8522ee8087452106c WatchSource:0}: Error finding container df036202315bac546577017d6e044241a11b663e893d46b8522ee8087452106c: Status 404 returned error can't find the container with id df036202315bac546577017d6e044241a11b663e893d46b8522ee8087452106c Apr 20 17:51:40.302692 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:40.302634 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-qc6vc" event={"ID":"6e1318cc-1767-485b-b8cc-a2fbce6fcf9a","Type":"ContainerStarted","Data":"356c185fe7dd79a754fdfa162217ea9de67d50b6cf89c19d7bb1ee326554d257"} Apr 20 17:51:40.303077 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:40.303036 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-qc6vc" Apr 20 17:51:40.305815 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:40.305784 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f66cdd75b-lwp8g" event={"ID":"acb369d6-c0db-4b6a-bb34-17bc911f2932","Type":"ContainerStarted","Data":"df036202315bac546577017d6e044241a11b663e893d46b8522ee8087452106c"} Apr 20 17:51:40.319880 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:40.319852 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-qc6vc" Apr 20 17:51:40.322437 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:40.322384 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-qc6vc" podStartSLOduration=1.365233551 podStartE2EDuration="18.322367071s" podCreationTimestamp="2026-04-20 17:51:22 +0000 UTC" firstStartedPulling="2026-04-20 17:51:22.908791889 +0000 UTC m=+184.694147109" lastFinishedPulling="2026-04-20 17:51:39.865925412 +0000 UTC m=+201.651280629" observedRunningTime="2026-04-20 17:51:40.32050121 +0000 UTC m=+202.105856449" watchObservedRunningTime="2026-04-20 17:51:40.322367071 +0000 UTC m=+202.107722311" Apr 20 17:51:44.321352 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:44.321260 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f66cdd75b-lwp8g" event={"ID":"acb369d6-c0db-4b6a-bb34-17bc911f2932","Type":"ContainerStarted","Data":"6a98a747425dd7b43f6782ec204f968447cb14e7f365c255526b343590676132"} Apr 20 17:51:44.322848 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:44.322805 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66cbfb5c77-pz5kk" event={"ID":"e5070e42-5591-4136-8a46-32414bbab297","Type":"ContainerStarted","Data":"2df5c94d98dcd5d8e231a4f3432df2b5674c2aa2b5244c4a9c1fbe7302bb5d17"} Apr 20 17:51:44.344340 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:44.344287 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f66cdd75b-lwp8g" podStartSLOduration=5.419247162 podStartE2EDuration="9.344271652s" podCreationTimestamp="2026-04-20 17:51:35 +0000 UTC" firstStartedPulling="2026-04-20 17:51:39.937378777 +0000 UTC m=+201.722733995" lastFinishedPulling="2026-04-20 17:51:43.862403255 +0000 UTC m=+205.647758485" observedRunningTime="2026-04-20 17:51:44.34237128 +0000 UTC m=+206.127726535" watchObservedRunningTime="2026-04-20 17:51:44.344271652 +0000 UTC m=+206.129626892" Apr 20 17:51:44.365149 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:44.365096 2577 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-console/console-66cbfb5c77-pz5kk" podStartSLOduration=1.149782727 podStartE2EDuration="12.365078887s" podCreationTimestamp="2026-04-20 17:51:32 +0000 UTC" firstStartedPulling="2026-04-20 17:51:32.639897084 +0000 UTC m=+194.425252301" lastFinishedPulling="2026-04-20 17:51:43.855193242 +0000 UTC m=+205.640548461" observedRunningTime="2026-04-20 17:51:44.363664804 +0000 UTC m=+206.149020048" watchObservedRunningTime="2026-04-20 17:51:44.365078887 +0000 UTC m=+206.150434125" Apr 20 17:51:46.079149 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:46.079110 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-f66cdd75b-lwp8g" Apr 20 17:51:46.079602 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:46.079160 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f66cdd75b-lwp8g" Apr 20 17:51:46.084679 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:46.084648 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f66cdd75b-lwp8g" Apr 20 17:51:46.333865 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:46.333784 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f66cdd75b-lwp8g" Apr 20 17:51:46.380099 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:46.380065 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66cbfb5c77-pz5kk"] Apr 20 17:51:46.918841 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:46.918804 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2" Apr 20 17:51:46.919032 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:51:46.918853 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2" Apr 20 17:51:52.488670 ip-10-0-135-49 
kubenswrapper[2577]: I0420 17:51:52.488635 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66cbfb5c77-pz5kk" Apr 20 17:52:03.379216 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:03.379129 2577 generic.go:358] "Generic (PLEG): container finished" podID="b09c964e-a54b-444a-bc53-320e8e6cabe7" containerID="e6fbf7ccf46668d267644de3b02769c869678947a4d3ed6984f994f5a1c06264" exitCode=0 Apr 20 17:52:03.379216 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:03.379182 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds" event={"ID":"b09c964e-a54b-444a-bc53-320e8e6cabe7","Type":"ContainerDied","Data":"e6fbf7ccf46668d267644de3b02769c869678947a4d3ed6984f994f5a1c06264"} Apr 20 17:52:03.379614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:03.379476 2577 scope.go:117] "RemoveContainer" containerID="e6fbf7ccf46668d267644de3b02769c869678947a4d3ed6984f994f5a1c06264" Apr 20 17:52:04.383741 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:04.383707 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kpmds" event={"ID":"b09c964e-a54b-444a-bc53-320e8e6cabe7","Type":"ContainerStarted","Data":"cba409141d0d99761d4118ad375a9f296887075e303343e4b6906b0040064e0f"} Apr 20 17:52:06.924292 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:06.924262 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2" Apr 20 17:52:06.928859 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:06.928837 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-59bcd769fc-52sq2" Apr 20 17:52:08.396056 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:08.396026 2577 generic.go:358] "Generic (PLEG): container finished" podID="f2230508-9373-4dc2-b94b-53c90c805046" 
containerID="f41c6d0d08e92cf91ad48c6c6f5762889aa1149f801b31c2bae2e83525937da7" exitCode=0 Apr 20 17:52:08.396433 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:08.396096 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5m2k9" event={"ID":"f2230508-9373-4dc2-b94b-53c90c805046","Type":"ContainerDied","Data":"f41c6d0d08e92cf91ad48c6c6f5762889aa1149f801b31c2bae2e83525937da7"} Apr 20 17:52:08.396433 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:08.396408 2577 scope.go:117] "RemoveContainer" containerID="f41c6d0d08e92cf91ad48c6c6f5762889aa1149f801b31c2bae2e83525937da7" Apr 20 17:52:09.400812 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:09.400780 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5m2k9" event={"ID":"f2230508-9373-4dc2-b94b-53c90c805046","Type":"ContainerStarted","Data":"43b6d9c12729415a16ee9c7f2412116aa53a8e9e8c255ecd4f51585f3f02c994"} Apr 20 17:52:11.402898 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.402837 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-66cbfb5c77-pz5kk" podUID="e5070e42-5591-4136-8a46-32414bbab297" containerName="console" containerID="cri-o://2df5c94d98dcd5d8e231a4f3432df2b5674c2aa2b5244c4a9c1fbe7302bb5d17" gracePeriod=15 Apr 20 17:52:11.657047 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.656972 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66cbfb5c77-pz5kk_e5070e42-5591-4136-8a46-32414bbab297/console/0.log" Apr 20 17:52:11.657152 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.657051 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66cbfb5c77-pz5kk" Apr 20 17:52:11.818383 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.818354 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5070e42-5591-4136-8a46-32414bbab297-console-serving-cert\") pod \"e5070e42-5591-4136-8a46-32414bbab297\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " Apr 20 17:52:11.818620 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.818409 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqjnz\" (UniqueName: \"kubernetes.io/projected/e5070e42-5591-4136-8a46-32414bbab297-kube-api-access-dqjnz\") pod \"e5070e42-5591-4136-8a46-32414bbab297\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " Apr 20 17:52:11.818620 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.818540 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e5070e42-5591-4136-8a46-32414bbab297-console-config\") pod \"e5070e42-5591-4136-8a46-32414bbab297\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " Apr 20 17:52:11.818620 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.818596 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5070e42-5591-4136-8a46-32414bbab297-service-ca\") pod \"e5070e42-5591-4136-8a46-32414bbab297\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " Apr 20 17:52:11.818800 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.818621 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e5070e42-5591-4136-8a46-32414bbab297-oauth-serving-cert\") pod \"e5070e42-5591-4136-8a46-32414bbab297\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " Apr 20 17:52:11.818800 ip-10-0-135-49 
kubenswrapper[2577]: I0420 17:52:11.818722 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e5070e42-5591-4136-8a46-32414bbab297-console-oauth-config\") pod \"e5070e42-5591-4136-8a46-32414bbab297\" (UID: \"e5070e42-5591-4136-8a46-32414bbab297\") " Apr 20 17:52:11.819039 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.819013 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5070e42-5591-4136-8a46-32414bbab297-console-config" (OuterVolumeSpecName: "console-config") pod "e5070e42-5591-4136-8a46-32414bbab297" (UID: "e5070e42-5591-4136-8a46-32414bbab297"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 17:52:11.819039 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.819024 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5070e42-5591-4136-8a46-32414bbab297-service-ca" (OuterVolumeSpecName: "service-ca") pod "e5070e42-5591-4136-8a46-32414bbab297" (UID: "e5070e42-5591-4136-8a46-32414bbab297"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 17:52:11.819193 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.819124 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5070e42-5591-4136-8a46-32414bbab297-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e5070e42-5591-4136-8a46-32414bbab297" (UID: "e5070e42-5591-4136-8a46-32414bbab297"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 17:52:11.820806 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.820778 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5070e42-5591-4136-8a46-32414bbab297-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e5070e42-5591-4136-8a46-32414bbab297" (UID: "e5070e42-5591-4136-8a46-32414bbab297"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 17:52:11.821069 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.821042 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5070e42-5591-4136-8a46-32414bbab297-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e5070e42-5591-4136-8a46-32414bbab297" (UID: "e5070e42-5591-4136-8a46-32414bbab297"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 17:52:11.821159 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.821130 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5070e42-5591-4136-8a46-32414bbab297-kube-api-access-dqjnz" (OuterVolumeSpecName: "kube-api-access-dqjnz") pod "e5070e42-5591-4136-8a46-32414bbab297" (UID: "e5070e42-5591-4136-8a46-32414bbab297"). InnerVolumeSpecName "kube-api-access-dqjnz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:52:11.920243 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.920162 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e5070e42-5591-4136-8a46-32414bbab297-console-config\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:52:11.920243 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.920192 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5070e42-5591-4136-8a46-32414bbab297-service-ca\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:52:11.920243 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.920202 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e5070e42-5591-4136-8a46-32414bbab297-oauth-serving-cert\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:52:11.920243 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.920211 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e5070e42-5591-4136-8a46-32414bbab297-console-oauth-config\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:52:11.920243 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.920222 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5070e42-5591-4136-8a46-32414bbab297-console-serving-cert\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:52:11.920243 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:11.920230 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dqjnz\" (UniqueName: \"kubernetes.io/projected/e5070e42-5591-4136-8a46-32414bbab297-kube-api-access-dqjnz\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:52:12.414814 ip-10-0-135-49 
kubenswrapper[2577]: I0420 17:52:12.414787 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66cbfb5c77-pz5kk_e5070e42-5591-4136-8a46-32414bbab297/console/0.log" Apr 20 17:52:12.415245 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:12.414832 2577 generic.go:358] "Generic (PLEG): container finished" podID="e5070e42-5591-4136-8a46-32414bbab297" containerID="2df5c94d98dcd5d8e231a4f3432df2b5674c2aa2b5244c4a9c1fbe7302bb5d17" exitCode=2 Apr 20 17:52:12.415245 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:12.414866 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66cbfb5c77-pz5kk" event={"ID":"e5070e42-5591-4136-8a46-32414bbab297","Type":"ContainerDied","Data":"2df5c94d98dcd5d8e231a4f3432df2b5674c2aa2b5244c4a9c1fbe7302bb5d17"} Apr 20 17:52:12.415245 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:12.414894 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66cbfb5c77-pz5kk" event={"ID":"e5070e42-5591-4136-8a46-32414bbab297","Type":"ContainerDied","Data":"1cfbdba3ae2dff80e92cfa599b4d4b97b8ac86e577a8e9a5c345e594a70bd425"} Apr 20 17:52:12.415245 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:12.414899 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66cbfb5c77-pz5kk" Apr 20 17:52:12.415245 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:12.414918 2577 scope.go:117] "RemoveContainer" containerID="2df5c94d98dcd5d8e231a4f3432df2b5674c2aa2b5244c4a9c1fbe7302bb5d17" Apr 20 17:52:12.423106 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:12.423085 2577 scope.go:117] "RemoveContainer" containerID="2df5c94d98dcd5d8e231a4f3432df2b5674c2aa2b5244c4a9c1fbe7302bb5d17" Apr 20 17:52:12.423395 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:52:12.423373 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2df5c94d98dcd5d8e231a4f3432df2b5674c2aa2b5244c4a9c1fbe7302bb5d17\": container with ID starting with 2df5c94d98dcd5d8e231a4f3432df2b5674c2aa2b5244c4a9c1fbe7302bb5d17 not found: ID does not exist" containerID="2df5c94d98dcd5d8e231a4f3432df2b5674c2aa2b5244c4a9c1fbe7302bb5d17" Apr 20 17:52:12.423446 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:12.423407 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2df5c94d98dcd5d8e231a4f3432df2b5674c2aa2b5244c4a9c1fbe7302bb5d17"} err="failed to get container status \"2df5c94d98dcd5d8e231a4f3432df2b5674c2aa2b5244c4a9c1fbe7302bb5d17\": rpc error: code = NotFound desc = could not find container \"2df5c94d98dcd5d8e231a4f3432df2b5674c2aa2b5244c4a9c1fbe7302bb5d17\": container with ID starting with 2df5c94d98dcd5d8e231a4f3432df2b5674c2aa2b5244c4a9c1fbe7302bb5d17 not found: ID does not exist" Apr 20 17:52:12.436063 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:12.435976 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66cbfb5c77-pz5kk"] Apr 20 17:52:12.441361 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:12.441337 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66cbfb5c77-pz5kk"] Apr 20 17:52:12.695129 ip-10-0-135-49 kubenswrapper[2577]: I0420 
17:52:12.695051 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5070e42-5591-4136-8a46-32414bbab297" path="/var/lib/kubelet/pods/e5070e42-5591-4136-8a46-32414bbab297/volumes"
Apr 20 17:52:30.582727 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:30.582684 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs\") pod \"network-metrics-daemon-skq27\" (UID: \"14ea9252-57e5-4e09-9c9e-33d96e94d56f\") " pod="openshift-multus/network-metrics-daemon-skq27"
Apr 20 17:52:30.585014 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:30.584995 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14ea9252-57e5-4e09-9c9e-33d96e94d56f-metrics-certs\") pod \"network-metrics-daemon-skq27\" (UID: \"14ea9252-57e5-4e09-9c9e-33d96e94d56f\") " pod="openshift-multus/network-metrics-daemon-skq27"
Apr 20 17:52:30.595045 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:30.595022 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ddm2v\""
Apr 20 17:52:30.602829 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:30.602812 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-skq27"
Apr 20 17:52:30.723015 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:30.722951 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-skq27"]
Apr 20 17:52:30.726997 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:52:30.726951 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14ea9252_57e5_4e09_9c9e_33d96e94d56f.slice/crio-26cc35e7a230d79d84839f640e7c6844a3e83b6276c7a3bff29634ef04456eb9 WatchSource:0}: Error finding container 26cc35e7a230d79d84839f640e7c6844a3e83b6276c7a3bff29634ef04456eb9: Status 404 returned error can't find the container with id 26cc35e7a230d79d84839f640e7c6844a3e83b6276c7a3bff29634ef04456eb9
Apr 20 17:52:31.477276 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:31.477230 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-skq27" event={"ID":"14ea9252-57e5-4e09-9c9e-33d96e94d56f","Type":"ContainerStarted","Data":"26cc35e7a230d79d84839f640e7c6844a3e83b6276c7a3bff29634ef04456eb9"}
Apr 20 17:52:32.481628 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:32.481552 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-skq27" event={"ID":"14ea9252-57e5-4e09-9c9e-33d96e94d56f","Type":"ContainerStarted","Data":"4b78a0310aef9d045cd7837cb6f166808fdad23e46643755ba27798d3ea6efa2"}
Apr 20 17:52:32.481628 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:32.481589 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-skq27" event={"ID":"14ea9252-57e5-4e09-9c9e-33d96e94d56f","Type":"ContainerStarted","Data":"c5eeaad672536123d68410ce85a03e0658754f32004f257df3508d5b67f86f39"}
Apr 20 17:52:32.504363 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:32.504268 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-skq27" podStartSLOduration=253.565232466 podStartE2EDuration="4m14.504250811s" podCreationTimestamp="2026-04-20 17:48:18 +0000 UTC" firstStartedPulling="2026-04-20 17:52:30.728820147 +0000 UTC m=+252.514175380" lastFinishedPulling="2026-04-20 17:52:31.667838508 +0000 UTC m=+253.453193725" observedRunningTime="2026-04-20 17:52:32.504128351 +0000 UTC m=+254.289483589" watchObservedRunningTime="2026-04-20 17:52:32.504250811 +0000 UTC m=+254.289606061"
Apr 20 17:52:42.496763 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:42.496726 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 17:52:42.497305 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:42.497261 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="alertmanager" containerID="cri-o://b3710f6d59a642baed1b632fd960de63bcdb5d666ea65457ca73a0ae36a9fd82" gracePeriod=120
Apr 20 17:52:42.497368 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:42.497302 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="kube-rbac-proxy-metric" containerID="cri-o://5d2c2d2ae8c6b60aced4bb0154e81a06dbb0b8cae35c4c440ff099cb5905ba5d" gracePeriod=120
Apr 20 17:52:42.497419 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:42.497356 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="prom-label-proxy" containerID="cri-o://dccd87944f467ddb5529cfcedb8edff647919465ae664588de2062994aa283cb" gracePeriod=120
Apr 20 17:52:42.497419 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:42.497395 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="kube-rbac-proxy-web" containerID="cri-o://bc2cb2f181dc769c9de5dd9e1dc51a73eb37fe24796f167953e33bd9d63083ab" gracePeriod=120
Apr 20 17:52:42.497517 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:42.497424 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="config-reloader" containerID="cri-o://1ffc34fbed5864615a124516be84f7b302edc78a20f7015d87eb3b291470e0ae" gracePeriod=120
Apr 20 17:52:42.497517 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:42.497409 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="kube-rbac-proxy" containerID="cri-o://7874aff10a607af660bc5f6e50b16ad13abbf2990b2e2cc5dbdf07e95c74cbd5" gracePeriod=120
Apr 20 17:52:43.518757 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.518721 2577 generic.go:358] "Generic (PLEG): container finished" podID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerID="dccd87944f467ddb5529cfcedb8edff647919465ae664588de2062994aa283cb" exitCode=0
Apr 20 17:52:43.518757 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.518747 2577 generic.go:358] "Generic (PLEG): container finished" podID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerID="7874aff10a607af660bc5f6e50b16ad13abbf2990b2e2cc5dbdf07e95c74cbd5" exitCode=0
Apr 20 17:52:43.518757 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.518754 2577 generic.go:358] "Generic (PLEG): container finished" podID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerID="1ffc34fbed5864615a124516be84f7b302edc78a20f7015d87eb3b291470e0ae" exitCode=0
Apr 20 17:52:43.518757 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.518760 2577 generic.go:358] "Generic (PLEG): container finished" podID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerID="b3710f6d59a642baed1b632fd960de63bcdb5d666ea65457ca73a0ae36a9fd82" exitCode=0
Apr 20 17:52:43.519262 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.518792 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd5c0166-30aa-4c85-84ad-28d9a7223bf1","Type":"ContainerDied","Data":"dccd87944f467ddb5529cfcedb8edff647919465ae664588de2062994aa283cb"}
Apr 20 17:52:43.519262 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.518825 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd5c0166-30aa-4c85-84ad-28d9a7223bf1","Type":"ContainerDied","Data":"7874aff10a607af660bc5f6e50b16ad13abbf2990b2e2cc5dbdf07e95c74cbd5"}
Apr 20 17:52:43.519262 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.518834 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd5c0166-30aa-4c85-84ad-28d9a7223bf1","Type":"ContainerDied","Data":"1ffc34fbed5864615a124516be84f7b302edc78a20f7015d87eb3b291470e0ae"}
Apr 20 17:52:43.519262 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.518846 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd5c0166-30aa-4c85-84ad-28d9a7223bf1","Type":"ContainerDied","Data":"b3710f6d59a642baed1b632fd960de63bcdb5d666ea65457ca73a0ae36a9fd82"}
Apr 20 17:52:43.734757 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.734732 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:43.785489 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.785406 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-alertmanager-trusted-ca-bundle\") pod \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") "
Apr 20 17:52:43.785489 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.785457 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-config-volume\") pod \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") "
Apr 20 17:52:43.785706 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.785514 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-kube-rbac-proxy\") pod \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") "
Apr 20 17:52:43.785706 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.785553 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-web-config\") pod \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") "
Apr 20 17:52:43.785706 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.785584 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg6h6\" (UniqueName: \"kubernetes.io/projected/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-kube-api-access-tg6h6\") pod \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") "
Apr 20 17:52:43.785706 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.785621 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-alertmanager-main-db\") pod \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") "
Apr 20 17:52:43.785706 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.785649 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-tls-assets\") pod \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") "
Apr 20 17:52:43.785706 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.785682 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") "
Apr 20 17:52:43.785706 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.785703 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-config-out\") pod \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") "
Apr 20 17:52:43.786097 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.785726 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-cluster-tls-config\") pod \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") "
Apr 20 17:52:43.786097 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.785744 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-kube-rbac-proxy-web\") pod \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") "
Apr 20 17:52:43.786097 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.785764 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-metrics-client-ca\") pod \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") "
Apr 20 17:52:43.786097 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.785796 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-main-tls\") pod \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\" (UID: \"dd5c0166-30aa-4c85-84ad-28d9a7223bf1\") "
Apr 20 17:52:43.786097 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.785839 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "dd5c0166-30aa-4c85-84ad-28d9a7223bf1" (UID: "dd5c0166-30aa-4c85-84ad-28d9a7223bf1"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 17:52:43.786097 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.786054 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:52:43.787207 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.787172 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "dd5c0166-30aa-4c85-84ad-28d9a7223bf1" (UID: "dd5c0166-30aa-4c85-84ad-28d9a7223bf1"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 17:52:43.788132 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.788083 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "dd5c0166-30aa-4c85-84ad-28d9a7223bf1" (UID: "dd5c0166-30aa-4c85-84ad-28d9a7223bf1"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 17:52:43.788675 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.788638 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "dd5c0166-30aa-4c85-84ad-28d9a7223bf1" (UID: "dd5c0166-30aa-4c85-84ad-28d9a7223bf1"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 17:52:43.788895 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.788860 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-config-volume" (OuterVolumeSpecName: "config-volume") pod "dd5c0166-30aa-4c85-84ad-28d9a7223bf1" (UID: "dd5c0166-30aa-4c85-84ad-28d9a7223bf1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 17:52:43.789670 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.789631 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "dd5c0166-30aa-4c85-84ad-28d9a7223bf1" (UID: "dd5c0166-30aa-4c85-84ad-28d9a7223bf1"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 17:52:43.790415 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.790389 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "dd5c0166-30aa-4c85-84ad-28d9a7223bf1" (UID: "dd5c0166-30aa-4c85-84ad-28d9a7223bf1"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 17:52:43.791907 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.791879 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "dd5c0166-30aa-4c85-84ad-28d9a7223bf1" (UID: "dd5c0166-30aa-4c85-84ad-28d9a7223bf1"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 17:52:43.792004 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.791891 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-config-out" (OuterVolumeSpecName: "config-out") pod "dd5c0166-30aa-4c85-84ad-28d9a7223bf1" (UID: "dd5c0166-30aa-4c85-84ad-28d9a7223bf1"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 17:52:43.792530 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.792510 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "dd5c0166-30aa-4c85-84ad-28d9a7223bf1" (UID: "dd5c0166-30aa-4c85-84ad-28d9a7223bf1"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 17:52:43.792594 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.792565 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-kube-api-access-tg6h6" (OuterVolumeSpecName: "kube-api-access-tg6h6") pod "dd5c0166-30aa-4c85-84ad-28d9a7223bf1" (UID: "dd5c0166-30aa-4c85-84ad-28d9a7223bf1"). InnerVolumeSpecName "kube-api-access-tg6h6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 17:52:43.794661 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.794635 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "dd5c0166-30aa-4c85-84ad-28d9a7223bf1" (UID: "dd5c0166-30aa-4c85-84ad-28d9a7223bf1"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 17:52:43.802321 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.802291 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-web-config" (OuterVolumeSpecName: "web-config") pod "dd5c0166-30aa-4c85-84ad-28d9a7223bf1" (UID: "dd5c0166-30aa-4c85-84ad-28d9a7223bf1"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 17:52:43.886646 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.886600 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-tls-assets\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:52:43.886646 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.886637 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:52:43.886646 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.886648 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-config-out\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:52:43.886646 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.886658 2577 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-cluster-tls-config\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:52:43.886913 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.886667 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:52:43.886913 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.886678 2577 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-metrics-client-ca\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:52:43.886913 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.886687 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-main-tls\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:52:43.886913 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.886702 2577 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-config-volume\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:52:43.886913 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.886711 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:52:43.886913 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.886721 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-web-config\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:52:43.886913 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.886730 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tg6h6\" (UniqueName: \"kubernetes.io/projected/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-kube-api-access-tg6h6\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:52:43.886913 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:43.886738 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/dd5c0166-30aa-4c85-84ad-28d9a7223bf1-alertmanager-main-db\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:52:44.525016 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.524968 2577 generic.go:358] "Generic (PLEG): container finished" podID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerID="5d2c2d2ae8c6b60aced4bb0154e81a06dbb0b8cae35c4c440ff099cb5905ba5d" exitCode=0
Apr 20 17:52:44.525016 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.525006 2577 generic.go:358] "Generic (PLEG): container finished" podID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerID="bc2cb2f181dc769c9de5dd9e1dc51a73eb37fe24796f167953e33bd9d63083ab" exitCode=0
Apr 20 17:52:44.525471 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.525059 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd5c0166-30aa-4c85-84ad-28d9a7223bf1","Type":"ContainerDied","Data":"5d2c2d2ae8c6b60aced4bb0154e81a06dbb0b8cae35c4c440ff099cb5905ba5d"}
Apr 20 17:52:44.525471 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.525085 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.525471 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.525105 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd5c0166-30aa-4c85-84ad-28d9a7223bf1","Type":"ContainerDied","Data":"bc2cb2f181dc769c9de5dd9e1dc51a73eb37fe24796f167953e33bd9d63083ab"}
Apr 20 17:52:44.525471 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.525117 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dd5c0166-30aa-4c85-84ad-28d9a7223bf1","Type":"ContainerDied","Data":"eccc7b9785ff358da15f9ad9a2595b400fd846194755eb34dae54af1f8caf3f4"}
Apr 20 17:52:44.525471 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.525133 2577 scope.go:117] "RemoveContainer" containerID="dccd87944f467ddb5529cfcedb8edff647919465ae664588de2062994aa283cb"
Apr 20 17:52:44.532617 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.532521 2577 scope.go:117] "RemoveContainer" containerID="5d2c2d2ae8c6b60aced4bb0154e81a06dbb0b8cae35c4c440ff099cb5905ba5d"
Apr 20 17:52:44.539233 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.539215 2577 scope.go:117] "RemoveContainer" containerID="7874aff10a607af660bc5f6e50b16ad13abbf2990b2e2cc5dbdf07e95c74cbd5"
Apr 20 17:52:44.545393 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.545379 2577 scope.go:117] "RemoveContainer" containerID="bc2cb2f181dc769c9de5dd9e1dc51a73eb37fe24796f167953e33bd9d63083ab"
Apr 20 17:52:44.554023 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.553422 2577 scope.go:117] "RemoveContainer" containerID="1ffc34fbed5864615a124516be84f7b302edc78a20f7015d87eb3b291470e0ae"
Apr 20 17:52:44.559588 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.559551 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 17:52:44.560771 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.560748 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 17:52:44.565179 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.565159 2577 scope.go:117] "RemoveContainer" containerID="b3710f6d59a642baed1b632fd960de63bcdb5d666ea65457ca73a0ae36a9fd82"
Apr 20 17:52:44.571770 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.571750 2577 scope.go:117] "RemoveContainer" containerID="5e7d081f6473c372549f7e10ca5d9b5d8d9dd723708d04c524e73b202ba09746"
Apr 20 17:52:44.578248 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.578233 2577 scope.go:117] "RemoveContainer" containerID="dccd87944f467ddb5529cfcedb8edff647919465ae664588de2062994aa283cb"
Apr 20 17:52:44.578506 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:52:44.578478 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dccd87944f467ddb5529cfcedb8edff647919465ae664588de2062994aa283cb\": container with ID starting with dccd87944f467ddb5529cfcedb8edff647919465ae664588de2062994aa283cb not found: ID does not exist" containerID="dccd87944f467ddb5529cfcedb8edff647919465ae664588de2062994aa283cb"
Apr 20 17:52:44.578577 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.578511 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dccd87944f467ddb5529cfcedb8edff647919465ae664588de2062994aa283cb"} err="failed to get container status \"dccd87944f467ddb5529cfcedb8edff647919465ae664588de2062994aa283cb\": rpc error: code = NotFound desc = could not find container \"dccd87944f467ddb5529cfcedb8edff647919465ae664588de2062994aa283cb\": container with ID starting with dccd87944f467ddb5529cfcedb8edff647919465ae664588de2062994aa283cb not found: ID does not exist"
Apr 20 17:52:44.578577 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.578538 2577 scope.go:117] "RemoveContainer" containerID="5d2c2d2ae8c6b60aced4bb0154e81a06dbb0b8cae35c4c440ff099cb5905ba5d"
Apr 20 17:52:44.578785 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:52:44.578765 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d2c2d2ae8c6b60aced4bb0154e81a06dbb0b8cae35c4c440ff099cb5905ba5d\": container with ID starting with 5d2c2d2ae8c6b60aced4bb0154e81a06dbb0b8cae35c4c440ff099cb5905ba5d not found: ID does not exist" containerID="5d2c2d2ae8c6b60aced4bb0154e81a06dbb0b8cae35c4c440ff099cb5905ba5d"
Apr 20 17:52:44.578830 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.578792 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d2c2d2ae8c6b60aced4bb0154e81a06dbb0b8cae35c4c440ff099cb5905ba5d"} err="failed to get container status \"5d2c2d2ae8c6b60aced4bb0154e81a06dbb0b8cae35c4c440ff099cb5905ba5d\": rpc error: code = NotFound desc = could not find container \"5d2c2d2ae8c6b60aced4bb0154e81a06dbb0b8cae35c4c440ff099cb5905ba5d\": container with ID starting with 5d2c2d2ae8c6b60aced4bb0154e81a06dbb0b8cae35c4c440ff099cb5905ba5d not found: ID does not exist"
Apr 20 17:52:44.578830 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.578810 2577 scope.go:117] "RemoveContainer" containerID="7874aff10a607af660bc5f6e50b16ad13abbf2990b2e2cc5dbdf07e95c74cbd5"
Apr 20 17:52:44.579051 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:52:44.579034 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7874aff10a607af660bc5f6e50b16ad13abbf2990b2e2cc5dbdf07e95c74cbd5\": container with ID starting with 7874aff10a607af660bc5f6e50b16ad13abbf2990b2e2cc5dbdf07e95c74cbd5 not found: ID does not exist" containerID="7874aff10a607af660bc5f6e50b16ad13abbf2990b2e2cc5dbdf07e95c74cbd5"
Apr 20 17:52:44.579104 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.579057 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7874aff10a607af660bc5f6e50b16ad13abbf2990b2e2cc5dbdf07e95c74cbd5"} err="failed to get container status \"7874aff10a607af660bc5f6e50b16ad13abbf2990b2e2cc5dbdf07e95c74cbd5\": rpc error: code = NotFound desc = could not find container \"7874aff10a607af660bc5f6e50b16ad13abbf2990b2e2cc5dbdf07e95c74cbd5\": container with ID starting with 7874aff10a607af660bc5f6e50b16ad13abbf2990b2e2cc5dbdf07e95c74cbd5 not found: ID does not exist"
Apr 20 17:52:44.579104 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.579071 2577 scope.go:117] "RemoveContainer" containerID="bc2cb2f181dc769c9de5dd9e1dc51a73eb37fe24796f167953e33bd9d63083ab"
Apr 20 17:52:44.579273 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:52:44.579259 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc2cb2f181dc769c9de5dd9e1dc51a73eb37fe24796f167953e33bd9d63083ab\": container with ID starting with bc2cb2f181dc769c9de5dd9e1dc51a73eb37fe24796f167953e33bd9d63083ab not found: ID does not exist" containerID="bc2cb2f181dc769c9de5dd9e1dc51a73eb37fe24796f167953e33bd9d63083ab"
Apr 20 17:52:44.579312 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.579278 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc2cb2f181dc769c9de5dd9e1dc51a73eb37fe24796f167953e33bd9d63083ab"} err="failed to get container status \"bc2cb2f181dc769c9de5dd9e1dc51a73eb37fe24796f167953e33bd9d63083ab\": rpc error: code = NotFound desc = could not find container \"bc2cb2f181dc769c9de5dd9e1dc51a73eb37fe24796f167953e33bd9d63083ab\": container with ID starting with bc2cb2f181dc769c9de5dd9e1dc51a73eb37fe24796f167953e33bd9d63083ab not found: ID does not exist"
Apr 20 17:52:44.579312 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.579292 2577 scope.go:117] "RemoveContainer" containerID="1ffc34fbed5864615a124516be84f7b302edc78a20f7015d87eb3b291470e0ae"
Apr 20 17:52:44.579478 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:52:44.579462 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ffc34fbed5864615a124516be84f7b302edc78a20f7015d87eb3b291470e0ae\": container with ID starting with 1ffc34fbed5864615a124516be84f7b302edc78a20f7015d87eb3b291470e0ae not found: ID does not exist" containerID="1ffc34fbed5864615a124516be84f7b302edc78a20f7015d87eb3b291470e0ae"
Apr 20 17:52:44.579518 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.579480 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ffc34fbed5864615a124516be84f7b302edc78a20f7015d87eb3b291470e0ae"} err="failed to get container status \"1ffc34fbed5864615a124516be84f7b302edc78a20f7015d87eb3b291470e0ae\": rpc error: code = NotFound desc = could not find container \"1ffc34fbed5864615a124516be84f7b302edc78a20f7015d87eb3b291470e0ae\": container with ID starting with 1ffc34fbed5864615a124516be84f7b302edc78a20f7015d87eb3b291470e0ae not found: ID does not exist"
Apr 20 17:52:44.579518 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.579493 2577 scope.go:117] "RemoveContainer" containerID="b3710f6d59a642baed1b632fd960de63bcdb5d666ea65457ca73a0ae36a9fd82"
Apr 20 17:52:44.579717 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:52:44.579702 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3710f6d59a642baed1b632fd960de63bcdb5d666ea65457ca73a0ae36a9fd82\": container with ID starting with b3710f6d59a642baed1b632fd960de63bcdb5d666ea65457ca73a0ae36a9fd82 not found: ID does not exist" containerID="b3710f6d59a642baed1b632fd960de63bcdb5d666ea65457ca73a0ae36a9fd82"
Apr 20 17:52:44.579763 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.579719 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3710f6d59a642baed1b632fd960de63bcdb5d666ea65457ca73a0ae36a9fd82"} err="failed to get container status \"b3710f6d59a642baed1b632fd960de63bcdb5d666ea65457ca73a0ae36a9fd82\": rpc error: code = NotFound desc = could not find container \"b3710f6d59a642baed1b632fd960de63bcdb5d666ea65457ca73a0ae36a9fd82\": container with ID starting with b3710f6d59a642baed1b632fd960de63bcdb5d666ea65457ca73a0ae36a9fd82 not found: ID does not exist"
Apr 20 17:52:44.579763 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.579730 2577 scope.go:117] "RemoveContainer" containerID="5e7d081f6473c372549f7e10ca5d9b5d8d9dd723708d04c524e73b202ba09746"
Apr 20 17:52:44.579915 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:52:44.579899 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e7d081f6473c372549f7e10ca5d9b5d8d9dd723708d04c524e73b202ba09746\": container with ID starting with 5e7d081f6473c372549f7e10ca5d9b5d8d9dd723708d04c524e73b202ba09746 not found: ID does not exist" containerID="5e7d081f6473c372549f7e10ca5d9b5d8d9dd723708d04c524e73b202ba09746"
Apr 20 17:52:44.579952 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.579920 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e7d081f6473c372549f7e10ca5d9b5d8d9dd723708d04c524e73b202ba09746"} err="failed to get container status \"5e7d081f6473c372549f7e10ca5d9b5d8d9dd723708d04c524e73b202ba09746\": rpc error: code = NotFound desc = could not find container \"5e7d081f6473c372549f7e10ca5d9b5d8d9dd723708d04c524e73b202ba09746\": container with ID starting with 5e7d081f6473c372549f7e10ca5d9b5d8d9dd723708d04c524e73b202ba09746 not found: ID does not exist"
Apr 20 17:52:44.579952 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.579933 2577 scope.go:117] "RemoveContainer" containerID="dccd87944f467ddb5529cfcedb8edff647919465ae664588de2062994aa283cb"
Apr 20 17:52:44.580156 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.580140 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dccd87944f467ddb5529cfcedb8edff647919465ae664588de2062994aa283cb"} err="failed to get container status \"dccd87944f467ddb5529cfcedb8edff647919465ae664588de2062994aa283cb\": rpc error: code = NotFound desc = could not find container \"dccd87944f467ddb5529cfcedb8edff647919465ae664588de2062994aa283cb\": container with ID starting with dccd87944f467ddb5529cfcedb8edff647919465ae664588de2062994aa283cb not found: ID does not exist"
Apr 20 17:52:44.580156 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.580154 2577 scope.go:117] "RemoveContainer" containerID="5d2c2d2ae8c6b60aced4bb0154e81a06dbb0b8cae35c4c440ff099cb5905ba5d"
Apr 20 17:52:44.580469 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.580349 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d2c2d2ae8c6b60aced4bb0154e81a06dbb0b8cae35c4c440ff099cb5905ba5d"} err="failed to get container status \"5d2c2d2ae8c6b60aced4bb0154e81a06dbb0b8cae35c4c440ff099cb5905ba5d\": rpc error: code = NotFound desc = could not find container \"5d2c2d2ae8c6b60aced4bb0154e81a06dbb0b8cae35c4c440ff099cb5905ba5d\": container with ID starting with 5d2c2d2ae8c6b60aced4bb0154e81a06dbb0b8cae35c4c440ff099cb5905ba5d not found: ID does not exist"
Apr 20 17:52:44.580469 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.580364 2577 scope.go:117] "RemoveContainer" containerID="7874aff10a607af660bc5f6e50b16ad13abbf2990b2e2cc5dbdf07e95c74cbd5"
Apr 20 17:52:44.580576 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.580550 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7874aff10a607af660bc5f6e50b16ad13abbf2990b2e2cc5dbdf07e95c74cbd5"} err="failed to get container status \"7874aff10a607af660bc5f6e50b16ad13abbf2990b2e2cc5dbdf07e95c74cbd5\": rpc error: code = NotFound desc = could not find container \"7874aff10a607af660bc5f6e50b16ad13abbf2990b2e2cc5dbdf07e95c74cbd5\": container with ID starting with 
7874aff10a607af660bc5f6e50b16ad13abbf2990b2e2cc5dbdf07e95c74cbd5 not found: ID does not exist" Apr 20 17:52:44.580623 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.580576 2577 scope.go:117] "RemoveContainer" containerID="bc2cb2f181dc769c9de5dd9e1dc51a73eb37fe24796f167953e33bd9d63083ab" Apr 20 17:52:44.580810 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.580794 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc2cb2f181dc769c9de5dd9e1dc51a73eb37fe24796f167953e33bd9d63083ab"} err="failed to get container status \"bc2cb2f181dc769c9de5dd9e1dc51a73eb37fe24796f167953e33bd9d63083ab\": rpc error: code = NotFound desc = could not find container \"bc2cb2f181dc769c9de5dd9e1dc51a73eb37fe24796f167953e33bd9d63083ab\": container with ID starting with bc2cb2f181dc769c9de5dd9e1dc51a73eb37fe24796f167953e33bd9d63083ab not found: ID does not exist" Apr 20 17:52:44.580865 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.580811 2577 scope.go:117] "RemoveContainer" containerID="1ffc34fbed5864615a124516be84f7b302edc78a20f7015d87eb3b291470e0ae" Apr 20 17:52:44.581055 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.581038 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ffc34fbed5864615a124516be84f7b302edc78a20f7015d87eb3b291470e0ae"} err="failed to get container status \"1ffc34fbed5864615a124516be84f7b302edc78a20f7015d87eb3b291470e0ae\": rpc error: code = NotFound desc = could not find container \"1ffc34fbed5864615a124516be84f7b302edc78a20f7015d87eb3b291470e0ae\": container with ID starting with 1ffc34fbed5864615a124516be84f7b302edc78a20f7015d87eb3b291470e0ae not found: ID does not exist" Apr 20 17:52:44.581099 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.581058 2577 scope.go:117] "RemoveContainer" containerID="b3710f6d59a642baed1b632fd960de63bcdb5d666ea65457ca73a0ae36a9fd82" Apr 20 17:52:44.581246 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.581228 2577 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3710f6d59a642baed1b632fd960de63bcdb5d666ea65457ca73a0ae36a9fd82"} err="failed to get container status \"b3710f6d59a642baed1b632fd960de63bcdb5d666ea65457ca73a0ae36a9fd82\": rpc error: code = NotFound desc = could not find container \"b3710f6d59a642baed1b632fd960de63bcdb5d666ea65457ca73a0ae36a9fd82\": container with ID starting with b3710f6d59a642baed1b632fd960de63bcdb5d666ea65457ca73a0ae36a9fd82 not found: ID does not exist" Apr 20 17:52:44.581285 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.581246 2577 scope.go:117] "RemoveContainer" containerID="5e7d081f6473c372549f7e10ca5d9b5d8d9dd723708d04c524e73b202ba09746" Apr 20 17:52:44.581499 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.581482 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e7d081f6473c372549f7e10ca5d9b5d8d9dd723708d04c524e73b202ba09746"} err="failed to get container status \"5e7d081f6473c372549f7e10ca5d9b5d8d9dd723708d04c524e73b202ba09746\": rpc error: code = NotFound desc = could not find container \"5e7d081f6473c372549f7e10ca5d9b5d8d9dd723708d04c524e73b202ba09746\": container with ID starting with 5e7d081f6473c372549f7e10ca5d9b5d8d9dd723708d04c524e73b202ba09746 not found: ID does not exist" Apr 20 17:52:44.585578 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.585559 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 17:52:44.585880 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.585867 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="init-config-reloader" Apr 20 17:52:44.585924 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.585882 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="init-config-reloader" Apr 20 17:52:44.585924 
ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.585894 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="kube-rbac-proxy-metric" Apr 20 17:52:44.585924 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.585900 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="kube-rbac-proxy-metric" Apr 20 17:52:44.585924 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.585909 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="kube-rbac-proxy-web" Apr 20 17:52:44.585924 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.585914 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="kube-rbac-proxy-web" Apr 20 17:52:44.585924 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.585919 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="prom-label-proxy" Apr 20 17:52:44.585924 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.585926 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="prom-label-proxy" Apr 20 17:52:44.586141 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.585937 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="kube-rbac-proxy" Apr 20 17:52:44.586141 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.585942 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="kube-rbac-proxy" Apr 20 17:52:44.586141 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.585947 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5070e42-5591-4136-8a46-32414bbab297" 
containerName="console" Apr 20 17:52:44.586141 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.585952 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5070e42-5591-4136-8a46-32414bbab297" containerName="console" Apr 20 17:52:44.586141 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.585961 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="alertmanager" Apr 20 17:52:44.586141 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.585966 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="alertmanager" Apr 20 17:52:44.586141 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.585972 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="config-reloader" Apr 20 17:52:44.586141 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.585977 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="config-reloader" Apr 20 17:52:44.586141 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.586039 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="kube-rbac-proxy" Apr 20 17:52:44.586141 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.586047 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="prom-label-proxy" Apr 20 17:52:44.586141 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.586054 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="kube-rbac-proxy-metric" Apr 20 17:52:44.586141 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.586061 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" 
containerName="alertmanager" Apr 20 17:52:44.586141 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.586067 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5070e42-5591-4136-8a46-32414bbab297" containerName="console" Apr 20 17:52:44.586141 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.586073 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="kube-rbac-proxy-web" Apr 20 17:52:44.586141 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.586079 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" containerName="config-reloader" Apr 20 17:52:44.589339 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.589325 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 17:52:44.591868 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.591845 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 17:52:44.591968 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.591890 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 17:52:44.592052 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.591964 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 17:52:44.592726 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.592262 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-qvclr\"" Apr 20 17:52:44.592726 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.592363 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" 
Apr 20 17:52:44.592726 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.592617 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 20 17:52:44.592726 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.592630 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 20 17:52:44.592726 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.592638 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 20 17:52:44.593035 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.592766 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 20 17:52:44.596886 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.596860 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 20 17:52:44.604495 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.604474 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 17:52:44.692827 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.692792 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4303a933-fab3-447a-9e7c-56cb1ac05945-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.693031 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.692835 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.693031 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.692876 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4303a933-fab3-447a-9e7c-56cb1ac05945-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.693031 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.692896 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4303a933-fab3-447a-9e7c-56cb1ac05945-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.693031 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.692919 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.693031 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.693012 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4303a933-fab3-447a-9e7c-56cb1ac05945-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.693232 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.693048 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62cd4\" (UniqueName: \"kubernetes.io/projected/4303a933-fab3-447a-9e7c-56cb1ac05945-kube-api-access-62cd4\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.693232 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.693071 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.693232 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.693147 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.693232 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.693222 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.693373 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.693257 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-web-config\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.693373 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.693284 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-config-volume\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.693373 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.693312 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4303a933-fab3-447a-9e7c-56cb1ac05945-config-out\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.695521 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.695501 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd5c0166-30aa-4c85-84ad-28d9a7223bf1" path="/var/lib/kubelet/pods/dd5c0166-30aa-4c85-84ad-28d9a7223bf1/volumes"
Apr 20 17:52:44.794291 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.794206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.794291 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.794245 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.794291 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.794266 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-web-config\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.794291 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.794292 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-config-volume\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.794616 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.794412 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4303a933-fab3-447a-9e7c-56cb1ac05945-config-out\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.794616 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.794464 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4303a933-fab3-447a-9e7c-56cb1ac05945-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.794616 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.794495 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.794616 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.794537 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4303a933-fab3-447a-9e7c-56cb1ac05945-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.794616 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.794571 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4303a933-fab3-447a-9e7c-56cb1ac05945-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.794616 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.794602 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.794911 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.794639 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4303a933-fab3-447a-9e7c-56cb1ac05945-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.794911 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.794668 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62cd4\" (UniqueName: \"kubernetes.io/projected/4303a933-fab3-447a-9e7c-56cb1ac05945-kube-api-access-62cd4\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.794911 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.794710 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.795498 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.795409 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4303a933-fab3-447a-9e7c-56cb1ac05945-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.795498 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.795423 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4303a933-fab3-447a-9e7c-56cb1ac05945-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.797417 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.797302 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4303a933-fab3-447a-9e7c-56cb1ac05945-config-out\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.797417 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.797346 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.797417 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.797404 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.797614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.797433 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.797614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.797576 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4303a933-fab3-447a-9e7c-56cb1ac05945-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.798053 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.798036 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.798136 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.798097 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-config-volume\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.798136 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.798120 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4303a933-fab3-447a-9e7c-56cb1ac05945-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.798366 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.798349 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-web-config\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.799331 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.799309 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4303a933-fab3-447a-9e7c-56cb1ac05945-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.804933 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.804910 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62cd4\" (UniqueName: \"kubernetes.io/projected/4303a933-fab3-447a-9e7c-56cb1ac05945-kube-api-access-62cd4\") pod \"alertmanager-main-0\" (UID: \"4303a933-fab3-447a-9e7c-56cb1ac05945\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:44.899677 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:44.899642 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 17:52:45.050367 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:45.050343 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 17:52:45.052394 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:52:45.052368 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4303a933_fab3_447a_9e7c_56cb1ac05945.slice/crio-f9c6f1b020196db5bf3ac2cc26b3421fedf03799c5e1a3f4b7dd0fd76bfb6ffb WatchSource:0}: Error finding container f9c6f1b020196db5bf3ac2cc26b3421fedf03799c5e1a3f4b7dd0fd76bfb6ffb: Status 404 returned error can't find the container with id f9c6f1b020196db5bf3ac2cc26b3421fedf03799c5e1a3f4b7dd0fd76bfb6ffb
Apr 20 17:52:45.530224 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:45.530189 2577 generic.go:358] "Generic (PLEG): container finished" podID="4303a933-fab3-447a-9e7c-56cb1ac05945" containerID="f5a400352bd086a024ba64bcb4fb15681cd15699016c4c08b6fc188fa9c075db" exitCode=0
Apr 20 17:52:45.530592 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:45.530232 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4303a933-fab3-447a-9e7c-56cb1ac05945","Type":"ContainerDied","Data":"f5a400352bd086a024ba64bcb4fb15681cd15699016c4c08b6fc188fa9c075db"}
Apr 20 17:52:45.530592 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:45.530255 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4303a933-fab3-447a-9e7c-56cb1ac05945","Type":"ContainerStarted","Data":"f9c6f1b020196db5bf3ac2cc26b3421fedf03799c5e1a3f4b7dd0fd76bfb6ffb"}
Apr 20 17:52:46.537172 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.537136 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4303a933-fab3-447a-9e7c-56cb1ac05945","Type":"ContainerStarted","Data":"df7b0d9d658802d7406252ddbfb1383d052fad8afb5c02e394ca58aa32581602"}
Apr 20 17:52:46.537526 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.537178 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4303a933-fab3-447a-9e7c-56cb1ac05945","Type":"ContainerStarted","Data":"93fbaaa85672d0628899cb5b9036ec61c177bf05b4c1eaa93cfc5cd1006f4d12"}
Apr 20 17:52:46.537526 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.537195 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4303a933-fab3-447a-9e7c-56cb1ac05945","Type":"ContainerStarted","Data":"76468a656ce0bcb102cb3567fb2473a9fcb455437651f65a380b1177b3bf82e7"}
Apr 20 17:52:46.537526 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.537208 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4303a933-fab3-447a-9e7c-56cb1ac05945","Type":"ContainerStarted","Data":"3a0e09cbb91c52fc0732e4038af551d5ca806f33ca0cc7828e99ea41e9e47ec5"}
Apr 20 17:52:46.537526 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.537221 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4303a933-fab3-447a-9e7c-56cb1ac05945","Type":"ContainerStarted","Data":"89331f27c0b1784d8f731d971ed19bdb0fb3fa63a856570cd9b6bffd3b6e77bb"}
Apr 20 17:52:46.537526 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.537234 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4303a933-fab3-447a-9e7c-56cb1ac05945","Type":"ContainerStarted","Data":"81f6c2d7b0db8dffde748af495f53bbc669d70741d1ab8757f3d6c4c7c886b20"}
Apr 20 17:52:46.560615 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.560576 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-b54957944-z5ggp"]
Apr 20 17:52:46.568581 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.567832 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-b54957944-z5ggp"
Apr 20 17:52:46.570342 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.570311 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 20 17:52:46.570770 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.570748 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 20 17:52:46.571052 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.570980 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.570966003 podStartE2EDuration="2.570966003s" podCreationTimestamp="2026-04-20 17:52:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:52:46.569598209 +0000 UTC m=+268.354953471" watchObservedRunningTime="2026-04-20 17:52:46.570966003 +0000 UTC m=+268.356321243"
Apr 20 17:52:46.571216 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.571192 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 20 17:52:46.571294 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.571239 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 20 17:52:46.571345 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.571317 2577 reflector.go:430] "Caches populated"
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-5tdmh\"" Apr 20 17:52:46.571544 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.571523 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 20 17:52:46.578771 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.578751 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 20 17:52:46.578875 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.578833 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-b54957944-z5ggp"] Apr 20 17:52:46.610482 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.610448 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/297b207e-fe93-479b-8abb-11c125da9ff9-metrics-client-ca\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.610671 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.610506 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/297b207e-fe93-479b-8abb-11c125da9ff9-federate-client-tls\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.610671 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.610591 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/297b207e-fe93-479b-8abb-11c125da9ff9-serving-certs-ca-bundle\") pod \"telemeter-client-b54957944-z5ggp\" (UID: 
\"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.610671 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.610619 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/297b207e-fe93-479b-8abb-11c125da9ff9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.610836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.610723 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/297b207e-fe93-479b-8abb-11c125da9ff9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.610836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.610779 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/297b207e-fe93-479b-8abb-11c125da9ff9-secret-telemeter-client\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.610836 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.610801 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p24dn\" (UniqueName: \"kubernetes.io/projected/297b207e-fe93-479b-8abb-11c125da9ff9-kube-api-access-p24dn\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" 
Apr 20 17:52:46.610974 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.610836 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/297b207e-fe93-479b-8abb-11c125da9ff9-telemeter-client-tls\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.711628 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.711591 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/297b207e-fe93-479b-8abb-11c125da9ff9-federate-client-tls\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.711804 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.711643 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/297b207e-fe93-479b-8abb-11c125da9ff9-serving-certs-ca-bundle\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.711804 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.711663 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/297b207e-fe93-479b-8abb-11c125da9ff9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.711804 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.711697 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" 
(UniqueName: \"kubernetes.io/secret/297b207e-fe93-479b-8abb-11c125da9ff9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.711804 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.711718 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/297b207e-fe93-479b-8abb-11c125da9ff9-secret-telemeter-client\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.711804 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.711735 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p24dn\" (UniqueName: \"kubernetes.io/projected/297b207e-fe93-479b-8abb-11c125da9ff9-kube-api-access-p24dn\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.711804 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.711765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/297b207e-fe93-479b-8abb-11c125da9ff9-telemeter-client-tls\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.712173 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.711821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/297b207e-fe93-479b-8abb-11c125da9ff9-metrics-client-ca\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " 
pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.712347 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.712322 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/297b207e-fe93-479b-8abb-11c125da9ff9-serving-certs-ca-bundle\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.712438 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.712395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/297b207e-fe93-479b-8abb-11c125da9ff9-metrics-client-ca\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.712696 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.712667 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/297b207e-fe93-479b-8abb-11c125da9ff9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.715021 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.714999 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/297b207e-fe93-479b-8abb-11c125da9ff9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.715191 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.715146 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/297b207e-fe93-479b-8abb-11c125da9ff9-telemeter-client-tls\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.715191 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.715159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/297b207e-fe93-479b-8abb-11c125da9ff9-secret-telemeter-client\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.715858 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.715838 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/297b207e-fe93-479b-8abb-11c125da9ff9-federate-client-tls\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.720070 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.719980 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p24dn\" (UniqueName: \"kubernetes.io/projected/297b207e-fe93-479b-8abb-11c125da9ff9-kube-api-access-p24dn\") pod \"telemeter-client-b54957944-z5ggp\" (UID: \"297b207e-fe93-479b-8abb-11c125da9ff9\") " pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:46.880648 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:46.880553 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" Apr 20 17:52:47.041132 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:47.041089 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-b54957944-z5ggp"] Apr 20 17:52:47.045204 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:52:47.045175 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod297b207e_fe93_479b_8abb_11c125da9ff9.slice/crio-69993f1eead902100a257655085e3042257a2e1398b742122bfa4606d9ac657f WatchSource:0}: Error finding container 69993f1eead902100a257655085e3042257a2e1398b742122bfa4606d9ac657f: Status 404 returned error can't find the container with id 69993f1eead902100a257655085e3042257a2e1398b742122bfa4606d9ac657f Apr 20 17:52:47.541230 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:47.541190 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" event={"ID":"297b207e-fe93-479b-8abb-11c125da9ff9","Type":"ContainerStarted","Data":"69993f1eead902100a257655085e3042257a2e1398b742122bfa4606d9ac657f"} Apr 20 17:52:49.548698 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:49.548665 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" event={"ID":"297b207e-fe93-479b-8abb-11c125da9ff9","Type":"ContainerStarted","Data":"b17db42707b30e29b4e46b0ba32e1ebf71f7e53a3c0c6573d94956418c11efa9"} Apr 20 17:52:49.548698 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:49.548704 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" event={"ID":"297b207e-fe93-479b-8abb-11c125da9ff9","Type":"ContainerStarted","Data":"e77d70dc9c19c5c88ab6979a225f0f0a9ecd5715664b98e47e9769c318609a2d"} Apr 20 17:52:49.549224 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:49.548714 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" event={"ID":"297b207e-fe93-479b-8abb-11c125da9ff9","Type":"ContainerStarted","Data":"5172f93140177df577244d0bf7a8767b7f90b581b8b803cf9b6687a1e72e694e"} Apr 20 17:52:49.572183 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:49.572124 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-b54957944-z5ggp" podStartSLOduration=1.4537671699999999 podStartE2EDuration="3.572108846s" podCreationTimestamp="2026-04-20 17:52:46 +0000 UTC" firstStartedPulling="2026-04-20 17:52:47.047116231 +0000 UTC m=+268.832471448" lastFinishedPulling="2026-04-20 17:52:49.165457892 +0000 UTC m=+270.950813124" observedRunningTime="2026-04-20 17:52:49.570888147 +0000 UTC m=+271.356243396" watchObservedRunningTime="2026-04-20 17:52:49.572108846 +0000 UTC m=+271.357464084" Apr 20 17:52:50.227069 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.227039 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b4c75dbb-xndq9"] Apr 20 17:52:50.230326 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.230309 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.245705 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.245681 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b4c75dbb-xndq9"] Apr 20 17:52:50.345577 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.345539 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-console-config\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.345577 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.345581 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/10cde242-afa1-4a1c-8c5d-2b47d99daecd-console-oauth-config\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.345847 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.345648 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-trusted-ca-bundle\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.345847 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.345708 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck7d7\" (UniqueName: \"kubernetes.io/projected/10cde242-afa1-4a1c-8c5d-2b47d99daecd-kube-api-access-ck7d7\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 
17:52:50.345847 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.345803 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/10cde242-afa1-4a1c-8c5d-2b47d99daecd-console-serving-cert\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.345847 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.345829 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-service-ca\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.346050 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.345858 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-oauth-serving-cert\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.446950 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.446900 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/10cde242-afa1-4a1c-8c5d-2b47d99daecd-console-serving-cert\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.447139 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.446968 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-service-ca\") pod \"console-7b4c75dbb-xndq9\" (UID: 
\"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.447139 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.447015 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-oauth-serving-cert\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.447139 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.447061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-console-config\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.447139 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.447091 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/10cde242-afa1-4a1c-8c5d-2b47d99daecd-console-oauth-config\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.447378 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.447149 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-trusted-ca-bundle\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.447378 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.447200 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ck7d7\" (UniqueName: 
\"kubernetes.io/projected/10cde242-afa1-4a1c-8c5d-2b47d99daecd-kube-api-access-ck7d7\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.447850 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.447822 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-service-ca\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.447970 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.447844 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-oauth-serving-cert\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.447970 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.447850 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-console-config\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.448114 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.447971 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-trusted-ca-bundle\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.449446 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.449415 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/10cde242-afa1-4a1c-8c5d-2b47d99daecd-console-serving-cert\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.449540 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.449512 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/10cde242-afa1-4a1c-8c5d-2b47d99daecd-console-oauth-config\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.455740 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.455720 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck7d7\" (UniqueName: \"kubernetes.io/projected/10cde242-afa1-4a1c-8c5d-2b47d99daecd-kube-api-access-ck7d7\") pod \"console-7b4c75dbb-xndq9\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") " pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.539048 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.538945 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b4c75dbb-xndq9" Apr 20 17:52:50.685966 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:50.685943 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b4c75dbb-xndq9"] Apr 20 17:52:50.688955 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:52:50.688925 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10cde242_afa1_4a1c_8c5d_2b47d99daecd.slice/crio-3b4b8b8c06b8212a92a3a8efd4ae4faf11be58c3fbdb345e23cadbb2e0177614 WatchSource:0}: Error finding container 3b4b8b8c06b8212a92a3a8efd4ae4faf11be58c3fbdb345e23cadbb2e0177614: Status 404 returned error can't find the container with id 3b4b8b8c06b8212a92a3a8efd4ae4faf11be58c3fbdb345e23cadbb2e0177614 Apr 20 17:52:51.560204 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:51.560161 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b4c75dbb-xndq9" event={"ID":"10cde242-afa1-4a1c-8c5d-2b47d99daecd","Type":"ContainerStarted","Data":"45dfdb2338b3f0d5bebcdd5fc2bb524d42d2c7a6717f3e12f512a404876257ae"} Apr 20 17:52:51.560204 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:51.560208 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b4c75dbb-xndq9" event={"ID":"10cde242-afa1-4a1c-8c5d-2b47d99daecd","Type":"ContainerStarted","Data":"3b4b8b8c06b8212a92a3a8efd4ae4faf11be58c3fbdb345e23cadbb2e0177614"} Apr 20 17:52:51.578151 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:52:51.578094 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b4c75dbb-xndq9" podStartSLOduration=1.578080658 podStartE2EDuration="1.578080658s" podCreationTimestamp="2026-04-20 17:52:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:52:51.576790438 +0000 UTC m=+273.362145678" 
watchObservedRunningTime="2026-04-20 17:52:51.578080658 +0000 UTC m=+273.363435897"
Apr 20 17:53:00.539390 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:00.539331 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b4c75dbb-xndq9"
Apr 20 17:53:00.539390 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:00.539407 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b4c75dbb-xndq9"
Apr 20 17:53:00.544004 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:00.543969 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b4c75dbb-xndq9"
Apr 20 17:53:00.590169 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:00.590142 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b4c75dbb-xndq9"
Apr 20 17:53:00.636334 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:00.636302 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f66cdd75b-lwp8g"]
Apr 20 17:53:18.613672 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:18.613644 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/ovn-acl-logging/0.log"
Apr 20 17:53:18.614167 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:18.614148 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/ovn-acl-logging/0.log"
Apr 20 17:53:18.619748 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:18.619731 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 17:53:25.656619 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:25.656569 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-f66cdd75b-lwp8g" podUID="acb369d6-c0db-4b6a-bb34-17bc911f2932" containerName="console" containerID="cri-o://6a98a747425dd7b43f6782ec204f968447cb14e7f365c255526b343590676132" gracePeriod=15
Apr 20 17:53:25.893937 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:25.893914 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f66cdd75b-lwp8g_acb369d6-c0db-4b6a-bb34-17bc911f2932/console/0.log"
Apr 20 17:53:25.894091 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:25.893972 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f66cdd75b-lwp8g"
Apr 20 17:53:25.951603 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:25.951567 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-trusted-ca-bundle\") pod \"acb369d6-c0db-4b6a-bb34-17bc911f2932\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") "
Apr 20 17:53:25.951766 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:25.951611 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snsnw\" (UniqueName: \"kubernetes.io/projected/acb369d6-c0db-4b6a-bb34-17bc911f2932-kube-api-access-snsnw\") pod \"acb369d6-c0db-4b6a-bb34-17bc911f2932\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") "
Apr 20 17:53:25.951766 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:25.951640 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-oauth-serving-cert\") pod \"acb369d6-c0db-4b6a-bb34-17bc911f2932\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") "
Apr 20 17:53:25.951766 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:25.951671 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/acb369d6-c0db-4b6a-bb34-17bc911f2932-console-serving-cert\") pod \"acb369d6-c0db-4b6a-bb34-17bc911f2932\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") "
Apr 20 17:53:25.951766 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:25.951697 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/acb369d6-c0db-4b6a-bb34-17bc911f2932-console-oauth-config\") pod \"acb369d6-c0db-4b6a-bb34-17bc911f2932\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") "
Apr 20 17:53:25.951766 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:25.951753 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-console-config\") pod \"acb369d6-c0db-4b6a-bb34-17bc911f2932\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") "
Apr 20 17:53:25.952133 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:25.951813 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-service-ca\") pod \"acb369d6-c0db-4b6a-bb34-17bc911f2932\" (UID: \"acb369d6-c0db-4b6a-bb34-17bc911f2932\") "
Apr 20 17:53:25.952204 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:25.952039 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "acb369d6-c0db-4b6a-bb34-17bc911f2932" (UID: "acb369d6-c0db-4b6a-bb34-17bc911f2932"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 17:53:25.952204 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:25.952160 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "acb369d6-c0db-4b6a-bb34-17bc911f2932" (UID: "acb369d6-c0db-4b6a-bb34-17bc911f2932"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 17:53:25.952356 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:25.952327 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-console-config" (OuterVolumeSpecName: "console-config") pod "acb369d6-c0db-4b6a-bb34-17bc911f2932" (UID: "acb369d6-c0db-4b6a-bb34-17bc911f2932"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 17:53:25.952440 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:25.952366 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-service-ca" (OuterVolumeSpecName: "service-ca") pod "acb369d6-c0db-4b6a-bb34-17bc911f2932" (UID: "acb369d6-c0db-4b6a-bb34-17bc911f2932"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 17:53:25.953918 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:25.953889 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb369d6-c0db-4b6a-bb34-17bc911f2932-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "acb369d6-c0db-4b6a-bb34-17bc911f2932" (UID: "acb369d6-c0db-4b6a-bb34-17bc911f2932"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 17:53:25.954035 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:25.953934 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb369d6-c0db-4b6a-bb34-17bc911f2932-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "acb369d6-c0db-4b6a-bb34-17bc911f2932" (UID: "acb369d6-c0db-4b6a-bb34-17bc911f2932"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 17:53:25.954035 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:25.953949 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb369d6-c0db-4b6a-bb34-17bc911f2932-kube-api-access-snsnw" (OuterVolumeSpecName: "kube-api-access-snsnw") pod "acb369d6-c0db-4b6a-bb34-17bc911f2932" (UID: "acb369d6-c0db-4b6a-bb34-17bc911f2932"). InnerVolumeSpecName "kube-api-access-snsnw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 17:53:26.052419 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:26.052385 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-console-config\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:53:26.052419 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:26.052413 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-service-ca\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:53:26.052419 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:26.052424 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-trusted-ca-bundle\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:53:26.052648 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:26.052433 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-snsnw\" (UniqueName: \"kubernetes.io/projected/acb369d6-c0db-4b6a-bb34-17bc911f2932-kube-api-access-snsnw\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:53:26.052648 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:26.052443 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/acb369d6-c0db-4b6a-bb34-17bc911f2932-oauth-serving-cert\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:53:26.052648 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:26.052452 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/acb369d6-c0db-4b6a-bb34-17bc911f2932-console-serving-cert\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:53:26.052648 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:26.052461 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/acb369d6-c0db-4b6a-bb34-17bc911f2932-console-oauth-config\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:53:26.666929 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:26.666903 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f66cdd75b-lwp8g_acb369d6-c0db-4b6a-bb34-17bc911f2932/console/0.log"
Apr 20 17:53:26.667333 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:26.666944 2577 generic.go:358] "Generic (PLEG): container finished" podID="acb369d6-c0db-4b6a-bb34-17bc911f2932" containerID="6a98a747425dd7b43f6782ec204f968447cb14e7f365c255526b343590676132" exitCode=2
Apr 20 17:53:26.667333 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:26.667031 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f66cdd75b-lwp8g"
Apr 20 17:53:26.667333 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:26.667034 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f66cdd75b-lwp8g" event={"ID":"acb369d6-c0db-4b6a-bb34-17bc911f2932","Type":"ContainerDied","Data":"6a98a747425dd7b43f6782ec204f968447cb14e7f365c255526b343590676132"}
Apr 20 17:53:26.667333 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:26.667071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f66cdd75b-lwp8g" event={"ID":"acb369d6-c0db-4b6a-bb34-17bc911f2932","Type":"ContainerDied","Data":"df036202315bac546577017d6e044241a11b663e893d46b8522ee8087452106c"}
Apr 20 17:53:26.667333 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:26.667087 2577 scope.go:117] "RemoveContainer" containerID="6a98a747425dd7b43f6782ec204f968447cb14e7f365c255526b343590676132"
Apr 20 17:53:26.675578 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:26.675556 2577 scope.go:117] "RemoveContainer" containerID="6a98a747425dd7b43f6782ec204f968447cb14e7f365c255526b343590676132"
Apr 20 17:53:26.675829 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:53:26.675811 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a98a747425dd7b43f6782ec204f968447cb14e7f365c255526b343590676132\": container with ID starting with 6a98a747425dd7b43f6782ec204f968447cb14e7f365c255526b343590676132 not found: ID does not exist" containerID="6a98a747425dd7b43f6782ec204f968447cb14e7f365c255526b343590676132"
Apr 20 17:53:26.675891 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:26.675842 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a98a747425dd7b43f6782ec204f968447cb14e7f365c255526b343590676132"} err="failed to get container status \"6a98a747425dd7b43f6782ec204f968447cb14e7f365c255526b343590676132\": rpc error: code = NotFound desc = could not find container \"6a98a747425dd7b43f6782ec204f968447cb14e7f365c255526b343590676132\": container with ID starting with 6a98a747425dd7b43f6782ec204f968447cb14e7f365c255526b343590676132 not found: ID does not exist"
Apr 20 17:53:26.687407 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:26.687378 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f66cdd75b-lwp8g"]
Apr 20 17:53:26.690544 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:26.690518 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f66cdd75b-lwp8g"]
Apr 20 17:53:26.694758 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:53:26.694736 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acb369d6-c0db-4b6a-bb34-17bc911f2932" path="/var/lib/kubelet/pods/acb369d6-c0db-4b6a-bb34-17bc911f2932/volumes"
Apr 20 17:54:06.395316 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.395238 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-684869d99c-km9fq"]
Apr 20 17:54:06.395737 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.395558 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="acb369d6-c0db-4b6a-bb34-17bc911f2932" containerName="console"
Apr 20 17:54:06.395737 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.395569 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb369d6-c0db-4b6a-bb34-17bc911f2932" containerName="console"
Apr 20 17:54:06.395737 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.395618 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="acb369d6-c0db-4b6a-bb34-17bc911f2932" containerName="console"
Apr 20 17:54:06.398517 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.398498 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.410786 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.410757 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-684869d99c-km9fq"]
Apr 20 17:54:06.494218 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.494180 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-oauth-serving-cert\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.494218 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.494223 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcafce03-8ee3-495c-b3eb-147db98319b1-console-serving-cert\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.494424 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.494247 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-service-ca\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.494424 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.494347 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-console-config\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.494424 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.494383 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-trusted-ca-bundle\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.494517 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.494457 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6zwk\" (UniqueName: \"kubernetes.io/projected/bcafce03-8ee3-495c-b3eb-147db98319b1-kube-api-access-w6zwk\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.494517 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.494497 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bcafce03-8ee3-495c-b3eb-147db98319b1-console-oauth-config\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.595853 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.595801 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-console-config\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.595853 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.595852 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-trusted-ca-bundle\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.596130 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.595889 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6zwk\" (UniqueName: \"kubernetes.io/projected/bcafce03-8ee3-495c-b3eb-147db98319b1-kube-api-access-w6zwk\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.596130 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.595931 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bcafce03-8ee3-495c-b3eb-147db98319b1-console-oauth-config\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.596130 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.596063 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-oauth-serving-cert\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.596130 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.596104 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcafce03-8ee3-495c-b3eb-147db98319b1-console-serving-cert\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.596372 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.596133 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-service-ca\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.596699 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.596673 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-console-config\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.596843 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.596765 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-oauth-serving-cert\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.597011 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.596966 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-service-ca\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.597091 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.597019 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-trusted-ca-bundle\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.598453 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.598429 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bcafce03-8ee3-495c-b3eb-147db98319b1-console-oauth-config\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.598536 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.598494 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcafce03-8ee3-495c-b3eb-147db98319b1-console-serving-cert\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.604672 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.604648 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6zwk\" (UniqueName: \"kubernetes.io/projected/bcafce03-8ee3-495c-b3eb-147db98319b1-kube-api-access-w6zwk\") pod \"console-684869d99c-km9fq\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.707912 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.707884 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:06.831892 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.831830 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-684869d99c-km9fq"]
Apr 20 17:54:06.839284 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:54:06.839259 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcafce03_8ee3_495c_b3eb_147db98319b1.slice/crio-ead3dc2da8dd6675026418dcfb8b786caec9458dd126b385be1e4a4206a7760f WatchSource:0}: Error finding container ead3dc2da8dd6675026418dcfb8b786caec9458dd126b385be1e4a4206a7760f: Status 404 returned error can't find the container with id ead3dc2da8dd6675026418dcfb8b786caec9458dd126b385be1e4a4206a7760f
Apr 20 17:54:06.843702 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:06.843682 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 17:54:07.784794 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:07.784753 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-684869d99c-km9fq" event={"ID":"bcafce03-8ee3-495c-b3eb-147db98319b1","Type":"ContainerStarted","Data":"921b7bb28fa039f9106b61bbcb7ecaefe85033ff6ebd66b369ff4562f510b4e5"}
Apr 20 17:54:07.784794 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:07.784792 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-684869d99c-km9fq" event={"ID":"bcafce03-8ee3-495c-b3eb-147db98319b1","Type":"ContainerStarted","Data":"ead3dc2da8dd6675026418dcfb8b786caec9458dd126b385be1e4a4206a7760f"}
Apr 20 17:54:07.801649 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:07.801580 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-684869d99c-km9fq" podStartSLOduration=1.801564513 podStartE2EDuration="1.801564513s" podCreationTimestamp="2026-04-20 17:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:54:07.800852029 +0000 UTC m=+349.586207265" watchObservedRunningTime="2026-04-20 17:54:07.801564513 +0000 UTC m=+349.586919752"
Apr 20 17:54:16.708775 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:16.708738 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:16.709198 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:16.708832 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:16.713663 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:16.713641 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:16.814867 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:16.814838 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-684869d99c-km9fq"
Apr 20 17:54:16.858441 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:16.858406 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b4c75dbb-xndq9"]
Apr 20 17:54:41.878930 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:41.878880 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7b4c75dbb-xndq9" podUID="10cde242-afa1-4a1c-8c5d-2b47d99daecd" containerName="console" containerID="cri-o://45dfdb2338b3f0d5bebcdd5fc2bb524d42d2c7a6717f3e12f512a404876257ae" gracePeriod=15
Apr 20 17:54:42.117232 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.117210 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b4c75dbb-xndq9_10cde242-afa1-4a1c-8c5d-2b47d99daecd/console/0.log"
Apr 20 17:54:42.117337 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.117269 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b4c75dbb-xndq9"
Apr 20 17:54:42.179708 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.179621 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-service-ca\") pod \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") "
Apr 20 17:54:42.179708 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.179662 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-trusted-ca-bundle\") pod \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") "
Apr 20 17:54:42.179708 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.179696 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/10cde242-afa1-4a1c-8c5d-2b47d99daecd-console-oauth-config\") pod \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") "
Apr 20 17:54:42.179972 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.179715 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/10cde242-afa1-4a1c-8c5d-2b47d99daecd-console-serving-cert\") pod \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") "
Apr 20 17:54:42.179972 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.179734 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-console-config\") pod \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") "
Apr 20 17:54:42.179972 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.179838 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-oauth-serving-cert\") pod \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") "
Apr 20 17:54:42.179972 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.179906 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck7d7\" (UniqueName: \"kubernetes.io/projected/10cde242-afa1-4a1c-8c5d-2b47d99daecd-kube-api-access-ck7d7\") pod \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\" (UID: \"10cde242-afa1-4a1c-8c5d-2b47d99daecd\") "
Apr 20 17:54:42.180189 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.180046 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-service-ca" (OuterVolumeSpecName: "service-ca") pod "10cde242-afa1-4a1c-8c5d-2b47d99daecd" (UID: "10cde242-afa1-4a1c-8c5d-2b47d99daecd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 17:54:42.180189 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.180157 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "10cde242-afa1-4a1c-8c5d-2b47d99daecd" (UID: "10cde242-afa1-4a1c-8c5d-2b47d99daecd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 17:54:42.180277 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.180178 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-console-config" (OuterVolumeSpecName: "console-config") pod "10cde242-afa1-4a1c-8c5d-2b47d99daecd" (UID: "10cde242-afa1-4a1c-8c5d-2b47d99daecd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 17:54:42.180277 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.180233 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "10cde242-afa1-4a1c-8c5d-2b47d99daecd" (UID: "10cde242-afa1-4a1c-8c5d-2b47d99daecd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 17:54:42.180343 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.180330 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-service-ca\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:54:42.180383 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.180343 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-trusted-ca-bundle\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:54:42.180383 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.180352 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-console-config\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:54:42.180383 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.180361 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/10cde242-afa1-4a1c-8c5d-2b47d99daecd-oauth-serving-cert\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:54:42.182029 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.182002 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10cde242-afa1-4a1c-8c5d-2b47d99daecd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "10cde242-afa1-4a1c-8c5d-2b47d99daecd" (UID: "10cde242-afa1-4a1c-8c5d-2b47d99daecd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 17:54:42.182109 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.182077 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10cde242-afa1-4a1c-8c5d-2b47d99daecd-kube-api-access-ck7d7" (OuterVolumeSpecName: "kube-api-access-ck7d7") pod "10cde242-afa1-4a1c-8c5d-2b47d99daecd" (UID: "10cde242-afa1-4a1c-8c5d-2b47d99daecd"). InnerVolumeSpecName "kube-api-access-ck7d7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 17:54:42.182170 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.182153 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10cde242-afa1-4a1c-8c5d-2b47d99daecd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "10cde242-afa1-4a1c-8c5d-2b47d99daecd" (UID: "10cde242-afa1-4a1c-8c5d-2b47d99daecd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 17:54:42.281472 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.281437 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/10cde242-afa1-4a1c-8c5d-2b47d99daecd-console-oauth-config\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:54:42.281472 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.281467 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/10cde242-afa1-4a1c-8c5d-2b47d99daecd-console-serving-cert\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:54:42.281653 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.281481 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ck7d7\" (UniqueName: \"kubernetes.io/projected/10cde242-afa1-4a1c-8c5d-2b47d99daecd-kube-api-access-ck7d7\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:54:42.887204 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.887176 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b4c75dbb-xndq9_10cde242-afa1-4a1c-8c5d-2b47d99daecd/console/0.log"
Apr 20 17:54:42.887664 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.887218 2577 generic.go:358] "Generic (PLEG): container finished" podID="10cde242-afa1-4a1c-8c5d-2b47d99daecd" containerID="45dfdb2338b3f0d5bebcdd5fc2bb524d42d2c7a6717f3e12f512a404876257ae" exitCode=2
Apr 20 17:54:42.887664 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.887261 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b4c75dbb-xndq9" event={"ID":"10cde242-afa1-4a1c-8c5d-2b47d99daecd","Type":"ContainerDied","Data":"45dfdb2338b3f0d5bebcdd5fc2bb524d42d2c7a6717f3e12f512a404876257ae"}
Apr 20 17:54:42.887664 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.887287 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b4c75dbb-xndq9" event={"ID":"10cde242-afa1-4a1c-8c5d-2b47d99daecd","Type":"ContainerDied","Data":"3b4b8b8c06b8212a92a3a8efd4ae4faf11be58c3fbdb345e23cadbb2e0177614"}
Apr 20 17:54:42.887664 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.887311 2577 scope.go:117] "RemoveContainer" containerID="45dfdb2338b3f0d5bebcdd5fc2bb524d42d2c7a6717f3e12f512a404876257ae"
Apr 20 17:54:42.887664 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.887330 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b4c75dbb-xndq9"
Apr 20 17:54:42.895156 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.895137 2577 scope.go:117] "RemoveContainer" containerID="45dfdb2338b3f0d5bebcdd5fc2bb524d42d2c7a6717f3e12f512a404876257ae"
Apr 20 17:54:42.895415 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:54:42.895398 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45dfdb2338b3f0d5bebcdd5fc2bb524d42d2c7a6717f3e12f512a404876257ae\": container with ID starting with 45dfdb2338b3f0d5bebcdd5fc2bb524d42d2c7a6717f3e12f512a404876257ae not found: ID does not exist" containerID="45dfdb2338b3f0d5bebcdd5fc2bb524d42d2c7a6717f3e12f512a404876257ae"
Apr 20 17:54:42.895462 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.895424 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45dfdb2338b3f0d5bebcdd5fc2bb524d42d2c7a6717f3e12f512a404876257ae"} err="failed to get container status \"45dfdb2338b3f0d5bebcdd5fc2bb524d42d2c7a6717f3e12f512a404876257ae\": rpc error: code = NotFound desc = could not find container \"45dfdb2338b3f0d5bebcdd5fc2bb524d42d2c7a6717f3e12f512a404876257ae\": container with ID starting with 45dfdb2338b3f0d5bebcdd5fc2bb524d42d2c7a6717f3e12f512a404876257ae not found: ID does not exist"
Apr 20 17:54:42.903706 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.903684 2577 kubelet.go:2553]
"SyncLoop DELETE" source="api" pods=["openshift-console/console-7b4c75dbb-xndq9"] Apr 20 17:54:42.907326 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:42.907306 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b4c75dbb-xndq9"] Apr 20 17:54:44.695922 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:44.695891 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10cde242-afa1-4a1c-8c5d-2b47d99daecd" path="/var/lib/kubelet/pods/10cde242-afa1-4a1c-8c5d-2b47d99daecd/volumes" Apr 20 17:54:58.221953 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.221918 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc"] Apr 20 17:54:58.222337 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.222249 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10cde242-afa1-4a1c-8c5d-2b47d99daecd" containerName="console" Apr 20 17:54:58.222337 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.222261 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="10cde242-afa1-4a1c-8c5d-2b47d99daecd" containerName="console" Apr 20 17:54:58.222337 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.222312 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="10cde242-afa1-4a1c-8c5d-2b47d99daecd" containerName="console" Apr 20 17:54:58.226686 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.226668 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc" Apr 20 17:54:58.229075 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.229048 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 17:54:58.229977 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.229962 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ds9v2\"" Apr 20 17:54:58.229977 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.229969 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 17:54:58.232936 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.232910 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc"] Apr 20 17:54:58.316329 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.316291 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/858417dc-cf03-4fe2-a342-9cbc1d94f665-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc\" (UID: \"858417dc-cf03-4fe2-a342-9cbc1d94f665\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc" Apr 20 17:54:58.316501 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.316369 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm874\" (UniqueName: \"kubernetes.io/projected/858417dc-cf03-4fe2-a342-9cbc1d94f665-kube-api-access-dm874\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc\" (UID: \"858417dc-cf03-4fe2-a342-9cbc1d94f665\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc" Apr 20 17:54:58.316501 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.316408 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/858417dc-cf03-4fe2-a342-9cbc1d94f665-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc\" (UID: \"858417dc-cf03-4fe2-a342-9cbc1d94f665\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc" Apr 20 17:54:58.417143 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.417106 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/858417dc-cf03-4fe2-a342-9cbc1d94f665-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc\" (UID: \"858417dc-cf03-4fe2-a342-9cbc1d94f665\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc" Apr 20 17:54:58.417343 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.417180 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/858417dc-cf03-4fe2-a342-9cbc1d94f665-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc\" (UID: \"858417dc-cf03-4fe2-a342-9cbc1d94f665\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc" Apr 20 17:54:58.417343 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.417249 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dm874\" (UniqueName: \"kubernetes.io/projected/858417dc-cf03-4fe2-a342-9cbc1d94f665-kube-api-access-dm874\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc\" (UID: \"858417dc-cf03-4fe2-a342-9cbc1d94f665\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc" Apr 20 17:54:58.417492 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.417471 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/858417dc-cf03-4fe2-a342-9cbc1d94f665-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc\" (UID: \"858417dc-cf03-4fe2-a342-9cbc1d94f665\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc" Apr 20 17:54:58.417549 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.417494 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/858417dc-cf03-4fe2-a342-9cbc1d94f665-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc\" (UID: \"858417dc-cf03-4fe2-a342-9cbc1d94f665\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc" Apr 20 17:54:58.425235 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.425213 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm874\" (UniqueName: \"kubernetes.io/projected/858417dc-cf03-4fe2-a342-9cbc1d94f665-kube-api-access-dm874\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc\" (UID: \"858417dc-cf03-4fe2-a342-9cbc1d94f665\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc" Apr 20 17:54:58.536391 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.536311 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc" Apr 20 17:54:58.655568 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.655430 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc"] Apr 20 17:54:58.658170 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:54:58.658139 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod858417dc_cf03_4fe2_a342_9cbc1d94f665.slice/crio-ddb02642b76930b52eab3f2f8d8ebd563d88e391b51a7e158e5fbbfea8e58567 WatchSource:0}: Error finding container ddb02642b76930b52eab3f2f8d8ebd563d88e391b51a7e158e5fbbfea8e58567: Status 404 returned error can't find the container with id ddb02642b76930b52eab3f2f8d8ebd563d88e391b51a7e158e5fbbfea8e58567 Apr 20 17:54:58.935807 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:54:58.935722 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc" event={"ID":"858417dc-cf03-4fe2-a342-9cbc1d94f665","Type":"ContainerStarted","Data":"ddb02642b76930b52eab3f2f8d8ebd563d88e391b51a7e158e5fbbfea8e58567"} Apr 20 17:55:04.959223 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:04.959185 2577 generic.go:358] "Generic (PLEG): container finished" podID="858417dc-cf03-4fe2-a342-9cbc1d94f665" containerID="24e0395751ca9dcf18d43c3be257836683918f08ebec81e564ecb100032e546a" exitCode=0 Apr 20 17:55:04.959695 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:04.959268 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc" event={"ID":"858417dc-cf03-4fe2-a342-9cbc1d94f665","Type":"ContainerDied","Data":"24e0395751ca9dcf18d43c3be257836683918f08ebec81e564ecb100032e546a"} Apr 20 17:55:07.969148 ip-10-0-135-49 kubenswrapper[2577]: I0420 
17:55:07.969116 2577 generic.go:358] "Generic (PLEG): container finished" podID="858417dc-cf03-4fe2-a342-9cbc1d94f665" containerID="1e358f53d58ad23ef4dfd051385ea97e70b7124515917fb0dfa02bb531bcc54a" exitCode=0 Apr 20 17:55:07.969560 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:07.969199 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc" event={"ID":"858417dc-cf03-4fe2-a342-9cbc1d94f665","Type":"ContainerDied","Data":"1e358f53d58ad23ef4dfd051385ea97e70b7124515917fb0dfa02bb531bcc54a"} Apr 20 17:55:15.999123 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:15.999085 2577 generic.go:358] "Generic (PLEG): container finished" podID="858417dc-cf03-4fe2-a342-9cbc1d94f665" containerID="1336fa8168333678ce722a4bd4343f27e5e6cebb839d2823a09638248a7fc461" exitCode=0 Apr 20 17:55:15.999618 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:15.999154 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc" event={"ID":"858417dc-cf03-4fe2-a342-9cbc1d94f665","Type":"ContainerDied","Data":"1336fa8168333678ce722a4bd4343f27e5e6cebb839d2823a09638248a7fc461"} Apr 20 17:55:17.126653 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:17.126629 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc" Apr 20 17:55:17.175100 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:17.175068 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/858417dc-cf03-4fe2-a342-9cbc1d94f665-util\") pod \"858417dc-cf03-4fe2-a342-9cbc1d94f665\" (UID: \"858417dc-cf03-4fe2-a342-9cbc1d94f665\") " Apr 20 17:55:17.175279 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:17.175132 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/858417dc-cf03-4fe2-a342-9cbc1d94f665-bundle\") pod \"858417dc-cf03-4fe2-a342-9cbc1d94f665\" (UID: \"858417dc-cf03-4fe2-a342-9cbc1d94f665\") " Apr 20 17:55:17.175279 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:17.175174 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm874\" (UniqueName: \"kubernetes.io/projected/858417dc-cf03-4fe2-a342-9cbc1d94f665-kube-api-access-dm874\") pod \"858417dc-cf03-4fe2-a342-9cbc1d94f665\" (UID: \"858417dc-cf03-4fe2-a342-9cbc1d94f665\") " Apr 20 17:55:17.175808 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:17.175782 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/858417dc-cf03-4fe2-a342-9cbc1d94f665-bundle" (OuterVolumeSpecName: "bundle") pod "858417dc-cf03-4fe2-a342-9cbc1d94f665" (UID: "858417dc-cf03-4fe2-a342-9cbc1d94f665"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:55:17.177402 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:17.177365 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/858417dc-cf03-4fe2-a342-9cbc1d94f665-kube-api-access-dm874" (OuterVolumeSpecName: "kube-api-access-dm874") pod "858417dc-cf03-4fe2-a342-9cbc1d94f665" (UID: "858417dc-cf03-4fe2-a342-9cbc1d94f665"). InnerVolumeSpecName "kube-api-access-dm874". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:55:17.179039 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:17.179018 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/858417dc-cf03-4fe2-a342-9cbc1d94f665-util" (OuterVolumeSpecName: "util") pod "858417dc-cf03-4fe2-a342-9cbc1d94f665" (UID: "858417dc-cf03-4fe2-a342-9cbc1d94f665"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:55:17.276761 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:17.276675 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dm874\" (UniqueName: \"kubernetes.io/projected/858417dc-cf03-4fe2-a342-9cbc1d94f665-kube-api-access-dm874\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:55:17.276761 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:17.276704 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/858417dc-cf03-4fe2-a342-9cbc1d94f665-util\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:55:17.276761 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:17.276715 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/858417dc-cf03-4fe2-a342-9cbc1d94f665-bundle\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:55:18.007442 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:18.007408 2577 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc" event={"ID":"858417dc-cf03-4fe2-a342-9cbc1d94f665","Type":"ContainerDied","Data":"ddb02642b76930b52eab3f2f8d8ebd563d88e391b51a7e158e5fbbfea8e58567"} Apr 20 17:55:18.007442 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:18.007444 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddb02642b76930b52eab3f2f8d8ebd563d88e391b51a7e158e5fbbfea8e58567" Apr 20 17:55:18.007442 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:18.007415 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dcr5kc" Apr 20 17:55:21.194553 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.194515 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4rsfp"] Apr 20 17:55:21.195047 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.194838 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="858417dc-cf03-4fe2-a342-9cbc1d94f665" containerName="util" Apr 20 17:55:21.195047 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.194850 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="858417dc-cf03-4fe2-a342-9cbc1d94f665" containerName="util" Apr 20 17:55:21.195047 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.194868 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="858417dc-cf03-4fe2-a342-9cbc1d94f665" containerName="pull" Apr 20 17:55:21.195047 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.194873 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="858417dc-cf03-4fe2-a342-9cbc1d94f665" containerName="pull" Apr 20 17:55:21.195047 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.194880 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="858417dc-cf03-4fe2-a342-9cbc1d94f665" containerName="extract" Apr 20 17:55:21.195047 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.194886 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="858417dc-cf03-4fe2-a342-9cbc1d94f665" containerName="extract" Apr 20 17:55:21.195047 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.194936 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="858417dc-cf03-4fe2-a342-9cbc1d94f665" containerName="extract" Apr 20 17:55:21.201961 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.201941 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4rsfp" Apr 20 17:55:21.205644 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.205623 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 20 17:55:21.205761 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.205698 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 20 17:55:21.206068 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.206052 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-fcv45\"" Apr 20 17:55:21.211070 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.211046 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4rsfp"] Apr 20 17:55:21.310141 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.310100 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/03918478-a355-40cc-b2f8-0cf25c49c255-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-4rsfp\" (UID: 
\"03918478-a355-40cc-b2f8-0cf25c49c255\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4rsfp" Apr 20 17:55:21.310316 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.310151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzj7k\" (UniqueName: \"kubernetes.io/projected/03918478-a355-40cc-b2f8-0cf25c49c255-kube-api-access-xzj7k\") pod \"cert-manager-operator-controller-manager-54b9655956-4rsfp\" (UID: \"03918478-a355-40cc-b2f8-0cf25c49c255\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4rsfp" Apr 20 17:55:21.410747 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.410712 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/03918478-a355-40cc-b2f8-0cf25c49c255-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-4rsfp\" (UID: \"03918478-a355-40cc-b2f8-0cf25c49c255\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4rsfp" Apr 20 17:55:21.410893 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.410764 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzj7k\" (UniqueName: \"kubernetes.io/projected/03918478-a355-40cc-b2f8-0cf25c49c255-kube-api-access-xzj7k\") pod \"cert-manager-operator-controller-manager-54b9655956-4rsfp\" (UID: \"03918478-a355-40cc-b2f8-0cf25c49c255\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4rsfp" Apr 20 17:55:21.411190 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.411170 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/03918478-a355-40cc-b2f8-0cf25c49c255-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-4rsfp\" (UID: \"03918478-a355-40cc-b2f8-0cf25c49c255\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4rsfp" Apr 20 17:55:21.419229 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.419206 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzj7k\" (UniqueName: \"kubernetes.io/projected/03918478-a355-40cc-b2f8-0cf25c49c255-kube-api-access-xzj7k\") pod \"cert-manager-operator-controller-manager-54b9655956-4rsfp\" (UID: \"03918478-a355-40cc-b2f8-0cf25c49c255\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4rsfp" Apr 20 17:55:21.512039 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.512011 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4rsfp" Apr 20 17:55:21.636325 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:21.636290 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4rsfp"] Apr 20 17:55:21.640548 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:55:21.640516 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03918478_a355_40cc_b2f8_0cf25c49c255.slice/crio-02c29fd20b2f7f46fa8e68e98aac3d6d8d7ab27dc9c7beb57b02e71bda0d50dc WatchSource:0}: Error finding container 02c29fd20b2f7f46fa8e68e98aac3d6d8d7ab27dc9c7beb57b02e71bda0d50dc: Status 404 returned error can't find the container with id 02c29fd20b2f7f46fa8e68e98aac3d6d8d7ab27dc9c7beb57b02e71bda0d50dc Apr 20 17:55:22.019564 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:22.019529 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4rsfp" event={"ID":"03918478-a355-40cc-b2f8-0cf25c49c255","Type":"ContainerStarted","Data":"02c29fd20b2f7f46fa8e68e98aac3d6d8d7ab27dc9c7beb57b02e71bda0d50dc"} Apr 20 17:55:24.028270 
ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:24.028232 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4rsfp" event={"ID":"03918478-a355-40cc-b2f8-0cf25c49c255","Type":"ContainerStarted","Data":"17c6bdff68ac83552b8eb0497fb066b8f65dcd684d9d564745204870e0c9f279"} Apr 20 17:55:24.051687 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:24.051622 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4rsfp" podStartSLOduration=1.557252074 podStartE2EDuration="3.051606533s" podCreationTimestamp="2026-04-20 17:55:21 +0000 UTC" firstStartedPulling="2026-04-20 17:55:21.643220022 +0000 UTC m=+423.428575253" lastFinishedPulling="2026-04-20 17:55:23.137574483 +0000 UTC m=+424.922929712" observedRunningTime="2026-04-20 17:55:24.04890608 +0000 UTC m=+425.834261318" watchObservedRunningTime="2026-04-20 17:55:24.051606533 +0000 UTC m=+425.836961772" Apr 20 17:55:25.442387 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.442343 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx"] Apr 20 17:55:25.446059 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.446034 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx" Apr 20 17:55:25.448514 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.448490 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ds9v2\"" Apr 20 17:55:25.448775 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.448751 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 17:55:25.449581 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.449562 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 17:55:25.457202 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.457175 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx"] Apr 20 17:55:25.549912 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.549882 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d53e1710-929d-44db-8e9f-d8ab6374a7a0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx\" (UID: \"d53e1710-929d-44db-8e9f-d8ab6374a7a0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx" Apr 20 17:55:25.550122 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.549929 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d53e1710-929d-44db-8e9f-d8ab6374a7a0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx\" (UID: \"d53e1710-929d-44db-8e9f-d8ab6374a7a0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx" Apr 20 17:55:25.550122 
ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.549951 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mcq2\" (UniqueName: \"kubernetes.io/projected/d53e1710-929d-44db-8e9f-d8ab6374a7a0-kube-api-access-4mcq2\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx\" (UID: \"d53e1710-929d-44db-8e9f-d8ab6374a7a0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx" Apr 20 17:55:25.651203 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.651167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d53e1710-929d-44db-8e9f-d8ab6374a7a0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx\" (UID: \"d53e1710-929d-44db-8e9f-d8ab6374a7a0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx" Apr 20 17:55:25.651383 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.651227 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d53e1710-929d-44db-8e9f-d8ab6374a7a0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx\" (UID: \"d53e1710-929d-44db-8e9f-d8ab6374a7a0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx" Apr 20 17:55:25.651383 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.651248 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mcq2\" (UniqueName: \"kubernetes.io/projected/d53e1710-929d-44db-8e9f-d8ab6374a7a0-kube-api-access-4mcq2\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx\" (UID: \"d53e1710-929d-44db-8e9f-d8ab6374a7a0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx" Apr 20 17:55:25.651550 ip-10-0-135-49 
kubenswrapper[2577]: I0420 17:55:25.651529 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d53e1710-929d-44db-8e9f-d8ab6374a7a0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx\" (UID: \"d53e1710-929d-44db-8e9f-d8ab6374a7a0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx" Apr 20 17:55:25.651602 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.651586 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d53e1710-929d-44db-8e9f-d8ab6374a7a0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx\" (UID: \"d53e1710-929d-44db-8e9f-d8ab6374a7a0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx" Apr 20 17:55:25.662084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.662052 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mcq2\" (UniqueName: \"kubernetes.io/projected/d53e1710-929d-44db-8e9f-d8ab6374a7a0-kube-api-access-4mcq2\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx\" (UID: \"d53e1710-929d-44db-8e9f-d8ab6374a7a0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx" Apr 20 17:55:25.758346 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.758311 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx" Apr 20 17:55:25.864604 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.864570 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-c87ln"] Apr 20 17:55:25.870051 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.869296 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-c87ln" Apr 20 17:55:25.871901 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.871873 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 17:55:25.871901 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.871875 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-fn9w9\"" Apr 20 17:55:25.872124 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.871980 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 17:55:25.877828 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.877802 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-c87ln"] Apr 20 17:55:25.883219 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.883199 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx"] Apr 20 17:55:25.888233 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:55:25.888207 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd53e1710_929d_44db_8e9f_d8ab6374a7a0.slice/crio-42bbd70afd8afcb779ff1227157596a366d380dfcee18437142bdaa8bc4cdffc WatchSource:0}: Error finding container 42bbd70afd8afcb779ff1227157596a366d380dfcee18437142bdaa8bc4cdffc: Status 404 returned error can't find the container with id 42bbd70afd8afcb779ff1227157596a366d380dfcee18437142bdaa8bc4cdffc Apr 20 17:55:25.954054 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.954027 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v87kn\" (UniqueName: \"kubernetes.io/projected/3d2097c2-bf7d-4c66-b403-eb681f517e53-kube-api-access-v87kn\") pod 
\"cert-manager-webhook-587ccfb98-c87ln\" (UID: \"3d2097c2-bf7d-4c66-b403-eb681f517e53\") " pod="cert-manager/cert-manager-webhook-587ccfb98-c87ln" Apr 20 17:55:25.954191 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:25.954104 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d2097c2-bf7d-4c66-b403-eb681f517e53-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-c87ln\" (UID: \"3d2097c2-bf7d-4c66-b403-eb681f517e53\") " pod="cert-manager/cert-manager-webhook-587ccfb98-c87ln" Apr 20 17:55:26.036414 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:26.036330 2577 generic.go:358] "Generic (PLEG): container finished" podID="d53e1710-929d-44db-8e9f-d8ab6374a7a0" containerID="f0333da571e76cbaf71b11cf7fb38be1173fa61fafb755ce944798f93ef60552" exitCode=0 Apr 20 17:55:26.036414 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:26.036406 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx" event={"ID":"d53e1710-929d-44db-8e9f-d8ab6374a7a0","Type":"ContainerDied","Data":"f0333da571e76cbaf71b11cf7fb38be1173fa61fafb755ce944798f93ef60552"} Apr 20 17:55:26.036598 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:26.036430 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx" event={"ID":"d53e1710-929d-44db-8e9f-d8ab6374a7a0","Type":"ContainerStarted","Data":"42bbd70afd8afcb779ff1227157596a366d380dfcee18437142bdaa8bc4cdffc"} Apr 20 17:55:26.055450 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:26.055424 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d2097c2-bf7d-4c66-b403-eb681f517e53-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-c87ln\" (UID: \"3d2097c2-bf7d-4c66-b403-eb681f517e53\") " 
pod="cert-manager/cert-manager-webhook-587ccfb98-c87ln" Apr 20 17:55:26.055573 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:26.055540 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v87kn\" (UniqueName: \"kubernetes.io/projected/3d2097c2-bf7d-4c66-b403-eb681f517e53-kube-api-access-v87kn\") pod \"cert-manager-webhook-587ccfb98-c87ln\" (UID: \"3d2097c2-bf7d-4c66-b403-eb681f517e53\") " pod="cert-manager/cert-manager-webhook-587ccfb98-c87ln" Apr 20 17:55:26.063600 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:26.063572 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v87kn\" (UniqueName: \"kubernetes.io/projected/3d2097c2-bf7d-4c66-b403-eb681f517e53-kube-api-access-v87kn\") pod \"cert-manager-webhook-587ccfb98-c87ln\" (UID: \"3d2097c2-bf7d-4c66-b403-eb681f517e53\") " pod="cert-manager/cert-manager-webhook-587ccfb98-c87ln" Apr 20 17:55:26.064035 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:26.064018 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d2097c2-bf7d-4c66-b403-eb681f517e53-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-c87ln\" (UID: \"3d2097c2-bf7d-4c66-b403-eb681f517e53\") " pod="cert-manager/cert-manager-webhook-587ccfb98-c87ln" Apr 20 17:55:26.189566 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:26.189533 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-c87ln" Apr 20 17:55:26.307238 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:26.307147 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-c87ln"] Apr 20 17:55:26.309814 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:55:26.309784 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d2097c2_bf7d_4c66_b403_eb681f517e53.slice/crio-da8fc82a6bb2dbb4d4d943e1c5d0595c3eb37e3c128fa7008ccb6b9d070f936f WatchSource:0}: Error finding container da8fc82a6bb2dbb4d4d943e1c5d0595c3eb37e3c128fa7008ccb6b9d070f936f: Status 404 returned error can't find the container with id da8fc82a6bb2dbb4d4d943e1c5d0595c3eb37e3c128fa7008ccb6b9d070f936f Apr 20 17:55:27.046367 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:27.046305 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-c87ln" event={"ID":"3d2097c2-bf7d-4c66-b403-eb681f517e53","Type":"ContainerStarted","Data":"da8fc82a6bb2dbb4d4d943e1c5d0595c3eb37e3c128fa7008ccb6b9d070f936f"} Apr 20 17:55:29.026854 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:29.026823 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-95jmm"] Apr 20 17:55:29.030191 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:29.030174 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-95jmm" Apr 20 17:55:29.032516 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:29.032498 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-4h4kh\"" Apr 20 17:55:29.038614 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:29.038594 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-95jmm"] Apr 20 17:55:29.083421 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:29.083307 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/153b4850-bb1c-4e09-8303-ba8218889ec0-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-95jmm\" (UID: \"153b4850-bb1c-4e09-8303-ba8218889ec0\") " pod="cert-manager/cert-manager-cainjector-68b757865b-95jmm" Apr 20 17:55:29.083421 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:29.083406 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5vks\" (UniqueName: \"kubernetes.io/projected/153b4850-bb1c-4e09-8303-ba8218889ec0-kube-api-access-q5vks\") pod \"cert-manager-cainjector-68b757865b-95jmm\" (UID: \"153b4850-bb1c-4e09-8303-ba8218889ec0\") " pod="cert-manager/cert-manager-cainjector-68b757865b-95jmm" Apr 20 17:55:29.185205 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:29.185181 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/153b4850-bb1c-4e09-8303-ba8218889ec0-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-95jmm\" (UID: \"153b4850-bb1c-4e09-8303-ba8218889ec0\") " pod="cert-manager/cert-manager-cainjector-68b757865b-95jmm" Apr 20 17:55:29.185324 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:29.185241 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-q5vks\" (UniqueName: \"kubernetes.io/projected/153b4850-bb1c-4e09-8303-ba8218889ec0-kube-api-access-q5vks\") pod \"cert-manager-cainjector-68b757865b-95jmm\" (UID: \"153b4850-bb1c-4e09-8303-ba8218889ec0\") " pod="cert-manager/cert-manager-cainjector-68b757865b-95jmm" Apr 20 17:55:29.194101 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:29.194079 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/153b4850-bb1c-4e09-8303-ba8218889ec0-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-95jmm\" (UID: \"153b4850-bb1c-4e09-8303-ba8218889ec0\") " pod="cert-manager/cert-manager-cainjector-68b757865b-95jmm" Apr 20 17:55:29.194253 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:29.194236 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5vks\" (UniqueName: \"kubernetes.io/projected/153b4850-bb1c-4e09-8303-ba8218889ec0-kube-api-access-q5vks\") pod \"cert-manager-cainjector-68b757865b-95jmm\" (UID: \"153b4850-bb1c-4e09-8303-ba8218889ec0\") " pod="cert-manager/cert-manager-cainjector-68b757865b-95jmm" Apr 20 17:55:29.341336 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:29.341249 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-95jmm" Apr 20 17:55:29.474680 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:29.474656 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-95jmm"] Apr 20 17:55:29.477213 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:55:29.477183 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod153b4850_bb1c_4e09_8303_ba8218889ec0.slice/crio-65dd9bf28b481a05eadfb0a06997bd335836aff13985c015b5792af3d763e320 WatchSource:0}: Error finding container 65dd9bf28b481a05eadfb0a06997bd335836aff13985c015b5792af3d763e320: Status 404 returned error can't find the container with id 65dd9bf28b481a05eadfb0a06997bd335836aff13985c015b5792af3d763e320 Apr 20 17:55:30.060353 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:30.060315 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-95jmm" event={"ID":"153b4850-bb1c-4e09-8303-ba8218889ec0","Type":"ContainerStarted","Data":"4c6ccf662bf2add9cf155a08f6eb13925b11bd65b453e24f8594a8e5beabb56f"} Apr 20 17:55:30.060353 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:30.060355 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-95jmm" event={"ID":"153b4850-bb1c-4e09-8303-ba8218889ec0","Type":"ContainerStarted","Data":"65dd9bf28b481a05eadfb0a06997bd335836aff13985c015b5792af3d763e320"} Apr 20 17:55:30.061652 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:30.061625 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-c87ln" event={"ID":"3d2097c2-bf7d-4c66-b403-eb681f517e53","Type":"ContainerStarted","Data":"7f244829248b3ebdef408a5d6702d0b590e69b5c26bd13d18dbd8083e3374342"} Apr 20 17:55:30.061772 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:30.061757 2577 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-c87ln" Apr 20 17:55:30.063140 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:30.063118 2577 generic.go:358] "Generic (PLEG): container finished" podID="d53e1710-929d-44db-8e9f-d8ab6374a7a0" containerID="11899abd96704092718b8988cb94f2c1c4c4e864599ac485ab416b6d28d744e0" exitCode=0 Apr 20 17:55:30.063223 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:30.063156 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx" event={"ID":"d53e1710-929d-44db-8e9f-d8ab6374a7a0","Type":"ContainerDied","Data":"11899abd96704092718b8988cb94f2c1c4c4e864599ac485ab416b6d28d744e0"} Apr 20 17:55:30.079251 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:30.079205 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-95jmm" podStartSLOduration=1.079189761 podStartE2EDuration="1.079189761s" podCreationTimestamp="2026-04-20 17:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:55:30.0781788 +0000 UTC m=+431.863534085" watchObservedRunningTime="2026-04-20 17:55:30.079189761 +0000 UTC m=+431.864545001" Apr 20 17:55:30.116047 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:30.115404 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-c87ln" podStartSLOduration=2.392955433 podStartE2EDuration="5.115385448s" podCreationTimestamp="2026-04-20 17:55:25 +0000 UTC" firstStartedPulling="2026-04-20 17:55:26.311720247 +0000 UTC m=+428.097075465" lastFinishedPulling="2026-04-20 17:55:29.034150264 +0000 UTC m=+430.819505480" observedRunningTime="2026-04-20 17:55:30.113808808 +0000 UTC m=+431.899164047" watchObservedRunningTime="2026-04-20 17:55:30.115385448 +0000 UTC 
m=+431.900740688" Apr 20 17:55:31.068761 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:31.068722 2577 generic.go:358] "Generic (PLEG): container finished" podID="d53e1710-929d-44db-8e9f-d8ab6374a7a0" containerID="f124454ce38b81faa7f203afb8f2c005c3e3dcc789abbc54a1a88c5daf161729" exitCode=0 Apr 20 17:55:31.069214 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:31.068810 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx" event={"ID":"d53e1710-929d-44db-8e9f-d8ab6374a7a0","Type":"ContainerDied","Data":"f124454ce38b81faa7f203afb8f2c005c3e3dcc789abbc54a1a88c5daf161729"} Apr 20 17:55:32.198003 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:32.197967 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx" Apr 20 17:55:32.312384 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:32.312294 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mcq2\" (UniqueName: \"kubernetes.io/projected/d53e1710-929d-44db-8e9f-d8ab6374a7a0-kube-api-access-4mcq2\") pod \"d53e1710-929d-44db-8e9f-d8ab6374a7a0\" (UID: \"d53e1710-929d-44db-8e9f-d8ab6374a7a0\") " Apr 20 17:55:32.312384 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:32.312367 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d53e1710-929d-44db-8e9f-d8ab6374a7a0-bundle\") pod \"d53e1710-929d-44db-8e9f-d8ab6374a7a0\" (UID: \"d53e1710-929d-44db-8e9f-d8ab6374a7a0\") " Apr 20 17:55:32.312607 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:32.312411 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d53e1710-929d-44db-8e9f-d8ab6374a7a0-util\") pod \"d53e1710-929d-44db-8e9f-d8ab6374a7a0\" (UID: 
\"d53e1710-929d-44db-8e9f-d8ab6374a7a0\") " Apr 20 17:55:32.312800 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:32.312764 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d53e1710-929d-44db-8e9f-d8ab6374a7a0-bundle" (OuterVolumeSpecName: "bundle") pod "d53e1710-929d-44db-8e9f-d8ab6374a7a0" (UID: "d53e1710-929d-44db-8e9f-d8ab6374a7a0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:55:32.314408 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:32.314381 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d53e1710-929d-44db-8e9f-d8ab6374a7a0-kube-api-access-4mcq2" (OuterVolumeSpecName: "kube-api-access-4mcq2") pod "d53e1710-929d-44db-8e9f-d8ab6374a7a0" (UID: "d53e1710-929d-44db-8e9f-d8ab6374a7a0"). InnerVolumeSpecName "kube-api-access-4mcq2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:55:32.317656 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:32.317632 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d53e1710-929d-44db-8e9f-d8ab6374a7a0-util" (OuterVolumeSpecName: "util") pod "d53e1710-929d-44db-8e9f-d8ab6374a7a0" (UID: "d53e1710-929d-44db-8e9f-d8ab6374a7a0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:55:32.413170 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:32.413079 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4mcq2\" (UniqueName: \"kubernetes.io/projected/d53e1710-929d-44db-8e9f-d8ab6374a7a0-kube-api-access-4mcq2\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:55:32.413170 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:32.413112 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d53e1710-929d-44db-8e9f-d8ab6374a7a0-bundle\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:55:32.413170 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:32.413126 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d53e1710-929d-44db-8e9f-d8ab6374a7a0-util\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:55:33.077573 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:33.077488 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx" event={"ID":"d53e1710-929d-44db-8e9f-d8ab6374a7a0","Type":"ContainerDied","Data":"42bbd70afd8afcb779ff1227157596a366d380dfcee18437142bdaa8bc4cdffc"} Apr 20 17:55:33.077573 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:33.077524 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42bbd70afd8afcb779ff1227157596a366d380dfcee18437142bdaa8bc4cdffc" Apr 20 17:55:33.077573 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:33.077532 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7zpzx" Apr 20 17:55:36.071917 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:36.071879 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-c87ln" Apr 20 17:55:45.380437 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.380406 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-b2l5s"] Apr 20 17:55:45.380883 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.380728 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d53e1710-929d-44db-8e9f-d8ab6374a7a0" containerName="pull" Apr 20 17:55:45.380883 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.380740 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d53e1710-929d-44db-8e9f-d8ab6374a7a0" containerName="pull" Apr 20 17:55:45.380883 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.380750 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d53e1710-929d-44db-8e9f-d8ab6374a7a0" containerName="extract" Apr 20 17:55:45.380883 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.380755 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d53e1710-929d-44db-8e9f-d8ab6374a7a0" containerName="extract" Apr 20 17:55:45.380883 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.380764 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d53e1710-929d-44db-8e9f-d8ab6374a7a0" containerName="util" Apr 20 17:55:45.380883 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.380770 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d53e1710-929d-44db-8e9f-d8ab6374a7a0" containerName="util" Apr 20 17:55:45.380883 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.380847 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d53e1710-929d-44db-8e9f-d8ab6374a7a0" 
containerName="extract" Apr 20 17:55:45.392291 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.392261 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-b2l5s" Apr 20 17:55:45.393459 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.393435 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-b2l5s"] Apr 20 17:55:45.394679 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.394662 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-szccr\"" Apr 20 17:55:45.526319 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.526283 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd77889e-9211-4338-9f1f-9ed4b3c62bf1-bound-sa-token\") pod \"cert-manager-79c8d999ff-b2l5s\" (UID: \"bd77889e-9211-4338-9f1f-9ed4b3c62bf1\") " pod="cert-manager/cert-manager-79c8d999ff-b2l5s" Apr 20 17:55:45.526319 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.526319 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fx9d\" (UniqueName: \"kubernetes.io/projected/bd77889e-9211-4338-9f1f-9ed4b3c62bf1-kube-api-access-8fx9d\") pod \"cert-manager-79c8d999ff-b2l5s\" (UID: \"bd77889e-9211-4338-9f1f-9ed4b3c62bf1\") " pod="cert-manager/cert-manager-79c8d999ff-b2l5s" Apr 20 17:55:45.627319 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.627286 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd77889e-9211-4338-9f1f-9ed4b3c62bf1-bound-sa-token\") pod \"cert-manager-79c8d999ff-b2l5s\" (UID: \"bd77889e-9211-4338-9f1f-9ed4b3c62bf1\") " pod="cert-manager/cert-manager-79c8d999ff-b2l5s" Apr 20 17:55:45.627319 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.627319 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fx9d\" (UniqueName: \"kubernetes.io/projected/bd77889e-9211-4338-9f1f-9ed4b3c62bf1-kube-api-access-8fx9d\") pod \"cert-manager-79c8d999ff-b2l5s\" (UID: \"bd77889e-9211-4338-9f1f-9ed4b3c62bf1\") " pod="cert-manager/cert-manager-79c8d999ff-b2l5s" Apr 20 17:55:45.634976 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.634917 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd77889e-9211-4338-9f1f-9ed4b3c62bf1-bound-sa-token\") pod \"cert-manager-79c8d999ff-b2l5s\" (UID: \"bd77889e-9211-4338-9f1f-9ed4b3c62bf1\") " pod="cert-manager/cert-manager-79c8d999ff-b2l5s" Apr 20 17:55:45.635100 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.635074 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fx9d\" (UniqueName: \"kubernetes.io/projected/bd77889e-9211-4338-9f1f-9ed4b3c62bf1-kube-api-access-8fx9d\") pod \"cert-manager-79c8d999ff-b2l5s\" (UID: \"bd77889e-9211-4338-9f1f-9ed4b3c62bf1\") " pod="cert-manager/cert-manager-79c8d999ff-b2l5s" Apr 20 17:55:45.702400 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.702357 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-b2l5s" Apr 20 17:55:45.823627 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:45.823594 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-b2l5s"] Apr 20 17:55:45.826632 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:55:45.826597 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd77889e_9211_4338_9f1f_9ed4b3c62bf1.slice/crio-89ce7dc35a3d83ef901114e8fda2c9d062fe9ac49984d8b6f0e7df92bd41afda WatchSource:0}: Error finding container 89ce7dc35a3d83ef901114e8fda2c9d062fe9ac49984d8b6f0e7df92bd41afda: Status 404 returned error can't find the container with id 89ce7dc35a3d83ef901114e8fda2c9d062fe9ac49984d8b6f0e7df92bd41afda Apr 20 17:55:46.124200 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:46.124160 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-b2l5s" event={"ID":"bd77889e-9211-4338-9f1f-9ed4b3c62bf1","Type":"ContainerStarted","Data":"b74a8800a5f8f95164a4c37cfad8ec67223057084d2f7b1463b6ca0c3095e8b7"} Apr 20 17:55:46.124200 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:46.124205 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-b2l5s" event={"ID":"bd77889e-9211-4338-9f1f-9ed4b3c62bf1","Type":"ContainerStarted","Data":"89ce7dc35a3d83ef901114e8fda2c9d062fe9ac49984d8b6f0e7df92bd41afda"} Apr 20 17:55:46.139979 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:46.139935 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-b2l5s" podStartSLOduration=1.139921747 podStartE2EDuration="1.139921747s" podCreationTimestamp="2026-04-20 17:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:55:46.138517531 +0000 UTC m=+447.923872767" 
watchObservedRunningTime="2026-04-20 17:55:46.139921747 +0000 UTC m=+447.925276986" Apr 20 17:55:49.223944 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:49.223913 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj"] Apr 20 17:55:49.227517 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:49.227502 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj" Apr 20 17:55:49.229942 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:49.229919 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 17:55:49.230084 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:49.229943 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 17:55:49.230849 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:49.230831 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ds9v2\"" Apr 20 17:55:49.235885 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:49.235859 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj"] Apr 20 17:55:49.357509 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:49.357483 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3ff8839-5cc3-418c-9ad9-daf13abe0820-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj\" (UID: \"d3ff8839-5cc3-418c-9ad9-daf13abe0820\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj" Apr 20 17:55:49.357668 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:49.357517 
2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8f8w\" (UniqueName: \"kubernetes.io/projected/d3ff8839-5cc3-418c-9ad9-daf13abe0820-kube-api-access-g8f8w\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj\" (UID: \"d3ff8839-5cc3-418c-9ad9-daf13abe0820\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj" Apr 20 17:55:49.357668 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:49.357620 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3ff8839-5cc3-418c-9ad9-daf13abe0820-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj\" (UID: \"d3ff8839-5cc3-418c-9ad9-daf13abe0820\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj" Apr 20 17:55:49.458867 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:49.458833 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3ff8839-5cc3-418c-9ad9-daf13abe0820-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj\" (UID: \"d3ff8839-5cc3-418c-9ad9-daf13abe0820\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj" Apr 20 17:55:49.459066 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:49.458871 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8f8w\" (UniqueName: \"kubernetes.io/projected/d3ff8839-5cc3-418c-9ad9-daf13abe0820-kube-api-access-g8f8w\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj\" (UID: \"d3ff8839-5cc3-418c-9ad9-daf13abe0820\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj" Apr 20 17:55:49.459066 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:49.458923 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3ff8839-5cc3-418c-9ad9-daf13abe0820-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj\" (UID: \"d3ff8839-5cc3-418c-9ad9-daf13abe0820\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj" Apr 20 17:55:49.459327 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:49.459299 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3ff8839-5cc3-418c-9ad9-daf13abe0820-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj\" (UID: \"d3ff8839-5cc3-418c-9ad9-daf13abe0820\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj" Apr 20 17:55:49.459406 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:49.459321 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3ff8839-5cc3-418c-9ad9-daf13abe0820-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj\" (UID: \"d3ff8839-5cc3-418c-9ad9-daf13abe0820\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj" Apr 20 17:55:49.466805 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:49.466784 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8f8w\" (UniqueName: \"kubernetes.io/projected/d3ff8839-5cc3-418c-9ad9-daf13abe0820-kube-api-access-g8f8w\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj\" (UID: \"d3ff8839-5cc3-418c-9ad9-daf13abe0820\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj" Apr 20 17:55:49.537789 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:49.537721 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj"
Apr 20 17:55:49.661363 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:49.661282 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj"]
Apr 20 17:55:49.664095 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:55:49.664070 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3ff8839_5cc3_418c_9ad9_daf13abe0820.slice/crio-4caa9f20f2784d1b8579306decc4cbde53c2adfd247324c64d8ec37d9366115a WatchSource:0}: Error finding container 4caa9f20f2784d1b8579306decc4cbde53c2adfd247324c64d8ec37d9366115a: Status 404 returned error can't find the container with id 4caa9f20f2784d1b8579306decc4cbde53c2adfd247324c64d8ec37d9366115a
Apr 20 17:55:50.138787 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:50.138752 2577 generic.go:358] "Generic (PLEG): container finished" podID="d3ff8839-5cc3-418c-9ad9-daf13abe0820" containerID="4154749bfe265e7acbc7b4c3c9d56c69cc9b611a2cef1402ccc79086f407c5de" exitCode=0
Apr 20 17:55:50.138958 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:50.138841 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj" event={"ID":"d3ff8839-5cc3-418c-9ad9-daf13abe0820","Type":"ContainerDied","Data":"4154749bfe265e7acbc7b4c3c9d56c69cc9b611a2cef1402ccc79086f407c5de"}
Apr 20 17:55:50.138958 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:50.138878 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj" event={"ID":"d3ff8839-5cc3-418c-9ad9-daf13abe0820","Type":"ContainerStarted","Data":"4caa9f20f2784d1b8579306decc4cbde53c2adfd247324c64d8ec37d9366115a"}
Apr 20 17:55:51.144185 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:51.144095 2577 generic.go:358] "Generic (PLEG): container finished" podID="d3ff8839-5cc3-418c-9ad9-daf13abe0820" containerID="24e2580596168ce4415cb6052e6dd1b683aa84d524172cdba66ea72882c1c722" exitCode=0
Apr 20 17:55:51.144610 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:51.144188 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj" event={"ID":"d3ff8839-5cc3-418c-9ad9-daf13abe0820","Type":"ContainerDied","Data":"24e2580596168ce4415cb6052e6dd1b683aa84d524172cdba66ea72882c1c722"}
Apr 20 17:55:52.149702 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:52.149667 2577 generic.go:358] "Generic (PLEG): container finished" podID="d3ff8839-5cc3-418c-9ad9-daf13abe0820" containerID="7fab76b33523e4573f2b63fc7546f3ba6995732c0acb0392c87fc7e8945e27e0" exitCode=0
Apr 20 17:55:52.150155 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:52.149742 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj" event={"ID":"d3ff8839-5cc3-418c-9ad9-daf13abe0820","Type":"ContainerDied","Data":"7fab76b33523e4573f2b63fc7546f3ba6995732c0acb0392c87fc7e8945e27e0"}
Apr 20 17:55:53.280227 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:53.280206 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj"
Apr 20 17:55:53.398376 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:53.398343 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3ff8839-5cc3-418c-9ad9-daf13abe0820-bundle\") pod \"d3ff8839-5cc3-418c-9ad9-daf13abe0820\" (UID: \"d3ff8839-5cc3-418c-9ad9-daf13abe0820\") "
Apr 20 17:55:53.398549 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:53.398435 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8f8w\" (UniqueName: \"kubernetes.io/projected/d3ff8839-5cc3-418c-9ad9-daf13abe0820-kube-api-access-g8f8w\") pod \"d3ff8839-5cc3-418c-9ad9-daf13abe0820\" (UID: \"d3ff8839-5cc3-418c-9ad9-daf13abe0820\") "
Apr 20 17:55:53.398549 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:53.398487 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3ff8839-5cc3-418c-9ad9-daf13abe0820-util\") pod \"d3ff8839-5cc3-418c-9ad9-daf13abe0820\" (UID: \"d3ff8839-5cc3-418c-9ad9-daf13abe0820\") "
Apr 20 17:55:53.399100 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:53.399070 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3ff8839-5cc3-418c-9ad9-daf13abe0820-bundle" (OuterVolumeSpecName: "bundle") pod "d3ff8839-5cc3-418c-9ad9-daf13abe0820" (UID: "d3ff8839-5cc3-418c-9ad9-daf13abe0820"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 17:55:53.400532 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:53.400503 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ff8839-5cc3-418c-9ad9-daf13abe0820-kube-api-access-g8f8w" (OuterVolumeSpecName: "kube-api-access-g8f8w") pod "d3ff8839-5cc3-418c-9ad9-daf13abe0820" (UID: "d3ff8839-5cc3-418c-9ad9-daf13abe0820"). InnerVolumeSpecName "kube-api-access-g8f8w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 17:55:53.404247 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:53.404199 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3ff8839-5cc3-418c-9ad9-daf13abe0820-util" (OuterVolumeSpecName: "util") pod "d3ff8839-5cc3-418c-9ad9-daf13abe0820" (UID: "d3ff8839-5cc3-418c-9ad9-daf13abe0820"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 17:55:53.499182 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:53.499146 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g8f8w\" (UniqueName: \"kubernetes.io/projected/d3ff8839-5cc3-418c-9ad9-daf13abe0820-kube-api-access-g8f8w\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:55:53.499182 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:53.499176 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3ff8839-5cc3-418c-9ad9-daf13abe0820-util\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:55:53.499182 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:53.499189 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3ff8839-5cc3-418c-9ad9-daf13abe0820-bundle\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:55:54.158658 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:54.158625 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj" event={"ID":"d3ff8839-5cc3-418c-9ad9-daf13abe0820","Type":"ContainerDied","Data":"4caa9f20f2784d1b8579306decc4cbde53c2adfd247324c64d8ec37d9366115a"}
Apr 20 17:55:54.158658 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:54.158656 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4caa9f20f2784d1b8579306decc4cbde53c2adfd247324c64d8ec37d9366115a"
Apr 20 17:55:54.158858 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:54.158675 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ts9wj"
Apr 20 17:55:59.826512 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.826474 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27"]
Apr 20 17:55:59.826883 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.826808 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3ff8839-5cc3-418c-9ad9-daf13abe0820" containerName="pull"
Apr 20 17:55:59.826883 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.826820 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ff8839-5cc3-418c-9ad9-daf13abe0820" containerName="pull"
Apr 20 17:55:59.826883 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.826832 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3ff8839-5cc3-418c-9ad9-daf13abe0820" containerName="extract"
Apr 20 17:55:59.826883 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.826838 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ff8839-5cc3-418c-9ad9-daf13abe0820" containerName="extract"
Apr 20 17:55:59.826883 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.826855 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3ff8839-5cc3-418c-9ad9-daf13abe0820" containerName="util"
Apr 20 17:55:59.826883 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.826861 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ff8839-5cc3-418c-9ad9-daf13abe0820" containerName="util"
Apr 20 17:55:59.827134 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.826909 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3ff8839-5cc3-418c-9ad9-daf13abe0820" containerName="extract"
Apr 20 17:55:59.831152 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.831133 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27"
Apr 20 17:55:59.833580 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.833552 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 17:55:59.834507 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.834489 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 17:55:59.834637 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.834538 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ds9v2\""
Apr 20 17:55:59.841677 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.841653 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27"]
Apr 20 17:55:59.852885 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.852863 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27\" (UID: \"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27"
Apr 20 17:55:59.852975 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.852902 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27\" (UID: \"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27"
Apr 20 17:55:59.853058 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.853004 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqt98\" (UniqueName: \"kubernetes.io/projected/2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba-kube-api-access-rqt98\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27\" (UID: \"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27"
Apr 20 17:55:59.953511 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.953475 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27\" (UID: \"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27"
Apr 20 17:55:59.953659 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.953523 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27\" (UID: \"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27"
Apr 20 17:55:59.953659 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.953613 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqt98\" (UniqueName: \"kubernetes.io/projected/2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba-kube-api-access-rqt98\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27\" (UID: \"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27"
Apr 20 17:55:59.953850 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.953830 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27\" (UID: \"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27"
Apr 20 17:55:59.953927 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.953908 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27\" (UID: \"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27"
Apr 20 17:55:59.968677 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:55:59.968654 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqt98\" (UniqueName: \"kubernetes.io/projected/2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba-kube-api-access-rqt98\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27\" (UID: \"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27"
Apr 20 17:56:00.140543 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:00.140442 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27"
Apr 20 17:56:00.280110 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:00.280087 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27"]
Apr 20 17:56:00.282964 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:56:00.282939 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ac6a081_f22a_4c1b_a95d_d5f3ae2851ba.slice/crio-db65a26032bc30dddb5f9a48860908c5a725121e31f2c06a23c3fa9d7536df3d WatchSource:0}: Error finding container db65a26032bc30dddb5f9a48860908c5a725121e31f2c06a23c3fa9d7536df3d: Status 404 returned error can't find the container with id db65a26032bc30dddb5f9a48860908c5a725121e31f2c06a23c3fa9d7536df3d
Apr 20 17:56:01.184861 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.184819 2577 generic.go:358] "Generic (PLEG): container finished" podID="2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba" containerID="80351f64ee3c405f4878ffc5035130f191bb8ede01f28a321f129a866b5e377b" exitCode=0
Apr 20 17:56:01.184861 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.184863 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27" event={"ID":"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba","Type":"ContainerDied","Data":"80351f64ee3c405f4878ffc5035130f191bb8ede01f28a321f129a866b5e377b"}
Apr 20 17:56:01.185356 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.184887 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27" event={"ID":"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba","Type":"ContainerStarted","Data":"db65a26032bc30dddb5f9a48860908c5a725121e31f2c06a23c3fa9d7536df3d"}
Apr 20 17:56:01.816423 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.816395 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7"]
Apr 20 17:56:01.820832 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.820814 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7"
Apr 20 17:56:01.823590 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.823569 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-sc5qn\""
Apr 20 17:56:01.824273 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.824257 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 20 17:56:01.824375 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.824322 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 20 17:56:01.824490 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.824476 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 20 17:56:01.824549 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.824477 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 20 17:56:01.832942 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.832921 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7"]
Apr 20 17:56:01.865294 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.865263 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d761368-405c-4e31-ab43-47d4afe6b6e2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-b8c4c7886-lhhn7\" (UID: \"2d761368-405c-4e31-ab43-47d4afe6b6e2\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7"
Apr 20 17:56:01.865423 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.865309 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79tjq\" (UniqueName: \"kubernetes.io/projected/2d761368-405c-4e31-ab43-47d4afe6b6e2-kube-api-access-79tjq\") pod \"opendatahub-operator-controller-manager-b8c4c7886-lhhn7\" (UID: \"2d761368-405c-4e31-ab43-47d4afe6b6e2\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7"
Apr 20 17:56:01.865423 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.865328 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d761368-405c-4e31-ab43-47d4afe6b6e2-webhook-cert\") pod \"opendatahub-operator-controller-manager-b8c4c7886-lhhn7\" (UID: \"2d761368-405c-4e31-ab43-47d4afe6b6e2\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7"
Apr 20 17:56:01.965858 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.965835 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d761368-405c-4e31-ab43-47d4afe6b6e2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-b8c4c7886-lhhn7\" (UID: \"2d761368-405c-4e31-ab43-47d4afe6b6e2\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7"
Apr 20 17:56:01.965950 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.965883 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79tjq\" (UniqueName: \"kubernetes.io/projected/2d761368-405c-4e31-ab43-47d4afe6b6e2-kube-api-access-79tjq\") pod \"opendatahub-operator-controller-manager-b8c4c7886-lhhn7\" (UID: \"2d761368-405c-4e31-ab43-47d4afe6b6e2\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7"
Apr 20 17:56:01.965950 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.965905 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d761368-405c-4e31-ab43-47d4afe6b6e2-webhook-cert\") pod \"opendatahub-operator-controller-manager-b8c4c7886-lhhn7\" (UID: \"2d761368-405c-4e31-ab43-47d4afe6b6e2\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7"
Apr 20 17:56:01.968025 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.968003 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d761368-405c-4e31-ab43-47d4afe6b6e2-webhook-cert\") pod \"opendatahub-operator-controller-manager-b8c4c7886-lhhn7\" (UID: \"2d761368-405c-4e31-ab43-47d4afe6b6e2\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7"
Apr 20 17:56:01.968112 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.968038 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d761368-405c-4e31-ab43-47d4afe6b6e2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-b8c4c7886-lhhn7\" (UID: \"2d761368-405c-4e31-ab43-47d4afe6b6e2\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7"
Apr 20 17:56:01.975151 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:01.975131 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79tjq\" (UniqueName: \"kubernetes.io/projected/2d761368-405c-4e31-ab43-47d4afe6b6e2-kube-api-access-79tjq\") pod \"opendatahub-operator-controller-manager-b8c4c7886-lhhn7\" (UID: \"2d761368-405c-4e31-ab43-47d4afe6b6e2\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7"
Apr 20 17:56:02.132671 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:02.132581 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7"
Apr 20 17:56:02.193041 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:02.192973 2577 generic.go:358] "Generic (PLEG): container finished" podID="2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba" containerID="97aa11fc35d2e7d66b4bc0484c3043ed541e88836639c7e4033e0d9f42ad804a" exitCode=0
Apr 20 17:56:02.193375 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:02.193127 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27" event={"ID":"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba","Type":"ContainerDied","Data":"97aa11fc35d2e7d66b4bc0484c3043ed541e88836639c7e4033e0d9f42ad804a"}
Apr 20 17:56:02.296692 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:02.296666 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7"]
Apr 20 17:56:02.299246 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:56:02.299219 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d761368_405c_4e31_ab43_47d4afe6b6e2.slice/crio-fd9935e311d35d0939dd8e33383b69c835a0d89419f51939f81fa4e0aacd99ce WatchSource:0}: Error finding container fd9935e311d35d0939dd8e33383b69c835a0d89419f51939f81fa4e0aacd99ce: Status 404 returned error can't find the container with id fd9935e311d35d0939dd8e33383b69c835a0d89419f51939f81fa4e0aacd99ce
Apr 20 17:56:03.199001 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:03.198935 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7" event={"ID":"2d761368-405c-4e31-ab43-47d4afe6b6e2","Type":"ContainerStarted","Data":"fd9935e311d35d0939dd8e33383b69c835a0d89419f51939f81fa4e0aacd99ce"}
Apr 20 17:56:03.201369 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:03.201336 2577 generic.go:358] "Generic (PLEG): container finished" podID="2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba" containerID="60bed37d24d0c76707088378445d82f3dc7be44f6fce5dca1ec822f3494895b3" exitCode=0
Apr 20 17:56:03.201512 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:03.201388 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27" event={"ID":"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba","Type":"ContainerDied","Data":"60bed37d24d0c76707088378445d82f3dc7be44f6fce5dca1ec822f3494895b3"}
Apr 20 17:56:04.765384 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:04.765360 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27"
Apr 20 17:56:04.786853 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:04.786823 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqt98\" (UniqueName: \"kubernetes.io/projected/2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba-kube-api-access-rqt98\") pod \"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba\" (UID: \"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba\") "
Apr 20 17:56:04.787008 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:04.786880 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba-util\") pod \"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba\" (UID: \"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba\") "
Apr 20 17:56:04.787008 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:04.786968 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba-bundle\") pod \"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba\" (UID: \"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba\") "
Apr 20 17:56:04.788409 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:04.788373 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba-bundle" (OuterVolumeSpecName: "bundle") pod "2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba" (UID: "2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 17:56:04.789199 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:04.789174 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba-kube-api-access-rqt98" (OuterVolumeSpecName: "kube-api-access-rqt98") pod "2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba" (UID: "2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba"). InnerVolumeSpecName "kube-api-access-rqt98". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 17:56:04.795369 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:04.795340 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba-util" (OuterVolumeSpecName: "util") pod "2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba" (UID: "2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 17:56:04.888352 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:04.888323 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rqt98\" (UniqueName: \"kubernetes.io/projected/2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba-kube-api-access-rqt98\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:56:04.888352 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:04.888349 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba-util\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:56:04.888352 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:04.888359 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba-bundle\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:56:05.212592 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:05.212552 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27" event={"ID":"2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba","Type":"ContainerDied","Data":"db65a26032bc30dddb5f9a48860908c5a725121e31f2c06a23c3fa9d7536df3d"}
Apr 20 17:56:05.212592 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:05.212598 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db65a26032bc30dddb5f9a48860908c5a725121e31f2c06a23c3fa9d7536df3d"
Apr 20 17:56:05.212851 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:05.212569 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9s2s27"
Apr 20 17:56:05.213946 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:05.213920 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7" event={"ID":"2d761368-405c-4e31-ab43-47d4afe6b6e2","Type":"ContainerStarted","Data":"18756b0cb46b97616366fd7a3dd057f61da2c1ba6c383ba1b0454a6488e2de7b"}
Apr 20 17:56:05.214082 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:05.214030 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7"
Apr 20 17:56:05.232712 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:05.232672 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7" podStartSLOduration=1.735576266 podStartE2EDuration="4.232660184s" podCreationTimestamp="2026-04-20 17:56:01 +0000 UTC" firstStartedPulling="2026-04-20 17:56:02.301569926 +0000 UTC m=+464.086925143" lastFinishedPulling="2026-04-20 17:56:04.798653833 +0000 UTC m=+466.584009061" observedRunningTime="2026-04-20 17:56:05.231718498 +0000 UTC m=+467.017073740" watchObservedRunningTime="2026-04-20 17:56:05.232660184 +0000 UTC m=+467.018015422"
Apr 20 17:56:06.696915 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.696876 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6"]
Apr 20 17:56:06.697399 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.697346 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba" containerName="util"
Apr 20 17:56:06.697399 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.697365 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba" containerName="util"
Apr 20 17:56:06.697399 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.697391 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba" containerName="pull"
Apr 20 17:56:06.697399 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.697399 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba" containerName="pull"
Apr 20 17:56:06.697597 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.697411 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba" containerName="extract"
Apr 20 17:56:06.697597 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.697421 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba" containerName="extract"
Apr 20 17:56:06.697597 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.697518 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ac6a081-f22a-4c1b-a95d-d5f3ae2851ba" containerName="extract"
Apr 20 17:56:06.700837 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.700816 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6"
Apr 20 17:56:06.703232 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.703209 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 20 17:56:06.703305 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.703287 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 20 17:56:06.704185 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.704166 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 20 17:56:06.704294 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.704186 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-sgp5t\""
Apr 20 17:56:06.704294 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.704198 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 20 17:56:06.704414 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.704350 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 20 17:56:06.712889 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.712869 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6"]
Apr 20 17:56:06.803704 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.803676 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrjtd\" (UniqueName: \"kubernetes.io/projected/c78dcf0a-c37b-40b4-a0de-c5f4a69890e2-kube-api-access-zrjtd\") pod \"lws-controller-manager-7589d7b74d-nkcb6\" (UID: \"c78dcf0a-c37b-40b4-a0de-c5f4a69890e2\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6"
Apr 20 17:56:06.803829 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.803721 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c78dcf0a-c37b-40b4-a0de-c5f4a69890e2-cert\") pod \"lws-controller-manager-7589d7b74d-nkcb6\" (UID: \"c78dcf0a-c37b-40b4-a0de-c5f4a69890e2\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6"
Apr 20 17:56:06.803829 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.803783 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/c78dcf0a-c37b-40b4-a0de-c5f4a69890e2-metrics-cert\") pod \"lws-controller-manager-7589d7b74d-nkcb6\" (UID: \"c78dcf0a-c37b-40b4-a0de-c5f4a69890e2\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6"
Apr 20 17:56:06.803829 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.803825 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/c78dcf0a-c37b-40b4-a0de-c5f4a69890e2-manager-config\") pod \"lws-controller-manager-7589d7b74d-nkcb6\" (UID: \"c78dcf0a-c37b-40b4-a0de-c5f4a69890e2\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6"
Apr 20 17:56:06.905091 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.905049 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrjtd\" (UniqueName: \"kubernetes.io/projected/c78dcf0a-c37b-40b4-a0de-c5f4a69890e2-kube-api-access-zrjtd\") pod \"lws-controller-manager-7589d7b74d-nkcb6\" (UID: \"c78dcf0a-c37b-40b4-a0de-c5f4a69890e2\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6"
Apr 20 17:56:06.905290 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.905117 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c78dcf0a-c37b-40b4-a0de-c5f4a69890e2-cert\") pod \"lws-controller-manager-7589d7b74d-nkcb6\" (UID: \"c78dcf0a-c37b-40b4-a0de-c5f4a69890e2\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6"
Apr 20 17:56:06.905290 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.905157 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/c78dcf0a-c37b-40b4-a0de-c5f4a69890e2-metrics-cert\") pod \"lws-controller-manager-7589d7b74d-nkcb6\" (UID: \"c78dcf0a-c37b-40b4-a0de-c5f4a69890e2\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6"
Apr 20 17:56:06.905290 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.905190 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/c78dcf0a-c37b-40b4-a0de-c5f4a69890e2-manager-config\") pod \"lws-controller-manager-7589d7b74d-nkcb6\" (UID: \"c78dcf0a-c37b-40b4-a0de-c5f4a69890e2\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6"
Apr 20 17:56:06.905796 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.905763 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/c78dcf0a-c37b-40b4-a0de-c5f4a69890e2-manager-config\") pod \"lws-controller-manager-7589d7b74d-nkcb6\" (UID: \"c78dcf0a-c37b-40b4-a0de-c5f4a69890e2\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6"
Apr 20 17:56:06.907640 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.907618 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/c78dcf0a-c37b-40b4-a0de-c5f4a69890e2-metrics-cert\") pod \"lws-controller-manager-7589d7b74d-nkcb6\" (UID: \"c78dcf0a-c37b-40b4-a0de-c5f4a69890e2\") "
pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6" Apr 20 17:56:06.907735 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.907662 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c78dcf0a-c37b-40b4-a0de-c5f4a69890e2-cert\") pod \"lws-controller-manager-7589d7b74d-nkcb6\" (UID: \"c78dcf0a-c37b-40b4-a0de-c5f4a69890e2\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6" Apr 20 17:56:06.922248 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:06.922224 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrjtd\" (UniqueName: \"kubernetes.io/projected/c78dcf0a-c37b-40b4-a0de-c5f4a69890e2-kube-api-access-zrjtd\") pod \"lws-controller-manager-7589d7b74d-nkcb6\" (UID: \"c78dcf0a-c37b-40b4-a0de-c5f4a69890e2\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6" Apr 20 17:56:07.010771 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:07.010741 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6" Apr 20 17:56:07.154399 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:07.154316 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6"] Apr 20 17:56:07.156716 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:56:07.156682 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc78dcf0a_c37b_40b4_a0de_c5f4a69890e2.slice/crio-67b9ae090ee13ae7ade382418073a4db483d1d721f940424c89235ad2dcf9856 WatchSource:0}: Error finding container 67b9ae090ee13ae7ade382418073a4db483d1d721f940424c89235ad2dcf9856: Status 404 returned error can't find the container with id 67b9ae090ee13ae7ade382418073a4db483d1d721f940424c89235ad2dcf9856 Apr 20 17:56:07.221968 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:07.221933 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6" event={"ID":"c78dcf0a-c37b-40b4-a0de-c5f4a69890e2","Type":"ContainerStarted","Data":"67b9ae090ee13ae7ade382418073a4db483d1d721f940424c89235ad2dcf9856"} Apr 20 17:56:09.231172 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:09.231130 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6" event={"ID":"c78dcf0a-c37b-40b4-a0de-c5f4a69890e2","Type":"ContainerStarted","Data":"ba6fc607c21df8ef7ce6a412a80b04ea7471b93910bb41615023feb63b6a7657"} Apr 20 17:56:09.231540 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:09.231258 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6" Apr 20 17:56:09.250090 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:09.250034 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6" podStartSLOduration=1.7287897939999999 podStartE2EDuration="3.250018543s" podCreationTimestamp="2026-04-20 17:56:06 +0000 UTC" firstStartedPulling="2026-04-20 17:56:07.158509 +0000 UTC m=+468.943864217" lastFinishedPulling="2026-04-20 17:56:08.679737745 +0000 UTC m=+470.465092966" observedRunningTime="2026-04-20 17:56:09.249972692 +0000 UTC m=+471.035327954" watchObservedRunningTime="2026-04-20 17:56:09.250018543 +0000 UTC m=+471.035373783" Apr 20 17:56:16.219486 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:16.219454 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-lhhn7" Apr 20 17:56:18.439821 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.439788 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq"] Apr 20 17:56:18.443754 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.443736 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq" Apr 20 17:56:18.446362 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.446323 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 17:56:18.446498 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.446452 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 17:56:18.447205 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.447188 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ds9v2\"" Apr 20 17:56:18.460814 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.460789 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq"] Apr 20 17:56:18.502977 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.502950 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz54m\" (UniqueName: \"kubernetes.io/projected/4972f5e2-d346-4ff3-a5fb-df62635dfa2c-kube-api-access-kz54m\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq\" (UID: \"4972f5e2-d346-4ff3-a5fb-df62635dfa2c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq" Apr 20 17:56:18.503132 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.503021 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4972f5e2-d346-4ff3-a5fb-df62635dfa2c-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq\" (UID: \"4972f5e2-d346-4ff3-a5fb-df62635dfa2c\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq" Apr 20 17:56:18.503132 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.503040 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4972f5e2-d346-4ff3-a5fb-df62635dfa2c-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq\" (UID: \"4972f5e2-d346-4ff3-a5fb-df62635dfa2c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq" Apr 20 17:56:18.604390 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.604348 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kz54m\" (UniqueName: \"kubernetes.io/projected/4972f5e2-d346-4ff3-a5fb-df62635dfa2c-kube-api-access-kz54m\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq\" (UID: \"4972f5e2-d346-4ff3-a5fb-df62635dfa2c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq" Apr 20 17:56:18.604557 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.604435 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4972f5e2-d346-4ff3-a5fb-df62635dfa2c-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq\" (UID: \"4972f5e2-d346-4ff3-a5fb-df62635dfa2c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq" Apr 20 17:56:18.604557 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.604462 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4972f5e2-d346-4ff3-a5fb-df62635dfa2c-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq\" (UID: \"4972f5e2-d346-4ff3-a5fb-df62635dfa2c\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq" Apr 20 17:56:18.604835 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.604817 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4972f5e2-d346-4ff3-a5fb-df62635dfa2c-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq\" (UID: \"4972f5e2-d346-4ff3-a5fb-df62635dfa2c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq" Apr 20 17:56:18.604908 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.604889 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4972f5e2-d346-4ff3-a5fb-df62635dfa2c-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq\" (UID: \"4972f5e2-d346-4ff3-a5fb-df62635dfa2c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq" Apr 20 17:56:18.612602 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.612582 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 17:56:18.623124 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.623104 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 17:56:18.633641 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.633617 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz54m\" (UniqueName: \"kubernetes.io/projected/4972f5e2-d346-4ff3-a5fb-df62635dfa2c-kube-api-access-kz54m\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq\" (UID: \"4972f5e2-d346-4ff3-a5fb-df62635dfa2c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq" Apr 20 17:56:18.755179 ip-10-0-135-49 kubenswrapper[2577]: 
I0420 17:56:18.755153 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ds9v2\"" Apr 20 17:56:18.763081 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.763051 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq" Apr 20 17:56:18.905188 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:18.905158 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq"] Apr 20 17:56:18.907064 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:56:18.907035 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4972f5e2_d346_4ff3_a5fb_df62635dfa2c.slice/crio-486d2f5fe0a76894bfb98dc397a92296c65acc03de6c70ce6170040d987f3d31 WatchSource:0}: Error finding container 486d2f5fe0a76894bfb98dc397a92296c65acc03de6c70ce6170040d987f3d31: Status 404 returned error can't find the container with id 486d2f5fe0a76894bfb98dc397a92296c65acc03de6c70ce6170040d987f3d31 Apr 20 17:56:19.008277 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.008202 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-554dd5dd7d-qnpw7"] Apr 20 17:56:19.011821 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.011798 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-qnpw7" Apr 20 17:56:19.014182 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.014134 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-mfcz2\"" Apr 20 17:56:19.014584 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.014532 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 17:56:19.014742 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.014594 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 17:56:19.014742 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.014598 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 17:56:19.014906 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.014893 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 17:56:19.023611 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.023585 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-554dd5dd7d-qnpw7"] Apr 20 17:56:19.109750 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.109718 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxqr4\" (UniqueName: \"kubernetes.io/projected/57666fdc-66c3-46aa-b04d-f5251dea0b08-kube-api-access-hxqr4\") pod \"kube-auth-proxy-554dd5dd7d-qnpw7\" (UID: \"57666fdc-66c3-46aa-b04d-f5251dea0b08\") " pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-qnpw7" Apr 20 17:56:19.109923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.109765 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/57666fdc-66c3-46aa-b04d-f5251dea0b08-tmp\") pod \"kube-auth-proxy-554dd5dd7d-qnpw7\" (UID: \"57666fdc-66c3-46aa-b04d-f5251dea0b08\") " pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-qnpw7" Apr 20 17:56:19.109923 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.109866 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/57666fdc-66c3-46aa-b04d-f5251dea0b08-tls-certs\") pod \"kube-auth-proxy-554dd5dd7d-qnpw7\" (UID: \"57666fdc-66c3-46aa-b04d-f5251dea0b08\") " pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-qnpw7" Apr 20 17:56:19.210435 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.210400 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxqr4\" (UniqueName: \"kubernetes.io/projected/57666fdc-66c3-46aa-b04d-f5251dea0b08-kube-api-access-hxqr4\") pod \"kube-auth-proxy-554dd5dd7d-qnpw7\" (UID: \"57666fdc-66c3-46aa-b04d-f5251dea0b08\") " pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-qnpw7" Apr 20 17:56:19.210642 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.210454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/57666fdc-66c3-46aa-b04d-f5251dea0b08-tmp\") pod \"kube-auth-proxy-554dd5dd7d-qnpw7\" (UID: \"57666fdc-66c3-46aa-b04d-f5251dea0b08\") " pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-qnpw7" Apr 20 17:56:19.210642 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.210522 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/57666fdc-66c3-46aa-b04d-f5251dea0b08-tls-certs\") pod \"kube-auth-proxy-554dd5dd7d-qnpw7\" (UID: \"57666fdc-66c3-46aa-b04d-f5251dea0b08\") " pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-qnpw7" Apr 20 17:56:19.212697 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.212673 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/57666fdc-66c3-46aa-b04d-f5251dea0b08-tmp\") pod \"kube-auth-proxy-554dd5dd7d-qnpw7\" (UID: \"57666fdc-66c3-46aa-b04d-f5251dea0b08\") " pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-qnpw7" Apr 20 17:56:19.213011 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.212973 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/57666fdc-66c3-46aa-b04d-f5251dea0b08-tls-certs\") pod \"kube-auth-proxy-554dd5dd7d-qnpw7\" (UID: \"57666fdc-66c3-46aa-b04d-f5251dea0b08\") " pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-qnpw7" Apr 20 17:56:19.219498 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.219475 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxqr4\" (UniqueName: \"kubernetes.io/projected/57666fdc-66c3-46aa-b04d-f5251dea0b08-kube-api-access-hxqr4\") pod \"kube-auth-proxy-554dd5dd7d-qnpw7\" (UID: \"57666fdc-66c3-46aa-b04d-f5251dea0b08\") " pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-qnpw7" Apr 20 17:56:19.265312 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.265218 2577 generic.go:358] "Generic (PLEG): container finished" podID="4972f5e2-d346-4ff3-a5fb-df62635dfa2c" containerID="36b294e9a334490896425552e24e53c91dea63ec5974247fe27854ee657ed47a" exitCode=0 Apr 20 17:56:19.265312 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.265291 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq" event={"ID":"4972f5e2-d346-4ff3-a5fb-df62635dfa2c","Type":"ContainerDied","Data":"36b294e9a334490896425552e24e53c91dea63ec5974247fe27854ee657ed47a"} Apr 20 17:56:19.265518 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.265322 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq" event={"ID":"4972f5e2-d346-4ff3-a5fb-df62635dfa2c","Type":"ContainerStarted","Data":"486d2f5fe0a76894bfb98dc397a92296c65acc03de6c70ce6170040d987f3d31"} Apr 20 17:56:19.323558 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.323531 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-qnpw7" Apr 20 17:56:19.455246 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:19.455215 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-554dd5dd7d-qnpw7"] Apr 20 17:56:19.456327 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:56:19.456298 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57666fdc_66c3_46aa_b04d_f5251dea0b08.slice/crio-3ff233976d3b6fb5c152eccd205f2b9b3c243317b45001343ed015eb07c09f25 WatchSource:0}: Error finding container 3ff233976d3b6fb5c152eccd205f2b9b3c243317b45001343ed015eb07c09f25: Status 404 returned error can't find the container with id 3ff233976d3b6fb5c152eccd205f2b9b3c243317b45001343ed015eb07c09f25 Apr 20 17:56:20.238222 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:20.237588 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-nkcb6" Apr 20 17:56:20.273862 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:20.273827 2577 generic.go:358] "Generic (PLEG): container finished" podID="4972f5e2-d346-4ff3-a5fb-df62635dfa2c" containerID="2bc192d6aa5c15758d4bb3758a64533cfc19975cf27eb49aea3ca1c2f6b9263b" exitCode=0 Apr 20 17:56:20.274068 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:20.273923 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq" 
event={"ID":"4972f5e2-d346-4ff3-a5fb-df62635dfa2c","Type":"ContainerDied","Data":"2bc192d6aa5c15758d4bb3758a64533cfc19975cf27eb49aea3ca1c2f6b9263b"} Apr 20 17:56:20.278279 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:20.278259 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-qnpw7" event={"ID":"57666fdc-66c3-46aa-b04d-f5251dea0b08","Type":"ContainerStarted","Data":"3ff233976d3b6fb5c152eccd205f2b9b3c243317b45001343ed015eb07c09f25"} Apr 20 17:56:21.285457 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:21.285420 2577 generic.go:358] "Generic (PLEG): container finished" podID="4972f5e2-d346-4ff3-a5fb-df62635dfa2c" containerID="1929575644f8a28ad3b93aa3e05e83163a9c56f10a176b06c1efc6b1e9466370" exitCode=0 Apr 20 17:56:21.285913 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:21.285563 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq" event={"ID":"4972f5e2-d346-4ff3-a5fb-df62635dfa2c","Type":"ContainerDied","Data":"1929575644f8a28ad3b93aa3e05e83163a9c56f10a176b06c1efc6b1e9466370"} Apr 20 17:56:22.546054 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:22.545969 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq" Apr 20 17:56:22.643439 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:22.642973 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4972f5e2-d346-4ff3-a5fb-df62635dfa2c-util\") pod \"4972f5e2-d346-4ff3-a5fb-df62635dfa2c\" (UID: \"4972f5e2-d346-4ff3-a5fb-df62635dfa2c\") " Apr 20 17:56:22.643439 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:22.643088 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4972f5e2-d346-4ff3-a5fb-df62635dfa2c-bundle\") pod \"4972f5e2-d346-4ff3-a5fb-df62635dfa2c\" (UID: \"4972f5e2-d346-4ff3-a5fb-df62635dfa2c\") " Apr 20 17:56:22.643439 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:22.643169 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz54m\" (UniqueName: \"kubernetes.io/projected/4972f5e2-d346-4ff3-a5fb-df62635dfa2c-kube-api-access-kz54m\") pod \"4972f5e2-d346-4ff3-a5fb-df62635dfa2c\" (UID: \"4972f5e2-d346-4ff3-a5fb-df62635dfa2c\") " Apr 20 17:56:22.644241 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:22.644207 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4972f5e2-d346-4ff3-a5fb-df62635dfa2c-bundle" (OuterVolumeSpecName: "bundle") pod "4972f5e2-d346-4ff3-a5fb-df62635dfa2c" (UID: "4972f5e2-d346-4ff3-a5fb-df62635dfa2c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:56:22.645467 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:22.645433 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4972f5e2-d346-4ff3-a5fb-df62635dfa2c-kube-api-access-kz54m" (OuterVolumeSpecName: "kube-api-access-kz54m") pod "4972f5e2-d346-4ff3-a5fb-df62635dfa2c" (UID: "4972f5e2-d346-4ff3-a5fb-df62635dfa2c"). InnerVolumeSpecName "kube-api-access-kz54m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:56:22.651878 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:22.651847 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4972f5e2-d346-4ff3-a5fb-df62635dfa2c-util" (OuterVolumeSpecName: "util") pod "4972f5e2-d346-4ff3-a5fb-df62635dfa2c" (UID: "4972f5e2-d346-4ff3-a5fb-df62635dfa2c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:56:22.744280 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:22.744254 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4972f5e2-d346-4ff3-a5fb-df62635dfa2c-util\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:56:22.744280 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:22.744279 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4972f5e2-d346-4ff3-a5fb-df62635dfa2c-bundle\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:56:22.744447 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:22.744308 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kz54m\" (UniqueName: \"kubernetes.io/projected/4972f5e2-d346-4ff3-a5fb-df62635dfa2c-kube-api-access-kz54m\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:56:23.294409 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:23.294378 2577 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq" event={"ID":"4972f5e2-d346-4ff3-a5fb-df62635dfa2c","Type":"ContainerDied","Data":"486d2f5fe0a76894bfb98dc397a92296c65acc03de6c70ce6170040d987f3d31"} Apr 20 17:56:23.294409 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:23.294414 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="486d2f5fe0a76894bfb98dc397a92296c65acc03de6c70ce6170040d987f3d31" Apr 20 17:56:23.294606 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:23.294423 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t9wtq" Apr 20 17:56:23.295794 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:23.295768 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-qnpw7" event={"ID":"57666fdc-66c3-46aa-b04d-f5251dea0b08","Type":"ContainerStarted","Data":"044c68d7d802fd3349d99a53546b557c036f3b7eda1f86d9aa7e82d9edc49a2f"} Apr 20 17:56:23.312338 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:23.312298 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-qnpw7" podStartSLOduration=2.187405912 podStartE2EDuration="5.312286304s" podCreationTimestamp="2026-04-20 17:56:18 +0000 UTC" firstStartedPulling="2026-04-20 17:56:19.45815741 +0000 UTC m=+481.243512628" lastFinishedPulling="2026-04-20 17:56:22.583037802 +0000 UTC m=+484.368393020" observedRunningTime="2026-04-20 17:56:23.311257163 +0000 UTC m=+485.096612393" watchObservedRunningTime="2026-04-20 17:56:23.312286304 +0000 UTC m=+485.097641543" Apr 20 17:56:32.881913 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:32.881880 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7"] Apr 20 17:56:32.882293 
ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:32.882225 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4972f5e2-d346-4ff3-a5fb-df62635dfa2c" containerName="util"
Apr 20 17:56:32.882293 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:32.882237 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4972f5e2-d346-4ff3-a5fb-df62635dfa2c" containerName="util"
Apr 20 17:56:32.882293 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:32.882247 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4972f5e2-d346-4ff3-a5fb-df62635dfa2c" containerName="extract"
Apr 20 17:56:32.882293 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:32.882252 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4972f5e2-d346-4ff3-a5fb-df62635dfa2c" containerName="extract"
Apr 20 17:56:32.882293 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:32.882271 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4972f5e2-d346-4ff3-a5fb-df62635dfa2c" containerName="pull"
Apr 20 17:56:32.882293 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:32.882277 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4972f5e2-d346-4ff3-a5fb-df62635dfa2c" containerName="pull"
Apr 20 17:56:32.882489 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:32.882329 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4972f5e2-d346-4ff3-a5fb-df62635dfa2c" containerName="extract"
Apr 20 17:56:32.887000 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:32.886973 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7"
Apr 20 17:56:32.890243 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:32.890222 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 17:56:32.890416 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:32.890404 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ds9v2\""
Apr 20 17:56:32.891516 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:32.891502 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 17:56:32.917723 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:32.917695 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7"]
Apr 20 17:56:33.039075 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:33.039040 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f723f7ce-e870-440e-82d3-b13d1d334569-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7\" (UID: \"f723f7ce-e870-440e-82d3-b13d1d334569\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7"
Apr 20 17:56:33.039251 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:33.039117 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f723f7ce-e870-440e-82d3-b13d1d334569-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7\" (UID: \"f723f7ce-e870-440e-82d3-b13d1d334569\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7"
Apr 20 17:56:33.039251 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:33.039142 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pgvq\" (UniqueName: \"kubernetes.io/projected/f723f7ce-e870-440e-82d3-b13d1d334569-kube-api-access-4pgvq\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7\" (UID: \"f723f7ce-e870-440e-82d3-b13d1d334569\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7"
Apr 20 17:56:33.139771 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:33.139687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f723f7ce-e870-440e-82d3-b13d1d334569-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7\" (UID: \"f723f7ce-e870-440e-82d3-b13d1d334569\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7"
Apr 20 17:56:33.139771 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:33.139754 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f723f7ce-e870-440e-82d3-b13d1d334569-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7\" (UID: \"f723f7ce-e870-440e-82d3-b13d1d334569\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7"
Apr 20 17:56:33.139956 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:33.139795 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pgvq\" (UniqueName: \"kubernetes.io/projected/f723f7ce-e870-440e-82d3-b13d1d334569-kube-api-access-4pgvq\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7\" (UID: \"f723f7ce-e870-440e-82d3-b13d1d334569\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7"
Apr 20 17:56:33.140114 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:33.140094 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f723f7ce-e870-440e-82d3-b13d1d334569-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7\" (UID: \"f723f7ce-e870-440e-82d3-b13d1d334569\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7"
Apr 20 17:56:33.140224 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:33.140206 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f723f7ce-e870-440e-82d3-b13d1d334569-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7\" (UID: \"f723f7ce-e870-440e-82d3-b13d1d334569\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7"
Apr 20 17:56:33.160378 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:33.160344 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pgvq\" (UniqueName: \"kubernetes.io/projected/f723f7ce-e870-440e-82d3-b13d1d334569-kube-api-access-4pgvq\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7\" (UID: \"f723f7ce-e870-440e-82d3-b13d1d334569\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7"
Apr 20 17:56:33.196153 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:33.196119 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7"
Apr 20 17:56:33.362534 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:33.362508 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7"]
Apr 20 17:56:33.365274 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:56:33.365233 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf723f7ce_e870_440e_82d3_b13d1d334569.slice/crio-c4503b8791e2005f3df0e48af0767669cdcb599cf5a77a11a296d5ed07013881 WatchSource:0}: Error finding container c4503b8791e2005f3df0e48af0767669cdcb599cf5a77a11a296d5ed07013881: Status 404 returned error can't find the container with id c4503b8791e2005f3df0e48af0767669cdcb599cf5a77a11a296d5ed07013881
Apr 20 17:56:34.352097 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:34.352062 2577 generic.go:358] "Generic (PLEG): container finished" podID="f723f7ce-e870-440e-82d3-b13d1d334569" containerID="9126fe2369febfe67a1dd8e54b3dcbf56cdf7b500e2a851b6e9e9fe89a98c08e" exitCode=0
Apr 20 17:56:34.352570 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:34.352133 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7" event={"ID":"f723f7ce-e870-440e-82d3-b13d1d334569","Type":"ContainerDied","Data":"9126fe2369febfe67a1dd8e54b3dcbf56cdf7b500e2a851b6e9e9fe89a98c08e"}
Apr 20 17:56:34.352570 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:34.352156 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7" event={"ID":"f723f7ce-e870-440e-82d3-b13d1d334569","Type":"ContainerStarted","Data":"c4503b8791e2005f3df0e48af0767669cdcb599cf5a77a11a296d5ed07013881"}
Apr 20 17:56:36.364487 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:36.364451 2577 generic.go:358] "Generic (PLEG): container finished" podID="f723f7ce-e870-440e-82d3-b13d1d334569" containerID="71a579da60f0b437a18fe653e572f7d46db0ebeda0bf6f2a39069935e9116f05" exitCode=0
Apr 20 17:56:36.364881 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:36.364504 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7" event={"ID":"f723f7ce-e870-440e-82d3-b13d1d334569","Type":"ContainerDied","Data":"71a579da60f0b437a18fe653e572f7d46db0ebeda0bf6f2a39069935e9116f05"}
Apr 20 17:56:37.370068 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:37.370017 2577 generic.go:358] "Generic (PLEG): container finished" podID="f723f7ce-e870-440e-82d3-b13d1d334569" containerID="2adee059b0ecbd4443e074b5f12a77e8dfff9d93422aa3ba916587c5ea6b9d52" exitCode=0
Apr 20 17:56:37.370458 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:37.370099 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7" event={"ID":"f723f7ce-e870-440e-82d3-b13d1d334569","Type":"ContainerDied","Data":"2adee059b0ecbd4443e074b5f12a77e8dfff9d93422aa3ba916587c5ea6b9d52"}
Apr 20 17:56:38.499184 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:38.499158 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7"
Apr 20 17:56:38.589744 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:38.589712 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f723f7ce-e870-440e-82d3-b13d1d334569-util\") pod \"f723f7ce-e870-440e-82d3-b13d1d334569\" (UID: \"f723f7ce-e870-440e-82d3-b13d1d334569\") "
Apr 20 17:56:38.589924 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:38.589795 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pgvq\" (UniqueName: \"kubernetes.io/projected/f723f7ce-e870-440e-82d3-b13d1d334569-kube-api-access-4pgvq\") pod \"f723f7ce-e870-440e-82d3-b13d1d334569\" (UID: \"f723f7ce-e870-440e-82d3-b13d1d334569\") "
Apr 20 17:56:38.589924 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:38.589851 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f723f7ce-e870-440e-82d3-b13d1d334569-bundle\") pod \"f723f7ce-e870-440e-82d3-b13d1d334569\" (UID: \"f723f7ce-e870-440e-82d3-b13d1d334569\") "
Apr 20 17:56:38.590926 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:38.590896 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f723f7ce-e870-440e-82d3-b13d1d334569-bundle" (OuterVolumeSpecName: "bundle") pod "f723f7ce-e870-440e-82d3-b13d1d334569" (UID: "f723f7ce-e870-440e-82d3-b13d1d334569"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 17:56:38.591919 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:38.591898 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f723f7ce-e870-440e-82d3-b13d1d334569-kube-api-access-4pgvq" (OuterVolumeSpecName: "kube-api-access-4pgvq") pod "f723f7ce-e870-440e-82d3-b13d1d334569" (UID: "f723f7ce-e870-440e-82d3-b13d1d334569"). InnerVolumeSpecName "kube-api-access-4pgvq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 17:56:38.595231 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:38.595210 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f723f7ce-e870-440e-82d3-b13d1d334569-util" (OuterVolumeSpecName: "util") pod "f723f7ce-e870-440e-82d3-b13d1d334569" (UID: "f723f7ce-e870-440e-82d3-b13d1d334569"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 17:56:38.690645 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:38.690555 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f723f7ce-e870-440e-82d3-b13d1d334569-bundle\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:56:38.690645 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:38.690587 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f723f7ce-e870-440e-82d3-b13d1d334569-util\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:56:38.690645 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:38.690596 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4pgvq\" (UniqueName: \"kubernetes.io/projected/f723f7ce-e870-440e-82d3-b13d1d334569-kube-api-access-4pgvq\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\""
Apr 20 17:56:39.379734 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:39.379699 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7" event={"ID":"f723f7ce-e870-440e-82d3-b13d1d334569","Type":"ContainerDied","Data":"c4503b8791e2005f3df0e48af0767669cdcb599cf5a77a11a296d5ed07013881"}
Apr 20 17:56:39.379734 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:39.379736 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4503b8791e2005f3df0e48af0767669cdcb599cf5a77a11a296d5ed07013881"
Apr 20 17:56:39.379734 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:56:39.379736 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qjxp7"
Apr 20 17:57:34.847408 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:34.847330 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd"]
Apr 20 17:57:34.847970 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:34.847867 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f723f7ce-e870-440e-82d3-b13d1d334569" containerName="extract"
Apr 20 17:57:34.847970 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:34.847888 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f723f7ce-e870-440e-82d3-b13d1d334569" containerName="extract"
Apr 20 17:57:34.847970 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:34.847922 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f723f7ce-e870-440e-82d3-b13d1d334569" containerName="pull"
Apr 20 17:57:34.847970 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:34.847931 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f723f7ce-e870-440e-82d3-b13d1d334569" containerName="pull"
Apr 20 17:57:34.847970 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:34.847949 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f723f7ce-e870-440e-82d3-b13d1d334569" containerName="util"
Apr 20 17:57:34.847970 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:34.847957 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f723f7ce-e870-440e-82d3-b13d1d334569" containerName="util"
Apr 20 17:57:34.848326 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:34.848067 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f723f7ce-e870-440e-82d3-b13d1d334569" containerName="extract"
Apr 20 17:57:34.851300 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:34.851280 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd"
Apr 20 17:57:34.853521 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:34.853500 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 20 17:57:34.853648 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:34.853591 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 20 17:57:34.854374 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:34.854360 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-tbdkf\""
Apr 20 17:57:34.859154 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:34.858565 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd"]
Apr 20 17:57:34.964494 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:34.964450 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a395cb58-561d-4a66-a684-4e96d4cb3db5-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd\" (UID: \"a395cb58-561d-4a66-a684-4e96d4cb3db5\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd"
Apr 20 17:57:34.964683 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:34.964508 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a395cb58-561d-4a66-a684-4e96d4cb3db5-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd\" (UID: \"a395cb58-561d-4a66-a684-4e96d4cb3db5\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd"
Apr 20 17:57:34.964683 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:34.964548 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqp2b\" (UniqueName: \"kubernetes.io/projected/a395cb58-561d-4a66-a684-4e96d4cb3db5-kube-api-access-pqp2b\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd\" (UID: \"a395cb58-561d-4a66-a684-4e96d4cb3db5\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd"
Apr 20 17:57:35.065496 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.065461 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a395cb58-561d-4a66-a684-4e96d4cb3db5-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd\" (UID: \"a395cb58-561d-4a66-a684-4e96d4cb3db5\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd"
Apr 20 17:57:35.065688 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.065504 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a395cb58-561d-4a66-a684-4e96d4cb3db5-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd\" (UID: \"a395cb58-561d-4a66-a684-4e96d4cb3db5\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd"
Apr 20 17:57:35.065688 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.065532 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqp2b\" (UniqueName: \"kubernetes.io/projected/a395cb58-561d-4a66-a684-4e96d4cb3db5-kube-api-access-pqp2b\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd\" (UID: \"a395cb58-561d-4a66-a684-4e96d4cb3db5\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd"
Apr 20 17:57:35.065839 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.065817 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a395cb58-561d-4a66-a684-4e96d4cb3db5-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd\" (UID: \"a395cb58-561d-4a66-a684-4e96d4cb3db5\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd"
Apr 20 17:57:35.065933 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.065912 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a395cb58-561d-4a66-a684-4e96d4cb3db5-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd\" (UID: \"a395cb58-561d-4a66-a684-4e96d4cb3db5\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd"
Apr 20 17:57:35.073884 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.073854 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqp2b\" (UniqueName: \"kubernetes.io/projected/a395cb58-561d-4a66-a684-4e96d4cb3db5-kube-api-access-pqp2b\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd\" (UID: \"a395cb58-561d-4a66-a684-4e96d4cb3db5\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd"
Apr 20 17:57:35.160847 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.160767 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd"
Apr 20 17:57:35.304656 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.304633 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd"]
Apr 20 17:57:35.307357 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:57:35.307329 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda395cb58_561d_4a66_a684_4e96d4cb3db5.slice/crio-40c77248b54c30ce2ce074ca5abb9d6e61efbc846fb5797b7788b2e2eea14c20 WatchSource:0}: Error finding container 40c77248b54c30ce2ce074ca5abb9d6e61efbc846fb5797b7788b2e2eea14c20: Status 404 returned error can't find the container with id 40c77248b54c30ce2ce074ca5abb9d6e61efbc846fb5797b7788b2e2eea14c20
Apr 20 17:57:35.453794 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.453766 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb"]
Apr 20 17:57:35.457358 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.457343 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb"
Apr 20 17:57:35.464700 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.464676 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb"]
Apr 20 17:57:35.588363 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.588334 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlvkt\" (UniqueName: \"kubernetes.io/projected/5328da85-247b-4778-89e8-12410591e7f3-kube-api-access-tlvkt\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb\" (UID: \"5328da85-247b-4778-89e8-12410591e7f3\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb"
Apr 20 17:57:35.588536 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.588413 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5328da85-247b-4778-89e8-12410591e7f3-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb\" (UID: \"5328da85-247b-4778-89e8-12410591e7f3\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb"
Apr 20 17:57:35.588536 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.588469 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5328da85-247b-4778-89e8-12410591e7f3-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb\" (UID: \"5328da85-247b-4778-89e8-12410591e7f3\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb"
Apr 20 17:57:35.590812 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.590785 2577 generic.go:358] "Generic (PLEG): container finished" podID="a395cb58-561d-4a66-a684-4e96d4cb3db5" containerID="4a8384d8f49f9798e30c76fb5c11d700b5e4fce7bced100e0d9c3e0841e69d14" exitCode=0
Apr 20 17:57:35.590925 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.590872 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd" event={"ID":"a395cb58-561d-4a66-a684-4e96d4cb3db5","Type":"ContainerDied","Data":"4a8384d8f49f9798e30c76fb5c11d700b5e4fce7bced100e0d9c3e0841e69d14"}
Apr 20 17:57:35.590925 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.590901 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd" event={"ID":"a395cb58-561d-4a66-a684-4e96d4cb3db5","Type":"ContainerStarted","Data":"40c77248b54c30ce2ce074ca5abb9d6e61efbc846fb5797b7788b2e2eea14c20"}
Apr 20 17:57:35.689678 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.689578 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5328da85-247b-4778-89e8-12410591e7f3-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb\" (UID: \"5328da85-247b-4778-89e8-12410591e7f3\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb"
Apr 20 17:57:35.689678 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.689628 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5328da85-247b-4778-89e8-12410591e7f3-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb\" (UID: \"5328da85-247b-4778-89e8-12410591e7f3\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb"
Apr 20 17:57:35.689678 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.689679 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlvkt\" (UniqueName: \"kubernetes.io/projected/5328da85-247b-4778-89e8-12410591e7f3-kube-api-access-tlvkt\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb\" (UID: \"5328da85-247b-4778-89e8-12410591e7f3\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb"
Apr 20 17:57:35.689966 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.689953 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5328da85-247b-4778-89e8-12410591e7f3-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb\" (UID: \"5328da85-247b-4778-89e8-12410591e7f3\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb"
Apr 20 17:57:35.690072 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.690057 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5328da85-247b-4778-89e8-12410591e7f3-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb\" (UID: \"5328da85-247b-4778-89e8-12410591e7f3\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb"
Apr 20 17:57:35.698247 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.698228 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlvkt\" (UniqueName: \"kubernetes.io/projected/5328da85-247b-4778-89e8-12410591e7f3-kube-api-access-tlvkt\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb\" (UID: \"5328da85-247b-4778-89e8-12410591e7f3\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb"
Apr 20 17:57:35.775444 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:35.775411 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb"
Apr 20 17:57:36.052268 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.052236 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r"]
Apr 20 17:57:36.057033 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.057016 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r"
Apr 20 17:57:36.064748 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.064718 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r"]
Apr 20 17:57:36.104082 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.104054 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb"]
Apr 20 17:57:36.105642 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:57:36.105602 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5328da85_247b_4778_89e8_12410591e7f3.slice/crio-036ef7d22637027a55413950232e119fc40bcd6c19c24a05d04d11977534501b WatchSource:0}: Error finding container 036ef7d22637027a55413950232e119fc40bcd6c19c24a05d04d11977534501b: Status 404 returned error can't find the container with id 036ef7d22637027a55413950232e119fc40bcd6c19c24a05d04d11977534501b
Apr 20 17:57:36.194174 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.194138 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-727kp\" (UniqueName: \"kubernetes.io/projected/6c1b2d85-15bd-467f-8571-646e3db81e8a-kube-api-access-727kp\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r\" (UID: \"6c1b2d85-15bd-467f-8571-646e3db81e8a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r"
Apr 20 17:57:36.194349 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.194192 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c1b2d85-15bd-467f-8571-646e3db81e8a-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r\" (UID: \"6c1b2d85-15bd-467f-8571-646e3db81e8a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r"
Apr 20 17:57:36.194349 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.194237 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c1b2d85-15bd-467f-8571-646e3db81e8a-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r\" (UID: \"6c1b2d85-15bd-467f-8571-646e3db81e8a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r"
Apr 20 17:57:36.295529 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.295491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-727kp\" (UniqueName: \"kubernetes.io/projected/6c1b2d85-15bd-467f-8571-646e3db81e8a-kube-api-access-727kp\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r\" (UID: \"6c1b2d85-15bd-467f-8571-646e3db81e8a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r"
Apr 20 17:57:36.295529 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.295540 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c1b2d85-15bd-467f-8571-646e3db81e8a-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r\" (UID: \"6c1b2d85-15bd-467f-8571-646e3db81e8a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r"
Apr 20 17:57:36.295779 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.295563 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c1b2d85-15bd-467f-8571-646e3db81e8a-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r\" (UID: \"6c1b2d85-15bd-467f-8571-646e3db81e8a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r"
Apr 20 17:57:36.295894 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.295873 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c1b2d85-15bd-467f-8571-646e3db81e8a-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r\" (UID: \"6c1b2d85-15bd-467f-8571-646e3db81e8a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r"
Apr 20 17:57:36.295951 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.295912 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c1b2d85-15bd-467f-8571-646e3db81e8a-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r\" (UID: \"6c1b2d85-15bd-467f-8571-646e3db81e8a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r"
Apr 20 17:57:36.304345 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.304276 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-727kp\" (UniqueName: \"kubernetes.io/projected/6c1b2d85-15bd-467f-8571-646e3db81e8a-kube-api-access-727kp\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r\" (UID: \"6c1b2d85-15bd-467f-8571-646e3db81e8a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r"
Apr 20 17:57:36.367240 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.367209 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r"
Apr 20 17:57:36.459064 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.459040 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp"]
Apr 20 17:57:36.464126 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.464098 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp"
Apr 20 17:57:36.470685 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.470658 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp"]
Apr 20 17:57:36.506552 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.506522 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r"]
Apr 20 17:57:36.508472 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:57:36.508448 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c1b2d85_15bd_467f_8571_646e3db81e8a.slice/crio-4562606ba75a1042069065fbb2aa15fd373f7f51ae9ec31e1f1f2caf421a15aa WatchSource:0}: Error finding container 4562606ba75a1042069065fbb2aa15fd373f7f51ae9ec31e1f1f2caf421a15aa: Status 404 returned error can't find the container with id 4562606ba75a1042069065fbb2aa15fd373f7f51ae9ec31e1f1f2caf421a15aa
Apr 20 17:57:36.595592 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.595557 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r" event={"ID":"6c1b2d85-15bd-467f-8571-646e3db81e8a","Type":"ContainerStarted","Data":"4562606ba75a1042069065fbb2aa15fd373f7f51ae9ec31e1f1f2caf421a15aa"}
Apr 20 17:57:36.596967 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.596938 2577 generic.go:358] "Generic (PLEG): container finished" podID="5328da85-247b-4778-89e8-12410591e7f3" containerID="a1de3732b6a9cb49bd5fd34203d07ff7e4b94e5ae74a1da4c2b814adce298180" exitCode=0
Apr 20 17:57:36.597091 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.597020 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb" event={"ID":"5328da85-247b-4778-89e8-12410591e7f3","Type":"ContainerDied","Data":"a1de3732b6a9cb49bd5fd34203d07ff7e4b94e5ae74a1da4c2b814adce298180"}
Apr 20 17:57:36.597091 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.597060 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb" event={"ID":"5328da85-247b-4778-89e8-12410591e7f3","Type":"ContainerStarted","Data":"036ef7d22637027a55413950232e119fc40bcd6c19c24a05d04d11977534501b"}
Apr 20 17:57:36.597365 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.597345 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6194d1ad-0fa3-4f55-b3fe-24f471e54b15-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp\" (UID: \"6194d1ad-0fa3-4f55-b3fe-24f471e54b15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp"
Apr 20 17:57:36.597462 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.597394 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42xzz\" (UniqueName: \"kubernetes.io/projected/6194d1ad-0fa3-4f55-b3fe-24f471e54b15-kube-api-access-42xzz\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp\" (UID: \"6194d1ad-0fa3-4f55-b3fe-24f471e54b15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp"
Apr 20 17:57:36.597547 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.597524 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6194d1ad-0fa3-4f55-b3fe-24f471e54b15-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp\" (UID: \"6194d1ad-0fa3-4f55-b3fe-24f471e54b15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp"
Apr 20 17:57:36.598868 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.598849 2577 generic.go:358] "Generic (PLEG): container finished" podID="a395cb58-561d-4a66-a684-4e96d4cb3db5" containerID="5586c7c045aa04d0111d740270110c4d4a3d0c387a4d8cc4e5ef407c1bf4979a" exitCode=0
Apr 20 17:57:36.598966 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.598936 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd" event={"ID":"a395cb58-561d-4a66-a684-4e96d4cb3db5","Type":"ContainerDied","Data":"5586c7c045aa04d0111d740270110c4d4a3d0c387a4d8cc4e5ef407c1bf4979a"}
Apr 20 17:57:36.698619 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.698588 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6194d1ad-0fa3-4f55-b3fe-24f471e54b15-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp\" (UID: \"6194d1ad-0fa3-4f55-b3fe-24f471e54b15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp"
Apr 20 17:57:36.698777 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.698665 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName:
\"kubernetes.io/empty-dir/6194d1ad-0fa3-4f55-b3fe-24f471e54b15-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp\" (UID: \"6194d1ad-0fa3-4f55-b3fe-24f471e54b15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp" Apr 20 17:57:36.698777 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.698698 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42xzz\" (UniqueName: \"kubernetes.io/projected/6194d1ad-0fa3-4f55-b3fe-24f471e54b15-kube-api-access-42xzz\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp\" (UID: \"6194d1ad-0fa3-4f55-b3fe-24f471e54b15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp" Apr 20 17:57:36.699037 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.699015 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6194d1ad-0fa3-4f55-b3fe-24f471e54b15-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp\" (UID: \"6194d1ad-0fa3-4f55-b3fe-24f471e54b15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp" Apr 20 17:57:36.699101 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.699067 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6194d1ad-0fa3-4f55-b3fe-24f471e54b15-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp\" (UID: \"6194d1ad-0fa3-4f55-b3fe-24f471e54b15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp" Apr 20 17:57:36.707815 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.707797 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42xzz\" (UniqueName: \"kubernetes.io/projected/6194d1ad-0fa3-4f55-b3fe-24f471e54b15-kube-api-access-42xzz\") pod 
\"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp\" (UID: \"6194d1ad-0fa3-4f55-b3fe-24f471e54b15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp" Apr 20 17:57:36.779409 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.779374 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp" Apr 20 17:57:36.904144 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:36.904113 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp"] Apr 20 17:57:36.905346 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:57:36.905318 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6194d1ad_0fa3_4f55_b3fe_24f471e54b15.slice/crio-4eecf1cbb434c1057a6e344d03bb41a37cd99414b632e851e7cf7cbce75751fc WatchSource:0}: Error finding container 4eecf1cbb434c1057a6e344d03bb41a37cd99414b632e851e7cf7cbce75751fc: Status 404 returned error can't find the container with id 4eecf1cbb434c1057a6e344d03bb41a37cd99414b632e851e7cf7cbce75751fc Apr 20 17:57:37.606926 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:37.606828 2577 generic.go:358] "Generic (PLEG): container finished" podID="6c1b2d85-15bd-467f-8571-646e3db81e8a" containerID="94405ebac20e3cc63383f81f9f32f06e9bff6f3caf040d436517eb258cb93b48" exitCode=0 Apr 20 17:57:37.606926 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:37.606909 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r" event={"ID":"6c1b2d85-15bd-467f-8571-646e3db81e8a","Type":"ContainerDied","Data":"94405ebac20e3cc63383f81f9f32f06e9bff6f3caf040d436517eb258cb93b48"} Apr 20 17:57:37.608561 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:37.608535 2577 generic.go:358] "Generic (PLEG): 
container finished" podID="5328da85-247b-4778-89e8-12410591e7f3" containerID="981d1bde77413d2d7584d14b2080a5af7c26d841f31f59914ec769abdd95e2df" exitCode=0 Apr 20 17:57:37.608684 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:37.608566 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb" event={"ID":"5328da85-247b-4778-89e8-12410591e7f3","Type":"ContainerDied","Data":"981d1bde77413d2d7584d14b2080a5af7c26d841f31f59914ec769abdd95e2df"} Apr 20 17:57:37.610537 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:37.610513 2577 generic.go:358] "Generic (PLEG): container finished" podID="a395cb58-561d-4a66-a684-4e96d4cb3db5" containerID="b6c5fca1498f0e182c435025257e6911f0ea2b25f6f29bc9640897a7404181f3" exitCode=0 Apr 20 17:57:37.610663 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:37.610541 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd" event={"ID":"a395cb58-561d-4a66-a684-4e96d4cb3db5","Type":"ContainerDied","Data":"b6c5fca1498f0e182c435025257e6911f0ea2b25f6f29bc9640897a7404181f3"} Apr 20 17:57:37.612006 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:37.611971 2577 generic.go:358] "Generic (PLEG): container finished" podID="6194d1ad-0fa3-4f55-b3fe-24f471e54b15" containerID="bf7214a94cb8838eaccdc465826a317b851d04a2b9fcdd16b3a118e215f6a51b" exitCode=0 Apr 20 17:57:37.612117 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:37.612082 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp" event={"ID":"6194d1ad-0fa3-4f55-b3fe-24f471e54b15","Type":"ContainerDied","Data":"bf7214a94cb8838eaccdc465826a317b851d04a2b9fcdd16b3a118e215f6a51b"} Apr 20 17:57:37.612117 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:37.612113 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp" event={"ID":"6194d1ad-0fa3-4f55-b3fe-24f471e54b15","Type":"ContainerStarted","Data":"4eecf1cbb434c1057a6e344d03bb41a37cd99414b632e851e7cf7cbce75751fc"} Apr 20 17:57:38.617590 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:38.617505 2577 generic.go:358] "Generic (PLEG): container finished" podID="6c1b2d85-15bd-467f-8571-646e3db81e8a" containerID="f15a7964f9c6d24a71c523398068e22620fe626638cc445388f3fb4818e50efc" exitCode=0 Apr 20 17:57:38.617590 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:38.617572 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r" event={"ID":"6c1b2d85-15bd-467f-8571-646e3db81e8a","Type":"ContainerDied","Data":"f15a7964f9c6d24a71c523398068e22620fe626638cc445388f3fb4818e50efc"} Apr 20 17:57:38.619664 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:38.619641 2577 generic.go:358] "Generic (PLEG): container finished" podID="5328da85-247b-4778-89e8-12410591e7f3" containerID="6739e3394e1441f3ed202966a02143b0c6389d78a6f491409626a47af3c8c781" exitCode=0 Apr 20 17:57:38.619753 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:38.619726 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb" event={"ID":"5328da85-247b-4778-89e8-12410591e7f3","Type":"ContainerDied","Data":"6739e3394e1441f3ed202966a02143b0c6389d78a6f491409626a47af3c8c781"} Apr 20 17:57:38.621348 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:38.621327 2577 generic.go:358] "Generic (PLEG): container finished" podID="6194d1ad-0fa3-4f55-b3fe-24f471e54b15" containerID="b65c165ef93781ae596fa0cb04c77337b1bd27ccf0919e3fe615459ddc02fd03" exitCode=0 Apr 20 17:57:38.621422 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:38.621367 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp" event={"ID":"6194d1ad-0fa3-4f55-b3fe-24f471e54b15","Type":"ContainerDied","Data":"b65c165ef93781ae596fa0cb04c77337b1bd27ccf0919e3fe615459ddc02fd03"} Apr 20 17:57:38.750211 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:38.750186 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd" Apr 20 17:57:38.919935 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:38.919863 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqp2b\" (UniqueName: \"kubernetes.io/projected/a395cb58-561d-4a66-a684-4e96d4cb3db5-kube-api-access-pqp2b\") pod \"a395cb58-561d-4a66-a684-4e96d4cb3db5\" (UID: \"a395cb58-561d-4a66-a684-4e96d4cb3db5\") " Apr 20 17:57:38.919935 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:38.919910 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a395cb58-561d-4a66-a684-4e96d4cb3db5-util\") pod \"a395cb58-561d-4a66-a684-4e96d4cb3db5\" (UID: \"a395cb58-561d-4a66-a684-4e96d4cb3db5\") " Apr 20 17:57:38.920143 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:38.920019 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a395cb58-561d-4a66-a684-4e96d4cb3db5-bundle\") pod \"a395cb58-561d-4a66-a684-4e96d4cb3db5\" (UID: \"a395cb58-561d-4a66-a684-4e96d4cb3db5\") " Apr 20 17:57:38.920467 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:38.920433 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a395cb58-561d-4a66-a684-4e96d4cb3db5-bundle" (OuterVolumeSpecName: "bundle") pod "a395cb58-561d-4a66-a684-4e96d4cb3db5" (UID: "a395cb58-561d-4a66-a684-4e96d4cb3db5"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:57:38.922139 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:38.922115 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a395cb58-561d-4a66-a684-4e96d4cb3db5-kube-api-access-pqp2b" (OuterVolumeSpecName: "kube-api-access-pqp2b") pod "a395cb58-561d-4a66-a684-4e96d4cb3db5" (UID: "a395cb58-561d-4a66-a684-4e96d4cb3db5"). InnerVolumeSpecName "kube-api-access-pqp2b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:57:38.925350 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:38.925310 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a395cb58-561d-4a66-a684-4e96d4cb3db5-util" (OuterVolumeSpecName: "util") pod "a395cb58-561d-4a66-a684-4e96d4cb3db5" (UID: "a395cb58-561d-4a66-a684-4e96d4cb3db5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:57:39.021104 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:39.021071 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a395cb58-561d-4a66-a684-4e96d4cb3db5-bundle\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:57:39.021104 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:39.021101 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pqp2b\" (UniqueName: \"kubernetes.io/projected/a395cb58-561d-4a66-a684-4e96d4cb3db5-kube-api-access-pqp2b\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:57:39.021104 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:39.021111 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a395cb58-561d-4a66-a684-4e96d4cb3db5-util\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:57:39.627729 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:39.627644 2577 generic.go:358] "Generic (PLEG): 
container finished" podID="6c1b2d85-15bd-467f-8571-646e3db81e8a" containerID="6d551e85b89c94f38c3c5580b52ac68ca62ff6e3516ef57766e70ebb42046761" exitCode=0 Apr 20 17:57:39.628183 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:39.627734 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r" event={"ID":"6c1b2d85-15bd-467f-8571-646e3db81e8a","Type":"ContainerDied","Data":"6d551e85b89c94f38c3c5580b52ac68ca62ff6e3516ef57766e70ebb42046761"} Apr 20 17:57:39.629412 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:39.629389 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd" event={"ID":"a395cb58-561d-4a66-a684-4e96d4cb3db5","Type":"ContainerDied","Data":"40c77248b54c30ce2ce074ca5abb9d6e61efbc846fb5797b7788b2e2eea14c20"} Apr 20 17:57:39.629524 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:39.629416 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40c77248b54c30ce2ce074ca5abb9d6e61efbc846fb5797b7788b2e2eea14c20" Apr 20 17:57:39.629524 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:39.629434 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd" Apr 20 17:57:39.631382 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:39.631350 2577 generic.go:358] "Generic (PLEG): container finished" podID="6194d1ad-0fa3-4f55-b3fe-24f471e54b15" containerID="5e8e8e18a845dbaa348e92220344b628d3ad71fe4bd7635e420aa29605c7d226" exitCode=0 Apr 20 17:57:39.631492 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:39.631431 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp" event={"ID":"6194d1ad-0fa3-4f55-b3fe-24f471e54b15","Type":"ContainerDied","Data":"5e8e8e18a845dbaa348e92220344b628d3ad71fe4bd7635e420aa29605c7d226"} Apr 20 17:57:39.758611 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:39.758590 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb" Apr 20 17:57:39.928416 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:39.928332 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5328da85-247b-4778-89e8-12410591e7f3-bundle\") pod \"5328da85-247b-4778-89e8-12410591e7f3\" (UID: \"5328da85-247b-4778-89e8-12410591e7f3\") " Apr 20 17:57:39.928416 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:39.928380 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5328da85-247b-4778-89e8-12410591e7f3-util\") pod \"5328da85-247b-4778-89e8-12410591e7f3\" (UID: \"5328da85-247b-4778-89e8-12410591e7f3\") " Apr 20 17:57:39.928629 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:39.928441 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlvkt\" (UniqueName: 
\"kubernetes.io/projected/5328da85-247b-4778-89e8-12410591e7f3-kube-api-access-tlvkt\") pod \"5328da85-247b-4778-89e8-12410591e7f3\" (UID: \"5328da85-247b-4778-89e8-12410591e7f3\") " Apr 20 17:57:39.928814 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:39.928791 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5328da85-247b-4778-89e8-12410591e7f3-bundle" (OuterVolumeSpecName: "bundle") pod "5328da85-247b-4778-89e8-12410591e7f3" (UID: "5328da85-247b-4778-89e8-12410591e7f3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:57:39.930478 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:39.930454 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5328da85-247b-4778-89e8-12410591e7f3-kube-api-access-tlvkt" (OuterVolumeSpecName: "kube-api-access-tlvkt") pod "5328da85-247b-4778-89e8-12410591e7f3" (UID: "5328da85-247b-4778-89e8-12410591e7f3"). InnerVolumeSpecName "kube-api-access-tlvkt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:57:39.933064 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:39.933037 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5328da85-247b-4778-89e8-12410591e7f3-util" (OuterVolumeSpecName: "util") pod "5328da85-247b-4778-89e8-12410591e7f3" (UID: "5328da85-247b-4778-89e8-12410591e7f3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:57:40.029045 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.029013 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5328da85-247b-4778-89e8-12410591e7f3-bundle\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:57:40.029045 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.029040 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5328da85-247b-4778-89e8-12410591e7f3-util\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:57:40.029045 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.029052 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tlvkt\" (UniqueName: \"kubernetes.io/projected/5328da85-247b-4778-89e8-12410591e7f3-kube-api-access-tlvkt\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:57:40.636789 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.636760 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb" Apr 20 17:57:40.636789 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.636777 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb" event={"ID":"5328da85-247b-4778-89e8-12410591e7f3","Type":"ContainerDied","Data":"036ef7d22637027a55413950232e119fc40bcd6c19c24a05d04d11977534501b"} Apr 20 17:57:40.637232 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.636812 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="036ef7d22637027a55413950232e119fc40bcd6c19c24a05d04d11977534501b" Apr 20 17:57:40.772920 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.772896 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp" Apr 20 17:57:40.806151 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.806130 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r" Apr 20 17:57:40.936467 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.936363 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6194d1ad-0fa3-4f55-b3fe-24f471e54b15-bundle\") pod \"6194d1ad-0fa3-4f55-b3fe-24f471e54b15\" (UID: \"6194d1ad-0fa3-4f55-b3fe-24f471e54b15\") " Apr 20 17:57:40.936467 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.936421 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-727kp\" (UniqueName: \"kubernetes.io/projected/6c1b2d85-15bd-467f-8571-646e3db81e8a-kube-api-access-727kp\") pod \"6c1b2d85-15bd-467f-8571-646e3db81e8a\" (UID: \"6c1b2d85-15bd-467f-8571-646e3db81e8a\") " Apr 20 17:57:40.936467 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.936463 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c1b2d85-15bd-467f-8571-646e3db81e8a-bundle\") pod \"6c1b2d85-15bd-467f-8571-646e3db81e8a\" (UID: \"6c1b2d85-15bd-467f-8571-646e3db81e8a\") " Apr 20 17:57:40.936467 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.936480 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6194d1ad-0fa3-4f55-b3fe-24f471e54b15-util\") pod \"6194d1ad-0fa3-4f55-b3fe-24f471e54b15\" (UID: \"6194d1ad-0fa3-4f55-b3fe-24f471e54b15\") " Apr 20 17:57:40.936800 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.936512 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/6c1b2d85-15bd-467f-8571-646e3db81e8a-util\") pod \"6c1b2d85-15bd-467f-8571-646e3db81e8a\" (UID: \"6c1b2d85-15bd-467f-8571-646e3db81e8a\") " Apr 20 17:57:40.936800 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.936558 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42xzz\" (UniqueName: \"kubernetes.io/projected/6194d1ad-0fa3-4f55-b3fe-24f471e54b15-kube-api-access-42xzz\") pod \"6194d1ad-0fa3-4f55-b3fe-24f471e54b15\" (UID: \"6194d1ad-0fa3-4f55-b3fe-24f471e54b15\") " Apr 20 17:57:40.937039 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.936977 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6194d1ad-0fa3-4f55-b3fe-24f471e54b15-bundle" (OuterVolumeSpecName: "bundle") pod "6194d1ad-0fa3-4f55-b3fe-24f471e54b15" (UID: "6194d1ad-0fa3-4f55-b3fe-24f471e54b15"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:57:40.937222 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.937203 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c1b2d85-15bd-467f-8571-646e3db81e8a-bundle" (OuterVolumeSpecName: "bundle") pod "6c1b2d85-15bd-467f-8571-646e3db81e8a" (UID: "6c1b2d85-15bd-467f-8571-646e3db81e8a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:57:40.938834 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.938803 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c1b2d85-15bd-467f-8571-646e3db81e8a-kube-api-access-727kp" (OuterVolumeSpecName: "kube-api-access-727kp") pod "6c1b2d85-15bd-467f-8571-646e3db81e8a" (UID: "6c1b2d85-15bd-467f-8571-646e3db81e8a"). InnerVolumeSpecName "kube-api-access-727kp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:57:40.938935 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.938849 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6194d1ad-0fa3-4f55-b3fe-24f471e54b15-kube-api-access-42xzz" (OuterVolumeSpecName: "kube-api-access-42xzz") pod "6194d1ad-0fa3-4f55-b3fe-24f471e54b15" (UID: "6194d1ad-0fa3-4f55-b3fe-24f471e54b15"). InnerVolumeSpecName "kube-api-access-42xzz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:57:40.944818 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.944785 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c1b2d85-15bd-467f-8571-646e3db81e8a-util" (OuterVolumeSpecName: "util") pod "6c1b2d85-15bd-467f-8571-646e3db81e8a" (UID: "6c1b2d85-15bd-467f-8571-646e3db81e8a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:57:40.945228 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:40.945199 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6194d1ad-0fa3-4f55-b3fe-24f471e54b15-util" (OuterVolumeSpecName: "util") pod "6194d1ad-0fa3-4f55-b3fe-24f471e54b15" (UID: "6194d1ad-0fa3-4f55-b3fe-24f471e54b15"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:57:41.037815 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:41.037777 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c1b2d85-15bd-467f-8571-646e3db81e8a-util\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:57:41.037815 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:41.037811 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-42xzz\" (UniqueName: \"kubernetes.io/projected/6194d1ad-0fa3-4f55-b3fe-24f471e54b15-kube-api-access-42xzz\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:57:41.038048 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:41.037827 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6194d1ad-0fa3-4f55-b3fe-24f471e54b15-bundle\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:57:41.038048 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:41.037840 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-727kp\" (UniqueName: \"kubernetes.io/projected/6c1b2d85-15bd-467f-8571-646e3db81e8a-kube-api-access-727kp\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:57:41.038048 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:41.037850 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c1b2d85-15bd-467f-8571-646e3db81e8a-bundle\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:57:41.038048 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:41.037861 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6194d1ad-0fa3-4f55-b3fe-24f471e54b15-util\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:57:41.642469 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:41.642443 2577 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp" Apr 20 17:57:41.642469 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:41.642454 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp" event={"ID":"6194d1ad-0fa3-4f55-b3fe-24f471e54b15","Type":"ContainerDied","Data":"4eecf1cbb434c1057a6e344d03bb41a37cd99414b632e851e7cf7cbce75751fc"} Apr 20 17:57:41.642927 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:41.642486 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4eecf1cbb434c1057a6e344d03bb41a37cd99414b632e851e7cf7cbce75751fc" Apr 20 17:57:41.644146 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:41.644118 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r" event={"ID":"6c1b2d85-15bd-467f-8571-646e3db81e8a","Type":"ContainerDied","Data":"4562606ba75a1042069065fbb2aa15fd373f7f51ae9ec31e1f1f2caf421a15aa"} Apr 20 17:57:41.644146 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:41.644146 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r" Apr 20 17:57:41.644291 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:41.644152 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4562606ba75a1042069065fbb2aa15fd373f7f51ae9ec31e1f1f2caf421a15aa" Apr 20 17:57:51.119777 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.119745 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7464d6c56c-49d7x"] Apr 20 17:57:51.120168 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120097 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5328da85-247b-4778-89e8-12410591e7f3" containerName="pull" Apr 20 17:57:51.120168 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120108 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5328da85-247b-4778-89e8-12410591e7f3" containerName="pull" Apr 20 17:57:51.120168 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120115 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c1b2d85-15bd-467f-8571-646e3db81e8a" containerName="pull" Apr 20 17:57:51.120168 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120120 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1b2d85-15bd-467f-8571-646e3db81e8a" containerName="pull" Apr 20 17:57:51.120168 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120129 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c1b2d85-15bd-467f-8571-646e3db81e8a" containerName="extract" Apr 20 17:57:51.120168 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120135 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1b2d85-15bd-467f-8571-646e3db81e8a" containerName="extract" Apr 20 17:57:51.120168 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120144 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="a395cb58-561d-4a66-a684-4e96d4cb3db5" containerName="pull" Apr 20 17:57:51.120168 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120149 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a395cb58-561d-4a66-a684-4e96d4cb3db5" containerName="pull" Apr 20 17:57:51.120168 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120156 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5328da85-247b-4778-89e8-12410591e7f3" containerName="extract" Apr 20 17:57:51.120168 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120161 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5328da85-247b-4778-89e8-12410591e7f3" containerName="extract" Apr 20 17:57:51.120168 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120168 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6194d1ad-0fa3-4f55-b3fe-24f471e54b15" containerName="extract" Apr 20 17:57:51.120168 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120173 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6194d1ad-0fa3-4f55-b3fe-24f471e54b15" containerName="extract" Apr 20 17:57:51.120509 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120181 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a395cb58-561d-4a66-a684-4e96d4cb3db5" containerName="util" Apr 20 17:57:51.120509 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120185 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a395cb58-561d-4a66-a684-4e96d4cb3db5" containerName="util" Apr 20 17:57:51.120509 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120191 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c1b2d85-15bd-467f-8571-646e3db81e8a" containerName="util" Apr 20 17:57:51.120509 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120196 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1b2d85-15bd-467f-8571-646e3db81e8a" containerName="util" Apr 20 
17:57:51.120509 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120202 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a395cb58-561d-4a66-a684-4e96d4cb3db5" containerName="extract" Apr 20 17:57:51.120509 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120206 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a395cb58-561d-4a66-a684-4e96d4cb3db5" containerName="extract" Apr 20 17:57:51.120509 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120211 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6194d1ad-0fa3-4f55-b3fe-24f471e54b15" containerName="util" Apr 20 17:57:51.120509 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120216 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6194d1ad-0fa3-4f55-b3fe-24f471e54b15" containerName="util" Apr 20 17:57:51.120509 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120223 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6194d1ad-0fa3-4f55-b3fe-24f471e54b15" containerName="pull" Apr 20 17:57:51.120509 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120227 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6194d1ad-0fa3-4f55-b3fe-24f471e54b15" containerName="pull" Apr 20 17:57:51.120509 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120233 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5328da85-247b-4778-89e8-12410591e7f3" containerName="util" Apr 20 17:57:51.120509 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120237 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5328da85-247b-4778-89e8-12410591e7f3" containerName="util" Apr 20 17:57:51.120509 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120286 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6194d1ad-0fa3-4f55-b3fe-24f471e54b15" containerName="extract" Apr 20 17:57:51.120509 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120296 2577 
memory_manager.go:356] "RemoveStaleState removing state" podUID="6c1b2d85-15bd-467f-8571-646e3db81e8a" containerName="extract" Apr 20 17:57:51.120509 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120303 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5328da85-247b-4778-89e8-12410591e7f3" containerName="extract" Apr 20 17:57:51.120509 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.120309 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a395cb58-561d-4a66-a684-4e96d4cb3db5" containerName="extract" Apr 20 17:57:51.124692 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.124674 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.135693 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.135664 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7464d6c56c-49d7x"] Apr 20 17:57:51.217791 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.217759 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-trusted-ca-bundle\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.218014 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.217812 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-oauth-serving-cert\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.218014 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.217857 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-console-serving-cert\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.218014 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.217891 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcfbw\" (UniqueName: \"kubernetes.io/projected/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-kube-api-access-mcfbw\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.218014 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.217917 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-console-oauth-config\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.218014 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.217941 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-console-config\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.218014 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.217959 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-service-ca\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.319052 
ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.319002 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-oauth-serving-cert\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.319219 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.319068 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-console-serving-cert\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.319219 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.319119 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcfbw\" (UniqueName: \"kubernetes.io/projected/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-kube-api-access-mcfbw\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.319219 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.319149 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-console-oauth-config\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.319219 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.319175 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-console-config\") pod \"console-7464d6c56c-49d7x\" (UID: 
\"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.319219 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.319195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-service-ca\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.319505 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.319271 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-trusted-ca-bundle\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.319929 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.319898 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-oauth-serving-cert\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.319929 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.319914 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-console-config\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.320136 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.319963 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-service-ca\") pod 
\"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.320136 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.320051 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-trusted-ca-bundle\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.321656 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.321628 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-console-serving-cert\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.321755 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.321680 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-console-oauth-config\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.327365 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.327340 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcfbw\" (UniqueName: \"kubernetes.io/projected/7960da98-e4cc-49e6-9e3b-8ccd24ad9f44-kube-api-access-mcfbw\") pod \"console-7464d6c56c-49d7x\" (UID: \"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44\") " pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.434220 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.434134 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:57:51.568057 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.568023 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7464d6c56c-49d7x"] Apr 20 17:57:51.572205 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:57:51.572173 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7960da98_e4cc_49e6_9e3b_8ccd24ad9f44.slice/crio-bab2ce184c8232cdc13cf01b86563959ca3abd94eb76b3e8d1330fcccf3619f2 WatchSource:0}: Error finding container bab2ce184c8232cdc13cf01b86563959ca3abd94eb76b3e8d1330fcccf3619f2: Status 404 returned error can't find the container with id bab2ce184c8232cdc13cf01b86563959ca3abd94eb76b3e8d1330fcccf3619f2 Apr 20 17:57:51.683092 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.683055 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7464d6c56c-49d7x" event={"ID":"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44","Type":"ContainerStarted","Data":"9a3eedc5303e580455016aa2dc9539c04d7290c56a136f44630b0464af731d64"} Apr 20 17:57:51.683092 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.683090 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7464d6c56c-49d7x" event={"ID":"7960da98-e4cc-49e6-9e3b-8ccd24ad9f44","Type":"ContainerStarted","Data":"bab2ce184c8232cdc13cf01b86563959ca3abd94eb76b3e8d1330fcccf3619f2"} Apr 20 17:57:51.713740 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:57:51.713674 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7464d6c56c-49d7x" podStartSLOduration=0.713655604 podStartE2EDuration="713.655604ms" podCreationTimestamp="2026-04-20 17:57:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:57:51.712685146 +0000 UTC m=+573.498040385" 
watchObservedRunningTime="2026-04-20 17:57:51.713655604 +0000 UTC m=+573.499010859" Apr 20 17:58:01.434352 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:01.434316 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:58:01.434352 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:01.434357 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:58:01.439155 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:01.439133 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:58:01.723505 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:01.723477 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7464d6c56c-49d7x" Apr 20 17:58:01.796766 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:01.796735 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-684869d99c-km9fq"] Apr 20 17:58:14.734852 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:14.734817 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tj2t6"] Apr 20 17:58:14.738216 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:14.738200 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tj2t6" Apr 20 17:58:14.740492 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:14.740468 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 20 17:58:14.740636 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:14.740534 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 17:58:14.740636 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:14.740534 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 20 17:58:14.741399 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:14.741381 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 17:58:14.741524 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:14.741419 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-tbdkf\"" Apr 20 17:58:14.746032 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:14.746011 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tj2t6"] Apr 20 17:58:14.822622 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:14.822584 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bff6\" (UniqueName: \"kubernetes.io/projected/5b245b6c-efe4-4d57-bce9-c26cf49dab9f-kube-api-access-2bff6\") pod \"kuadrant-console-plugin-6cb54b5c86-tj2t6\" (UID: \"5b245b6c-efe4-4d57-bce9-c26cf49dab9f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tj2t6" Apr 20 17:58:14.822622 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:14.822631 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/5b245b6c-efe4-4d57-bce9-c26cf49dab9f-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-tj2t6\" (UID: \"5b245b6c-efe4-4d57-bce9-c26cf49dab9f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tj2t6" Apr 20 17:58:14.822871 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:14.822719 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b245b6c-efe4-4d57-bce9-c26cf49dab9f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-tj2t6\" (UID: \"5b245b6c-efe4-4d57-bce9-c26cf49dab9f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tj2t6" Apr 20 17:58:14.923971 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:14.923942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bff6\" (UniqueName: \"kubernetes.io/projected/5b245b6c-efe4-4d57-bce9-c26cf49dab9f-kube-api-access-2bff6\") pod \"kuadrant-console-plugin-6cb54b5c86-tj2t6\" (UID: \"5b245b6c-efe4-4d57-bce9-c26cf49dab9f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tj2t6" Apr 20 17:58:14.924174 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:14.923978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5b245b6c-efe4-4d57-bce9-c26cf49dab9f-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-tj2t6\" (UID: \"5b245b6c-efe4-4d57-bce9-c26cf49dab9f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tj2t6" Apr 20 17:58:14.924174 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:14.924064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b245b6c-efe4-4d57-bce9-c26cf49dab9f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-tj2t6\" (UID: \"5b245b6c-efe4-4d57-bce9-c26cf49dab9f\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tj2t6" Apr 20 17:58:14.924841 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:14.924823 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5b245b6c-efe4-4d57-bce9-c26cf49dab9f-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-tj2t6\" (UID: \"5b245b6c-efe4-4d57-bce9-c26cf49dab9f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tj2t6" Apr 20 17:58:14.926644 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:14.926622 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b245b6c-efe4-4d57-bce9-c26cf49dab9f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-tj2t6\" (UID: \"5b245b6c-efe4-4d57-bce9-c26cf49dab9f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tj2t6" Apr 20 17:58:14.931643 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:14.931623 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bff6\" (UniqueName: \"kubernetes.io/projected/5b245b6c-efe4-4d57-bce9-c26cf49dab9f-kube-api-access-2bff6\") pod \"kuadrant-console-plugin-6cb54b5c86-tj2t6\" (UID: \"5b245b6c-efe4-4d57-bce9-c26cf49dab9f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tj2t6" Apr 20 17:58:15.048887 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:15.048803 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tj2t6" Apr 20 17:58:15.173104 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:15.173080 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tj2t6"] Apr 20 17:58:15.175255 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:58:15.175228 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b245b6c_efe4_4d57_bce9_c26cf49dab9f.slice/crio-5ec7efb3a037f9ac55adc2afa628f3832140850fbc17b7fc9fd42be1a86b932d WatchSource:0}: Error finding container 5ec7efb3a037f9ac55adc2afa628f3832140850fbc17b7fc9fd42be1a86b932d: Status 404 returned error can't find the container with id 5ec7efb3a037f9ac55adc2afa628f3832140850fbc17b7fc9fd42be1a86b932d Apr 20 17:58:15.773510 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:15.773472 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tj2t6" event={"ID":"5b245b6c-efe4-4d57-bce9-c26cf49dab9f","Type":"ContainerStarted","Data":"5ec7efb3a037f9ac55adc2afa628f3832140850fbc17b7fc9fd42be1a86b932d"} Apr 20 17:58:19.950413 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:19.950382 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/ovn-acl-logging/0.log" Apr 20 17:58:19.950875 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:19.950802 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/ovn-acl-logging/0.log" Apr 20 17:58:26.816555 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:26.816507 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-684869d99c-km9fq" podUID="bcafce03-8ee3-495c-b3eb-147db98319b1" containerName="console" 
containerID="cri-o://921b7bb28fa039f9106b61bbcb7ecaefe85033ff6ebd66b369ff4562f510b4e5" gracePeriod=15 Apr 20 17:58:36.812702 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:36.812591 2577 patch_prober.go:28] interesting pod/console-684869d99c-km9fq container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.133.0.26:8443/health\": dial tcp 10.133.0.26:8443: connect: connection refused" start-of-body= Apr 20 17:58:36.813106 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:36.812668 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-684869d99c-km9fq" podUID="bcafce03-8ee3-495c-b3eb-147db98319b1" containerName="console" probeResult="failure" output="Get \"https://10.133.0.26:8443/health\": dial tcp 10.133.0.26:8443: connect: connection refused" Apr 20 17:58:38.540350 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.540322 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-684869d99c-km9fq_bcafce03-8ee3-495c-b3eb-147db98319b1/console/0.log" Apr 20 17:58:38.540680 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.540385 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-684869d99c-km9fq" Apr 20 17:58:38.645296 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.645215 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-trusted-ca-bundle\") pod \"bcafce03-8ee3-495c-b3eb-147db98319b1\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " Apr 20 17:58:38.645296 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.645262 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-service-ca\") pod \"bcafce03-8ee3-495c-b3eb-147db98319b1\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " Apr 20 17:58:38.645296 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.645294 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-console-config\") pod \"bcafce03-8ee3-495c-b3eb-147db98319b1\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " Apr 20 17:58:38.645548 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.645315 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-oauth-serving-cert\") pod \"bcafce03-8ee3-495c-b3eb-147db98319b1\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " Apr 20 17:58:38.645548 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.645340 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcafce03-8ee3-495c-b3eb-147db98319b1-console-serving-cert\") pod \"bcafce03-8ee3-495c-b3eb-147db98319b1\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " Apr 20 17:58:38.645548 ip-10-0-135-49 
kubenswrapper[2577]: I0420 17:58:38.645355 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bcafce03-8ee3-495c-b3eb-147db98319b1-console-oauth-config\") pod \"bcafce03-8ee3-495c-b3eb-147db98319b1\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " Apr 20 17:58:38.645548 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.645465 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6zwk\" (UniqueName: \"kubernetes.io/projected/bcafce03-8ee3-495c-b3eb-147db98319b1-kube-api-access-w6zwk\") pod \"bcafce03-8ee3-495c-b3eb-147db98319b1\" (UID: \"bcafce03-8ee3-495c-b3eb-147db98319b1\") " Apr 20 17:58:38.645752 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.645694 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bcafce03-8ee3-495c-b3eb-147db98319b1" (UID: "bcafce03-8ee3-495c-b3eb-147db98319b1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 17:58:38.645818 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.645743 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-console-config" (OuterVolumeSpecName: "console-config") pod "bcafce03-8ee3-495c-b3eb-147db98319b1" (UID: "bcafce03-8ee3-495c-b3eb-147db98319b1"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 17:58:38.645818 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.645757 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bcafce03-8ee3-495c-b3eb-147db98319b1" (UID: "bcafce03-8ee3-495c-b3eb-147db98319b1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 17:58:38.645818 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.645750 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-service-ca" (OuterVolumeSpecName: "service-ca") pod "bcafce03-8ee3-495c-b3eb-147db98319b1" (UID: "bcafce03-8ee3-495c-b3eb-147db98319b1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 17:58:38.647789 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.647762 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcafce03-8ee3-495c-b3eb-147db98319b1-kube-api-access-w6zwk" (OuterVolumeSpecName: "kube-api-access-w6zwk") pod "bcafce03-8ee3-495c-b3eb-147db98319b1" (UID: "bcafce03-8ee3-495c-b3eb-147db98319b1"). InnerVolumeSpecName "kube-api-access-w6zwk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:58:38.648023 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.647999 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcafce03-8ee3-495c-b3eb-147db98319b1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bcafce03-8ee3-495c-b3eb-147db98319b1" (UID: "bcafce03-8ee3-495c-b3eb-147db98319b1"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 17:58:38.648023 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.648016 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcafce03-8ee3-495c-b3eb-147db98319b1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bcafce03-8ee3-495c-b3eb-147db98319b1" (UID: "bcafce03-8ee3-495c-b3eb-147db98319b1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 17:58:38.747214 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.747182 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-trusted-ca-bundle\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:58:38.747214 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.747211 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-service-ca\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:58:38.747416 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.747224 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-console-config\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:58:38.747416 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.747236 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bcafce03-8ee3-495c-b3eb-147db98319b1-oauth-serving-cert\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:58:38.747416 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.747246 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bcafce03-8ee3-495c-b3eb-147db98319b1-console-serving-cert\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:58:38.747416 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.747255 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bcafce03-8ee3-495c-b3eb-147db98319b1-console-oauth-config\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:58:38.747416 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.747263 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w6zwk\" (UniqueName: \"kubernetes.io/projected/bcafce03-8ee3-495c-b3eb-147db98319b1-kube-api-access-w6zwk\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:58:38.875950 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.875928 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-684869d99c-km9fq_bcafce03-8ee3-495c-b3eb-147db98319b1/console/0.log" Apr 20 17:58:38.876099 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.875995 2577 generic.go:358] "Generic (PLEG): container finished" podID="bcafce03-8ee3-495c-b3eb-147db98319b1" containerID="921b7bb28fa039f9106b61bbcb7ecaefe85033ff6ebd66b369ff4562f510b4e5" exitCode=2 Apr 20 17:58:38.876099 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.876035 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-684869d99c-km9fq" event={"ID":"bcafce03-8ee3-495c-b3eb-147db98319b1","Type":"ContainerDied","Data":"921b7bb28fa039f9106b61bbcb7ecaefe85033ff6ebd66b369ff4562f510b4e5"} Apr 20 17:58:38.876099 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.876077 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-684869d99c-km9fq" Apr 20 17:58:38.876099 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.876093 2577 scope.go:117] "RemoveContainer" containerID="921b7bb28fa039f9106b61bbcb7ecaefe85033ff6ebd66b369ff4562f510b4e5" Apr 20 17:58:38.876256 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.876080 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-684869d99c-km9fq" event={"ID":"bcafce03-8ee3-495c-b3eb-147db98319b1","Type":"ContainerDied","Data":"ead3dc2da8dd6675026418dcfb8b786caec9458dd126b385be1e4a4206a7760f"} Apr 20 17:58:38.884708 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.884678 2577 scope.go:117] "RemoveContainer" containerID="921b7bb28fa039f9106b61bbcb7ecaefe85033ff6ebd66b369ff4562f510b4e5" Apr 20 17:58:38.885017 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:58:38.884994 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"921b7bb28fa039f9106b61bbcb7ecaefe85033ff6ebd66b369ff4562f510b4e5\": container with ID starting with 921b7bb28fa039f9106b61bbcb7ecaefe85033ff6ebd66b369ff4562f510b4e5 not found: ID does not exist" containerID="921b7bb28fa039f9106b61bbcb7ecaefe85033ff6ebd66b369ff4562f510b4e5" Apr 20 17:58:38.885088 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.885029 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"921b7bb28fa039f9106b61bbcb7ecaefe85033ff6ebd66b369ff4562f510b4e5"} err="failed to get container status \"921b7bb28fa039f9106b61bbcb7ecaefe85033ff6ebd66b369ff4562f510b4e5\": rpc error: code = NotFound desc = could not find container \"921b7bb28fa039f9106b61bbcb7ecaefe85033ff6ebd66b369ff4562f510b4e5\": container with ID starting with 921b7bb28fa039f9106b61bbcb7ecaefe85033ff6ebd66b369ff4562f510b4e5 not found: ID does not exist" Apr 20 17:58:38.896355 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.896334 2577 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-684869d99c-km9fq"] Apr 20 17:58:38.899725 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:38.899704 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-684869d99c-km9fq"] Apr 20 17:58:39.881942 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:39.881906 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tj2t6" event={"ID":"5b245b6c-efe4-4d57-bce9-c26cf49dab9f","Type":"ContainerStarted","Data":"dca1dea802c2550055b696299d19d24b14e20d5317b44eac83c210e8d6916fbc"} Apr 20 17:58:39.896473 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:39.896377 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tj2t6" podStartSLOduration=2.288946259 podStartE2EDuration="25.896363422s" podCreationTimestamp="2026-04-20 17:58:14 +0000 UTC" firstStartedPulling="2026-04-20 17:58:15.176841607 +0000 UTC m=+596.962196829" lastFinishedPulling="2026-04-20 17:58:38.784258775 +0000 UTC m=+620.569613992" observedRunningTime="2026-04-20 17:58:39.896224676 +0000 UTC m=+621.681579915" watchObservedRunningTime="2026-04-20 17:58:39.896363422 +0000 UTC m=+621.681718664" Apr 20 17:58:40.696745 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:58:40.696703 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcafce03-8ee3-495c-b3eb-147db98319b1" path="/var/lib/kubelet/pods/bcafce03-8ee3-495c-b3eb-147db98319b1/volumes" Apr 20 17:59:11.351196 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.351160 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-pxc64"] Apr 20 17:59:11.351709 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.351668 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bcafce03-8ee3-495c-b3eb-147db98319b1" containerName="console" Apr 20 17:59:11.351709 ip-10-0-135-49 
kubenswrapper[2577]: I0420 17:59:11.351684 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcafce03-8ee3-495c-b3eb-147db98319b1" containerName="console" Apr 20 17:59:11.351901 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.351881 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="bcafce03-8ee3-495c-b3eb-147db98319b1" containerName="console" Apr 20 17:59:11.422215 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.422184 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-pxc64"] Apr 20 17:59:11.422398 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.422288 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-pxc64" Apr 20 17:59:11.424636 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.424609 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-ts9p4\"" Apr 20 17:59:11.525020 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.524972 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-tcx26"] Apr 20 17:59:11.528520 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.528501 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-tcx26" Apr 20 17:59:11.535097 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.535063 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-tcx26"] Apr 20 17:59:11.535298 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.535278 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8hx2\" (UniqueName: \"kubernetes.io/projected/e0041ef7-9fba-46ad-8932-576b5b4c8eaf-kube-api-access-j8hx2\") pod \"authorino-f99f4b5cd-pxc64\" (UID: \"e0041ef7-9fba-46ad-8932-576b5b4c8eaf\") " pod="kuadrant-system/authorino-f99f4b5cd-pxc64" Apr 20 17:59:11.635806 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.635722 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6g5q\" (UniqueName: \"kubernetes.io/projected/d97f89fe-c28e-42fa-8da8-d388b44d0c39-kube-api-access-b6g5q\") pod \"authorino-7498df8756-tcx26\" (UID: \"d97f89fe-c28e-42fa-8da8-d388b44d0c39\") " pod="kuadrant-system/authorino-7498df8756-tcx26" Apr 20 17:59:11.635806 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.635771 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8hx2\" (UniqueName: \"kubernetes.io/projected/e0041ef7-9fba-46ad-8932-576b5b4c8eaf-kube-api-access-j8hx2\") pod \"authorino-f99f4b5cd-pxc64\" (UID: \"e0041ef7-9fba-46ad-8932-576b5b4c8eaf\") " pod="kuadrant-system/authorino-f99f4b5cd-pxc64" Apr 20 17:59:11.644493 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.644469 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8hx2\" (UniqueName: \"kubernetes.io/projected/e0041ef7-9fba-46ad-8932-576b5b4c8eaf-kube-api-access-j8hx2\") pod \"authorino-f99f4b5cd-pxc64\" (UID: \"e0041ef7-9fba-46ad-8932-576b5b4c8eaf\") " pod="kuadrant-system/authorino-f99f4b5cd-pxc64" Apr 20 17:59:11.732106 
ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.732080 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-pxc64" Apr 20 17:59:11.737169 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.737145 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6g5q\" (UniqueName: \"kubernetes.io/projected/d97f89fe-c28e-42fa-8da8-d388b44d0c39-kube-api-access-b6g5q\") pod \"authorino-7498df8756-tcx26\" (UID: \"d97f89fe-c28e-42fa-8da8-d388b44d0c39\") " pod="kuadrant-system/authorino-7498df8756-tcx26" Apr 20 17:59:11.745174 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.745145 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6g5q\" (UniqueName: \"kubernetes.io/projected/d97f89fe-c28e-42fa-8da8-d388b44d0c39-kube-api-access-b6g5q\") pod \"authorino-7498df8756-tcx26\" (UID: \"d97f89fe-c28e-42fa-8da8-d388b44d0c39\") " pod="kuadrant-system/authorino-7498df8756-tcx26" Apr 20 17:59:11.838410 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.838380 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-tcx26" Apr 20 17:59:11.851742 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.851607 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-pxc64"] Apr 20 17:59:11.854440 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:59:11.854399 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0041ef7_9fba_46ad_8932_576b5b4c8eaf.slice/crio-ce90d72470ca622b84cad912a45fdca1540fa6c3e247f66fb66e0f8b6667a8c1 WatchSource:0}: Error finding container ce90d72470ca622b84cad912a45fdca1540fa6c3e247f66fb66e0f8b6667a8c1: Status 404 returned error can't find the container with id ce90d72470ca622b84cad912a45fdca1540fa6c3e247f66fb66e0f8b6667a8c1 Apr 20 17:59:11.856011 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.855980 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 17:59:11.961850 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:11.961826 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-tcx26"] Apr 20 17:59:11.963889 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:59:11.963858 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd97f89fe_c28e_42fa_8da8_d388b44d0c39.slice/crio-0b5ef5821bc70c4c0aac7d09782c0003e7c4be8ff29474ae8415b7cdb05c0249 WatchSource:0}: Error finding container 0b5ef5821bc70c4c0aac7d09782c0003e7c4be8ff29474ae8415b7cdb05c0249: Status 404 returned error can't find the container with id 0b5ef5821bc70c4c0aac7d09782c0003e7c4be8ff29474ae8415b7cdb05c0249 Apr 20 17:59:12.014878 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:12.014848 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-tcx26" 
event={"ID":"d97f89fe-c28e-42fa-8da8-d388b44d0c39","Type":"ContainerStarted","Data":"0b5ef5821bc70c4c0aac7d09782c0003e7c4be8ff29474ae8415b7cdb05c0249"} Apr 20 17:59:12.015875 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:12.015852 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-pxc64" event={"ID":"e0041ef7-9fba-46ad-8932-576b5b4c8eaf","Type":"ContainerStarted","Data":"ce90d72470ca622b84cad912a45fdca1540fa6c3e247f66fb66e0f8b6667a8c1"} Apr 20 17:59:16.040085 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:16.040044 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-tcx26" event={"ID":"d97f89fe-c28e-42fa-8da8-d388b44d0c39","Type":"ContainerStarted","Data":"06d2cc869e6e37af02bdd56b8c6b1b39c39d63de570a2528a5d332623f9326c7"} Apr 20 17:59:16.041404 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:16.041376 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-pxc64" event={"ID":"e0041ef7-9fba-46ad-8932-576b5b4c8eaf","Type":"ContainerStarted","Data":"575362b51c7409131d0c3e76d8ee7a560e381d6309fcc8036a8ced6d10d31aba"} Apr 20 17:59:16.055820 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:16.055770 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-tcx26" podStartSLOduration=1.630507597 podStartE2EDuration="5.05575716s" podCreationTimestamp="2026-04-20 17:59:11 +0000 UTC" firstStartedPulling="2026-04-20 17:59:11.965180222 +0000 UTC m=+653.750535439" lastFinishedPulling="2026-04-20 17:59:15.390429781 +0000 UTC m=+657.175785002" observedRunningTime="2026-04-20 17:59:16.053126171 +0000 UTC m=+657.838481410" watchObservedRunningTime="2026-04-20 17:59:16.05575716 +0000 UTC m=+657.841112399" Apr 20 17:59:16.068701 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:16.068658 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/authorino-f99f4b5cd-pxc64" podStartSLOduration=1.545541068 podStartE2EDuration="5.068645595s" podCreationTimestamp="2026-04-20 17:59:11 +0000 UTC" firstStartedPulling="2026-04-20 17:59:11.856117012 +0000 UTC m=+653.641472234" lastFinishedPulling="2026-04-20 17:59:15.379221544 +0000 UTC m=+657.164576761" observedRunningTime="2026-04-20 17:59:16.06554779 +0000 UTC m=+657.850903030" watchObservedRunningTime="2026-04-20 17:59:16.068645595 +0000 UTC m=+657.854000833" Apr 20 17:59:16.094590 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:16.094562 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-pxc64"] Apr 20 17:59:18.049526 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:18.049471 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-pxc64" podUID="e0041ef7-9fba-46ad-8932-576b5b4c8eaf" containerName="authorino" containerID="cri-o://575362b51c7409131d0c3e76d8ee7a560e381d6309fcc8036a8ced6d10d31aba" gracePeriod=30 Apr 20 17:59:18.289677 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:18.289655 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-pxc64" Apr 20 17:59:18.406074 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:18.405966 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8hx2\" (UniqueName: \"kubernetes.io/projected/e0041ef7-9fba-46ad-8932-576b5b4c8eaf-kube-api-access-j8hx2\") pod \"e0041ef7-9fba-46ad-8932-576b5b4c8eaf\" (UID: \"e0041ef7-9fba-46ad-8932-576b5b4c8eaf\") " Apr 20 17:59:18.408006 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:18.407954 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0041ef7-9fba-46ad-8932-576b5b4c8eaf-kube-api-access-j8hx2" (OuterVolumeSpecName: "kube-api-access-j8hx2") pod "e0041ef7-9fba-46ad-8932-576b5b4c8eaf" (UID: "e0041ef7-9fba-46ad-8932-576b5b4c8eaf"). InnerVolumeSpecName "kube-api-access-j8hx2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:59:18.507480 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:18.507442 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j8hx2\" (UniqueName: \"kubernetes.io/projected/e0041ef7-9fba-46ad-8932-576b5b4c8eaf-kube-api-access-j8hx2\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:59:19.053866 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:19.053832 2577 generic.go:358] "Generic (PLEG): container finished" podID="e0041ef7-9fba-46ad-8932-576b5b4c8eaf" containerID="575362b51c7409131d0c3e76d8ee7a560e381d6309fcc8036a8ced6d10d31aba" exitCode=0 Apr 20 17:59:19.054348 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:19.053888 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-pxc64" event={"ID":"e0041ef7-9fba-46ad-8932-576b5b4c8eaf","Type":"ContainerDied","Data":"575362b51c7409131d0c3e76d8ee7a560e381d6309fcc8036a8ced6d10d31aba"} Apr 20 17:59:19.054348 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:19.053902 2577 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-pxc64" Apr 20 17:59:19.054348 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:19.053915 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-pxc64" event={"ID":"e0041ef7-9fba-46ad-8932-576b5b4c8eaf","Type":"ContainerDied","Data":"ce90d72470ca622b84cad912a45fdca1540fa6c3e247f66fb66e0f8b6667a8c1"} Apr 20 17:59:19.054348 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:19.053930 2577 scope.go:117] "RemoveContainer" containerID="575362b51c7409131d0c3e76d8ee7a560e381d6309fcc8036a8ced6d10d31aba" Apr 20 17:59:19.062456 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:19.062445 2577 scope.go:117] "RemoveContainer" containerID="575362b51c7409131d0c3e76d8ee7a560e381d6309fcc8036a8ced6d10d31aba" Apr 20 17:59:19.062711 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:59:19.062690 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"575362b51c7409131d0c3e76d8ee7a560e381d6309fcc8036a8ced6d10d31aba\": container with ID starting with 575362b51c7409131d0c3e76d8ee7a560e381d6309fcc8036a8ced6d10d31aba not found: ID does not exist" containerID="575362b51c7409131d0c3e76d8ee7a560e381d6309fcc8036a8ced6d10d31aba" Apr 20 17:59:19.062762 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:19.062720 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"575362b51c7409131d0c3e76d8ee7a560e381d6309fcc8036a8ced6d10d31aba"} err="failed to get container status \"575362b51c7409131d0c3e76d8ee7a560e381d6309fcc8036a8ced6d10d31aba\": rpc error: code = NotFound desc = could not find container \"575362b51c7409131d0c3e76d8ee7a560e381d6309fcc8036a8ced6d10d31aba\": container with ID starting with 575362b51c7409131d0c3e76d8ee7a560e381d6309fcc8036a8ced6d10d31aba not found: ID does not exist" Apr 20 17:59:19.069708 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:19.069686 2577 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-pxc64"] Apr 20 17:59:19.073102 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:19.073081 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-pxc64"] Apr 20 17:59:20.696264 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:20.696228 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0041ef7-9fba-46ad-8932-576b5b4c8eaf" path="/var/lib/kubelet/pods/e0041ef7-9fba-46ad-8932-576b5b4c8eaf/volumes" Apr 20 17:59:39.327405 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.327366 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-tpd9s"] Apr 20 17:59:39.327886 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.327725 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0041ef7-9fba-46ad-8932-576b5b4c8eaf" containerName="authorino" Apr 20 17:59:39.327886 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.327736 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0041ef7-9fba-46ad-8932-576b5b4c8eaf" containerName="authorino" Apr 20 17:59:39.327886 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.327814 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0041ef7-9fba-46ad-8932-576b5b4c8eaf" containerName="authorino" Apr 20 17:59:39.330247 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.330232 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-tpd9s" Apr 20 17:59:39.339180 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.339151 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-tpd9s"] Apr 20 17:59:39.380599 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.380566 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4vq7\" (UniqueName: \"kubernetes.io/projected/eb1edc88-88bb-46b8-879f-87a591bd9503-kube-api-access-f4vq7\") pod \"authorino-8b475cf9f-tpd9s\" (UID: \"eb1edc88-88bb-46b8-879f-87a591bd9503\") " pod="kuadrant-system/authorino-8b475cf9f-tpd9s" Apr 20 17:59:39.481654 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.481624 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4vq7\" (UniqueName: \"kubernetes.io/projected/eb1edc88-88bb-46b8-879f-87a591bd9503-kube-api-access-f4vq7\") pod \"authorino-8b475cf9f-tpd9s\" (UID: \"eb1edc88-88bb-46b8-879f-87a591bd9503\") " pod="kuadrant-system/authorino-8b475cf9f-tpd9s" Apr 20 17:59:39.489303 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.489272 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4vq7\" (UniqueName: \"kubernetes.io/projected/eb1edc88-88bb-46b8-879f-87a591bd9503-kube-api-access-f4vq7\") pod \"authorino-8b475cf9f-tpd9s\" (UID: \"eb1edc88-88bb-46b8-879f-87a591bd9503\") " pod="kuadrant-system/authorino-8b475cf9f-tpd9s" Apr 20 17:59:39.557692 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.557655 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-tpd9s"] Apr 20 17:59:39.557925 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.557911 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-tpd9s" Apr 20 17:59:39.580854 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.580777 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-6549bb64c6-njqf8"] Apr 20 17:59:39.584661 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.584636 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6549bb64c6-njqf8" Apr 20 17:59:39.592179 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.592145 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6549bb64c6-njqf8"] Apr 20 17:59:39.681919 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.681886 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-tpd9s"] Apr 20 17:59:39.683403 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.683382 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9qgv\" (UniqueName: \"kubernetes.io/projected/2d53f2c4-e0b2-478c-9a7b-968ccf2d6bed-kube-api-access-c9qgv\") pod \"authorino-6549bb64c6-njqf8\" (UID: \"2d53f2c4-e0b2-478c-9a7b-968ccf2d6bed\") " pod="kuadrant-system/authorino-6549bb64c6-njqf8" Apr 20 17:59:39.684926 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:59:39.684889 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb1edc88_88bb_46b8_879f_87a591bd9503.slice/crio-6a0c1c403b6c636677cfdb553a56282f853bb58b384b53760865541169691be9 WatchSource:0}: Error finding container 6a0c1c403b6c636677cfdb553a56282f853bb58b384b53760865541169691be9: Status 404 returned error can't find the container with id 6a0c1c403b6c636677cfdb553a56282f853bb58b384b53760865541169691be9 Apr 20 17:59:39.701244 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.701220 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kuadrant-system/authorino-6549bb64c6-njqf8"] Apr 20 17:59:39.701465 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:59:39.701445 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-c9qgv], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-6549bb64c6-njqf8" podUID="2d53f2c4-e0b2-478c-9a7b-968ccf2d6bed" Apr 20 17:59:39.730620 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.730589 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-cbb499d4d-wwhmk"] Apr 20 17:59:39.733709 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.733689 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-cbb499d4d-wwhmk" Apr 20 17:59:39.735970 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.735950 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 20 17:59:39.739941 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.739920 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-cbb499d4d-wwhmk"] Apr 20 17:59:39.784200 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.784166 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbfjh\" (UniqueName: \"kubernetes.io/projected/22a83f7a-9bad-4b2c-bd25-ad0994772b41-kube-api-access-cbfjh\") pod \"authorino-cbb499d4d-wwhmk\" (UID: \"22a83f7a-9bad-4b2c-bd25-ad0994772b41\") " pod="kuadrant-system/authorino-cbb499d4d-wwhmk" Apr 20 17:59:39.784374 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.784222 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9qgv\" (UniqueName: \"kubernetes.io/projected/2d53f2c4-e0b2-478c-9a7b-968ccf2d6bed-kube-api-access-c9qgv\") pod \"authorino-6549bb64c6-njqf8\" (UID: \"2d53f2c4-e0b2-478c-9a7b-968ccf2d6bed\") " 
pod="kuadrant-system/authorino-6549bb64c6-njqf8" Apr 20 17:59:39.784374 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.784299 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/22a83f7a-9bad-4b2c-bd25-ad0994772b41-tls-cert\") pod \"authorino-cbb499d4d-wwhmk\" (UID: \"22a83f7a-9bad-4b2c-bd25-ad0994772b41\") " pod="kuadrant-system/authorino-cbb499d4d-wwhmk" Apr 20 17:59:39.792659 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.792630 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9qgv\" (UniqueName: \"kubernetes.io/projected/2d53f2c4-e0b2-478c-9a7b-968ccf2d6bed-kube-api-access-c9qgv\") pod \"authorino-6549bb64c6-njqf8\" (UID: \"2d53f2c4-e0b2-478c-9a7b-968ccf2d6bed\") " pod="kuadrant-system/authorino-6549bb64c6-njqf8" Apr 20 17:59:39.884890 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.884803 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbfjh\" (UniqueName: \"kubernetes.io/projected/22a83f7a-9bad-4b2c-bd25-ad0994772b41-kube-api-access-cbfjh\") pod \"authorino-cbb499d4d-wwhmk\" (UID: \"22a83f7a-9bad-4b2c-bd25-ad0994772b41\") " pod="kuadrant-system/authorino-cbb499d4d-wwhmk" Apr 20 17:59:39.884890 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.884865 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/22a83f7a-9bad-4b2c-bd25-ad0994772b41-tls-cert\") pod \"authorino-cbb499d4d-wwhmk\" (UID: \"22a83f7a-9bad-4b2c-bd25-ad0994772b41\") " pod="kuadrant-system/authorino-cbb499d4d-wwhmk" Apr 20 17:59:39.887215 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.887187 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/22a83f7a-9bad-4b2c-bd25-ad0994772b41-tls-cert\") pod \"authorino-cbb499d4d-wwhmk\" (UID: 
\"22a83f7a-9bad-4b2c-bd25-ad0994772b41\") " pod="kuadrant-system/authorino-cbb499d4d-wwhmk" Apr 20 17:59:39.904679 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:39.904657 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbfjh\" (UniqueName: \"kubernetes.io/projected/22a83f7a-9bad-4b2c-bd25-ad0994772b41-kube-api-access-cbfjh\") pod \"authorino-cbb499d4d-wwhmk\" (UID: \"22a83f7a-9bad-4b2c-bd25-ad0994772b41\") " pod="kuadrant-system/authorino-cbb499d4d-wwhmk" Apr 20 17:59:40.044623 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:40.044595 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-cbb499d4d-wwhmk" Apr 20 17:59:40.138169 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:40.138079 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6549bb64c6-njqf8" Apr 20 17:59:40.138309 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:40.138135 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-tpd9s" podUID="eb1edc88-88bb-46b8-879f-87a591bd9503" containerName="authorino" containerID="cri-o://4721a0512bdabd95da16451374fc83b71eb1ec085f9baab553d88ce3e7d75992" gracePeriod=30 Apr 20 17:59:40.138613 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:40.138057 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-tpd9s" event={"ID":"eb1edc88-88bb-46b8-879f-87a591bd9503","Type":"ContainerStarted","Data":"4721a0512bdabd95da16451374fc83b71eb1ec085f9baab553d88ce3e7d75992"} Apr 20 17:59:40.138613 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:40.138447 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-tpd9s" event={"ID":"eb1edc88-88bb-46b8-879f-87a591bd9503","Type":"ContainerStarted","Data":"6a0c1c403b6c636677cfdb553a56282f853bb58b384b53760865541169691be9"} Apr 20 17:59:40.143381 
ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:40.143361 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6549bb64c6-njqf8" Apr 20 17:59:40.154656 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:40.154601 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-tpd9s" podStartSLOduration=0.807036376 podStartE2EDuration="1.154587337s" podCreationTimestamp="2026-04-20 17:59:39 +0000 UTC" firstStartedPulling="2026-04-20 17:59:39.686240134 +0000 UTC m=+681.471595351" lastFinishedPulling="2026-04-20 17:59:40.03379109 +0000 UTC m=+681.819146312" observedRunningTime="2026-04-20 17:59:40.153060055 +0000 UTC m=+681.938415294" watchObservedRunningTime="2026-04-20 17:59:40.154587337 +0000 UTC m=+681.939942575" Apr 20 17:59:40.187660 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:40.187630 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9qgv\" (UniqueName: \"kubernetes.io/projected/2d53f2c4-e0b2-478c-9a7b-968ccf2d6bed-kube-api-access-c9qgv\") pod \"2d53f2c4-e0b2-478c-9a7b-968ccf2d6bed\" (UID: \"2d53f2c4-e0b2-478c-9a7b-968ccf2d6bed\") " Apr 20 17:59:40.189864 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:40.189809 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d53f2c4-e0b2-478c-9a7b-968ccf2d6bed-kube-api-access-c9qgv" (OuterVolumeSpecName: "kube-api-access-c9qgv") pod "2d53f2c4-e0b2-478c-9a7b-968ccf2d6bed" (UID: "2d53f2c4-e0b2-478c-9a7b-968ccf2d6bed"). InnerVolumeSpecName "kube-api-access-c9qgv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:59:40.190334 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:40.190313 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-cbb499d4d-wwhmk"] Apr 20 17:59:40.213812 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:59:40.213789 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22a83f7a_9bad_4b2c_bd25_ad0994772b41.slice/crio-71eac7503aa547305a9f0df504ffbbfb1648ae623dcf6a519359f22943e0a53f WatchSource:0}: Error finding container 71eac7503aa547305a9f0df504ffbbfb1648ae623dcf6a519359f22943e0a53f: Status 404 returned error can't find the container with id 71eac7503aa547305a9f0df504ffbbfb1648ae623dcf6a519359f22943e0a53f Apr 20 17:59:40.288697 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:40.288664 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c9qgv\" (UniqueName: \"kubernetes.io/projected/2d53f2c4-e0b2-478c-9a7b-968ccf2d6bed-kube-api-access-c9qgv\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:59:40.364835 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:40.364812 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-tpd9s" Apr 20 17:59:40.389427 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:40.389359 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4vq7\" (UniqueName: \"kubernetes.io/projected/eb1edc88-88bb-46b8-879f-87a591bd9503-kube-api-access-f4vq7\") pod \"eb1edc88-88bb-46b8-879f-87a591bd9503\" (UID: \"eb1edc88-88bb-46b8-879f-87a591bd9503\") " Apr 20 17:59:40.391337 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:40.391314 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb1edc88-88bb-46b8-879f-87a591bd9503-kube-api-access-f4vq7" (OuterVolumeSpecName: "kube-api-access-f4vq7") pod "eb1edc88-88bb-46b8-879f-87a591bd9503" (UID: "eb1edc88-88bb-46b8-879f-87a591bd9503"). InnerVolumeSpecName "kube-api-access-f4vq7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:59:40.490680 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:40.490651 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f4vq7\" (UniqueName: \"kubernetes.io/projected/eb1edc88-88bb-46b8-879f-87a591bd9503-kube-api-access-f4vq7\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:59:41.146682 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.146638 2577 generic.go:358] "Generic (PLEG): container finished" podID="eb1edc88-88bb-46b8-879f-87a591bd9503" containerID="4721a0512bdabd95da16451374fc83b71eb1ec085f9baab553d88ce3e7d75992" exitCode=0 Apr 20 17:59:41.146873 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.146717 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-tpd9s" Apr 20 17:59:41.146873 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.146722 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-tpd9s" event={"ID":"eb1edc88-88bb-46b8-879f-87a591bd9503","Type":"ContainerDied","Data":"4721a0512bdabd95da16451374fc83b71eb1ec085f9baab553d88ce3e7d75992"} Apr 20 17:59:41.146873 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.146830 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-tpd9s" event={"ID":"eb1edc88-88bb-46b8-879f-87a591bd9503","Type":"ContainerDied","Data":"6a0c1c403b6c636677cfdb553a56282f853bb58b384b53760865541169691be9"} Apr 20 17:59:41.146873 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.146850 2577 scope.go:117] "RemoveContainer" containerID="4721a0512bdabd95da16451374fc83b71eb1ec085f9baab553d88ce3e7d75992" Apr 20 17:59:41.148216 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.148136 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-cbb499d4d-wwhmk" event={"ID":"22a83f7a-9bad-4b2c-bd25-ad0994772b41","Type":"ContainerStarted","Data":"6ff60c61137ff870ea1a497d7a76bb409ea395170f4f8f187b4ae269f839c5ad"} Apr 20 17:59:41.148216 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.148177 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-cbb499d4d-wwhmk" event={"ID":"22a83f7a-9bad-4b2c-bd25-ad0994772b41","Type":"ContainerStarted","Data":"71eac7503aa547305a9f0df504ffbbfb1648ae623dcf6a519359f22943e0a53f"} Apr 20 17:59:41.148216 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.148150 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6549bb64c6-njqf8" Apr 20 17:59:41.155687 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.155669 2577 scope.go:117] "RemoveContainer" containerID="4721a0512bdabd95da16451374fc83b71eb1ec085f9baab553d88ce3e7d75992" Apr 20 17:59:41.155941 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:59:41.155924 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4721a0512bdabd95da16451374fc83b71eb1ec085f9baab553d88ce3e7d75992\": container with ID starting with 4721a0512bdabd95da16451374fc83b71eb1ec085f9baab553d88ce3e7d75992 not found: ID does not exist" containerID="4721a0512bdabd95da16451374fc83b71eb1ec085f9baab553d88ce3e7d75992" Apr 20 17:59:41.156004 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.155951 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4721a0512bdabd95da16451374fc83b71eb1ec085f9baab553d88ce3e7d75992"} err="failed to get container status \"4721a0512bdabd95da16451374fc83b71eb1ec085f9baab553d88ce3e7d75992\": rpc error: code = NotFound desc = could not find container \"4721a0512bdabd95da16451374fc83b71eb1ec085f9baab553d88ce3e7d75992\": container with ID starting with 4721a0512bdabd95da16451374fc83b71eb1ec085f9baab553d88ce3e7d75992 not found: ID does not exist" Apr 20 17:59:41.163464 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.163444 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-tpd9s"] Apr 20 17:59:41.166898 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.166877 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-tpd9s"] Apr 20 17:59:41.185245 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.185215 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6549bb64c6-njqf8"] Apr 20 17:59:41.187449 ip-10-0-135-49 kubenswrapper[2577]: I0420 
17:59:41.187430 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-6549bb64c6-njqf8"] Apr 20 17:59:41.204180 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.204130 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-cbb499d4d-wwhmk" podStartSLOduration=1.83009781 podStartE2EDuration="2.204115995s" podCreationTimestamp="2026-04-20 17:59:39 +0000 UTC" firstStartedPulling="2026-04-20 17:59:40.242952975 +0000 UTC m=+682.028308192" lastFinishedPulling="2026-04-20 17:59:40.616971144 +0000 UTC m=+682.402326377" observedRunningTime="2026-04-20 17:59:41.201686744 +0000 UTC m=+682.987041982" watchObservedRunningTime="2026-04-20 17:59:41.204115995 +0000 UTC m=+682.989471233" Apr 20 17:59:41.228914 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.228884 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-tcx26"] Apr 20 17:59:41.229175 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.229145 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-tcx26" podUID="d97f89fe-c28e-42fa-8da8-d388b44d0c39" containerName="authorino" containerID="cri-o://06d2cc869e6e37af02bdd56b8c6b1b39c39d63de570a2528a5d332623f9326c7" gracePeriod=30 Apr 20 17:59:41.465489 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.465464 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-tcx26" Apr 20 17:59:41.498697 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.498663 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6g5q\" (UniqueName: \"kubernetes.io/projected/d97f89fe-c28e-42fa-8da8-d388b44d0c39-kube-api-access-b6g5q\") pod \"d97f89fe-c28e-42fa-8da8-d388b44d0c39\" (UID: \"d97f89fe-c28e-42fa-8da8-d388b44d0c39\") " Apr 20 17:59:41.500648 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.500619 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d97f89fe-c28e-42fa-8da8-d388b44d0c39-kube-api-access-b6g5q" (OuterVolumeSpecName: "kube-api-access-b6g5q") pod "d97f89fe-c28e-42fa-8da8-d388b44d0c39" (UID: "d97f89fe-c28e-42fa-8da8-d388b44d0c39"). InnerVolumeSpecName "kube-api-access-b6g5q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:59:41.599491 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.599445 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b6g5q\" (UniqueName: \"kubernetes.io/projected/d97f89fe-c28e-42fa-8da8-d388b44d0c39-kube-api-access-b6g5q\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:59:41.999437 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.999402 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-lg28c"] Apr 20 17:59:41.999766 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.999754 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d97f89fe-c28e-42fa-8da8-d388b44d0c39" containerName="authorino" Apr 20 17:59:41.999824 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.999768 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97f89fe-c28e-42fa-8da8-d388b44d0c39" containerName="authorino" Apr 20 17:59:41.999824 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.999781 2577 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="eb1edc88-88bb-46b8-879f-87a591bd9503" containerName="authorino" Apr 20 17:59:41.999824 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.999787 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb1edc88-88bb-46b8-879f-87a591bd9503" containerName="authorino" Apr 20 17:59:41.999933 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.999847 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d97f89fe-c28e-42fa-8da8-d388b44d0c39" containerName="authorino" Apr 20 17:59:41.999933 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:41.999856 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="eb1edc88-88bb-46b8-879f-87a591bd9503" containerName="authorino" Apr 20 17:59:42.003079 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.003058 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-lg28c" Apr 20 17:59:42.005509 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.005485 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-c858h\"" Apr 20 17:59:42.013740 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.013712 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-lg28c"] Apr 20 17:59:42.104726 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.104693 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbdbz\" (UniqueName: \"kubernetes.io/projected/2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022-kube-api-access-xbdbz\") pod \"maas-controller-6d4c8f55f9-lg28c\" (UID: \"2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022\") " pod="opendatahub/maas-controller-6d4c8f55f9-lg28c" Apr 20 17:59:42.154133 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.154099 2577 generic.go:358] "Generic (PLEG): container finished" 
podID="d97f89fe-c28e-42fa-8da8-d388b44d0c39" containerID="06d2cc869e6e37af02bdd56b8c6b1b39c39d63de570a2528a5d332623f9326c7" exitCode=0 Apr 20 17:59:42.154297 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.154145 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-tcx26" Apr 20 17:59:42.154297 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.154169 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-tcx26" event={"ID":"d97f89fe-c28e-42fa-8da8-d388b44d0c39","Type":"ContainerDied","Data":"06d2cc869e6e37af02bdd56b8c6b1b39c39d63de570a2528a5d332623f9326c7"} Apr 20 17:59:42.154297 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.154202 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-tcx26" event={"ID":"d97f89fe-c28e-42fa-8da8-d388b44d0c39","Type":"ContainerDied","Data":"0b5ef5821bc70c4c0aac7d09782c0003e7c4be8ff29474ae8415b7cdb05c0249"} Apr 20 17:59:42.154297 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.154220 2577 scope.go:117] "RemoveContainer" containerID="06d2cc869e6e37af02bdd56b8c6b1b39c39d63de570a2528a5d332623f9326c7" Apr 20 17:59:42.163580 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.163563 2577 scope.go:117] "RemoveContainer" containerID="06d2cc869e6e37af02bdd56b8c6b1b39c39d63de570a2528a5d332623f9326c7" Apr 20 17:59:42.163840 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:59:42.163822 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06d2cc869e6e37af02bdd56b8c6b1b39c39d63de570a2528a5d332623f9326c7\": container with ID starting with 06d2cc869e6e37af02bdd56b8c6b1b39c39d63de570a2528a5d332623f9326c7 not found: ID does not exist" containerID="06d2cc869e6e37af02bdd56b8c6b1b39c39d63de570a2528a5d332623f9326c7" Apr 20 17:59:42.163885 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.163850 2577 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06d2cc869e6e37af02bdd56b8c6b1b39c39d63de570a2528a5d332623f9326c7"} err="failed to get container status \"06d2cc869e6e37af02bdd56b8c6b1b39c39d63de570a2528a5d332623f9326c7\": rpc error: code = NotFound desc = could not find container \"06d2cc869e6e37af02bdd56b8c6b1b39c39d63de570a2528a5d332623f9326c7\": container with ID starting with 06d2cc869e6e37af02bdd56b8c6b1b39c39d63de570a2528a5d332623f9326c7 not found: ID does not exist" Apr 20 17:59:42.190597 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.190568 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-tcx26"] Apr 20 17:59:42.193998 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.193967 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-tcx26"] Apr 20 17:59:42.205972 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.205943 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbdbz\" (UniqueName: \"kubernetes.io/projected/2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022-kube-api-access-xbdbz\") pod \"maas-controller-6d4c8f55f9-lg28c\" (UID: \"2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022\") " pod="opendatahub/maas-controller-6d4c8f55f9-lg28c" Apr 20 17:59:42.214002 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.213968 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbdbz\" (UniqueName: \"kubernetes.io/projected/2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022-kube-api-access-xbdbz\") pod \"maas-controller-6d4c8f55f9-lg28c\" (UID: \"2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022\") " pod="opendatahub/maas-controller-6d4c8f55f9-lg28c" Apr 20 17:59:42.283809 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.283724 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-lg28c"] Apr 20 17:59:42.284033 ip-10-0-135-49 kubenswrapper[2577]: 
I0420 17:59:42.284021 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-lg28c" Apr 20 17:59:42.409381 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.409355 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-lg28c"] Apr 20 17:59:42.411336 ip-10-0-135-49 kubenswrapper[2577]: W0420 17:59:42.411301 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cbf3dcb_52ce_4bcc_bbbd_4d9cf0c12022.slice/crio-ba253b4915118c3b6f33f23086d4ab5406de579c76efec0ebd55c01a9e7ed546 WatchSource:0}: Error finding container ba253b4915118c3b6f33f23086d4ab5406de579c76efec0ebd55c01a9e7ed546: Status 404 returned error can't find the container with id ba253b4915118c3b6f33f23086d4ab5406de579c76efec0ebd55c01a9e7ed546 Apr 20 17:59:42.696169 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.696087 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d53f2c4-e0b2-478c-9a7b-968ccf2d6bed" path="/var/lib/kubelet/pods/2d53f2c4-e0b2-478c-9a7b-968ccf2d6bed/volumes" Apr 20 17:59:42.696513 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.696322 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d97f89fe-c28e-42fa-8da8-d388b44d0c39" path="/var/lib/kubelet/pods/d97f89fe-c28e-42fa-8da8-d388b44d0c39/volumes" Apr 20 17:59:42.696601 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:42.696588 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb1edc88-88bb-46b8-879f-87a591bd9503" path="/var/lib/kubelet/pods/eb1edc88-88bb-46b8-879f-87a591bd9503/volumes" Apr 20 17:59:43.159801 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:43.159743 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-lg28c" 
event={"ID":"2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022","Type":"ContainerStarted","Data":"ba253b4915118c3b6f33f23086d4ab5406de579c76efec0ebd55c01a9e7ed546"} Apr 20 17:59:45.172601 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:45.172560 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-lg28c" event={"ID":"2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022","Type":"ContainerStarted","Data":"f171c1066366b2e337cb9fa12012a364dc65ee2882a4e43963b50411bdf7af19"} Apr 20 17:59:45.173030 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:45.172660 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-lg28c" podUID="2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022" containerName="manager" containerID="cri-o://f171c1066366b2e337cb9fa12012a364dc65ee2882a4e43963b50411bdf7af19" gracePeriod=10 Apr 20 17:59:45.173030 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:45.172782 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-lg28c" Apr 20 17:59:45.192837 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:45.192789 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-lg28c" podStartSLOduration=1.979630077 podStartE2EDuration="4.192775149s" podCreationTimestamp="2026-04-20 17:59:41 +0000 UTC" firstStartedPulling="2026-04-20 17:59:42.412672216 +0000 UTC m=+684.198027433" lastFinishedPulling="2026-04-20 17:59:44.625817288 +0000 UTC m=+686.411172505" observedRunningTime="2026-04-20 17:59:45.190076917 +0000 UTC m=+686.975432157" watchObservedRunningTime="2026-04-20 17:59:45.192775149 +0000 UTC m=+686.978130435" Apr 20 17:59:45.414391 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:45.414366 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-lg28c" Apr 20 17:59:45.532882 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:45.532849 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbdbz\" (UniqueName: \"kubernetes.io/projected/2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022-kube-api-access-xbdbz\") pod \"2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022\" (UID: \"2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022\") " Apr 20 17:59:45.535050 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:45.535024 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022-kube-api-access-xbdbz" (OuterVolumeSpecName: "kube-api-access-xbdbz") pod "2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022" (UID: "2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022"). InnerVolumeSpecName "kube-api-access-xbdbz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:59:45.634374 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:45.634340 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xbdbz\" (UniqueName: \"kubernetes.io/projected/2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022-kube-api-access-xbdbz\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 17:59:46.177786 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:46.177748 2577 generic.go:358] "Generic (PLEG): container finished" podID="2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022" containerID="f171c1066366b2e337cb9fa12012a364dc65ee2882a4e43963b50411bdf7af19" exitCode=0 Apr 20 17:59:46.178283 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:46.177823 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-lg28c" Apr 20 17:59:46.178283 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:46.177827 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-lg28c" event={"ID":"2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022","Type":"ContainerDied","Data":"f171c1066366b2e337cb9fa12012a364dc65ee2882a4e43963b50411bdf7af19"} Apr 20 17:59:46.178283 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:46.177865 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-lg28c" event={"ID":"2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022","Type":"ContainerDied","Data":"ba253b4915118c3b6f33f23086d4ab5406de579c76efec0ebd55c01a9e7ed546"} Apr 20 17:59:46.178283 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:46.177884 2577 scope.go:117] "RemoveContainer" containerID="f171c1066366b2e337cb9fa12012a364dc65ee2882a4e43963b50411bdf7af19" Apr 20 17:59:46.187702 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:46.187679 2577 scope.go:117] "RemoveContainer" containerID="f171c1066366b2e337cb9fa12012a364dc65ee2882a4e43963b50411bdf7af19" Apr 20 17:59:46.187957 ip-10-0-135-49 kubenswrapper[2577]: E0420 17:59:46.187937 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f171c1066366b2e337cb9fa12012a364dc65ee2882a4e43963b50411bdf7af19\": container with ID starting with f171c1066366b2e337cb9fa12012a364dc65ee2882a4e43963b50411bdf7af19 not found: ID does not exist" containerID="f171c1066366b2e337cb9fa12012a364dc65ee2882a4e43963b50411bdf7af19" Apr 20 17:59:46.188066 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:46.187969 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f171c1066366b2e337cb9fa12012a364dc65ee2882a4e43963b50411bdf7af19"} err="failed to get container status \"f171c1066366b2e337cb9fa12012a364dc65ee2882a4e43963b50411bdf7af19\": rpc error: code = 
NotFound desc = could not find container \"f171c1066366b2e337cb9fa12012a364dc65ee2882a4e43963b50411bdf7af19\": container with ID starting with f171c1066366b2e337cb9fa12012a364dc65ee2882a4e43963b50411bdf7af19 not found: ID does not exist" Apr 20 17:59:46.199193 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:46.199169 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-lg28c"] Apr 20 17:59:46.202622 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:46.202590 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-lg28c"] Apr 20 17:59:46.695943 ip-10-0-135-49 kubenswrapper[2577]: I0420 17:59:46.695910 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022" path="/var/lib/kubelet/pods/2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022/volumes" Apr 20 18:00:27.651854 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:27.651821 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7fff6f9b47-n4vsv"] Apr 20 18:00:27.652357 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:27.652189 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022" containerName="manager" Apr 20 18:00:27.652357 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:27.652201 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022" containerName="manager" Apr 20 18:00:27.652357 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:27.652261 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2cbf3dcb-52ce-4bcc-bbbd-4d9cf0c12022" containerName="manager" Apr 20 18:00:27.658708 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:27.658691 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7fff6f9b47-n4vsv" Apr 20 18:00:27.661306 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:27.661283 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 18:00:27.662386 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:27.662252 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-xj288\"" Apr 20 18:00:27.662504 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:27.662477 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 20 18:00:27.666187 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:27.666162 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7fff6f9b47-n4vsv"] Apr 20 18:00:27.709912 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:27.709883 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9knv9\" (UniqueName: \"kubernetes.io/projected/626ec792-5a53-45a3-9054-448c6f918495-kube-api-access-9knv9\") pod \"maas-api-7fff6f9b47-n4vsv\" (UID: \"626ec792-5a53-45a3-9054-448c6f918495\") " pod="opendatahub/maas-api-7fff6f9b47-n4vsv" Apr 20 18:00:27.710072 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:27.709934 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/626ec792-5a53-45a3-9054-448c6f918495-maas-api-tls\") pod \"maas-api-7fff6f9b47-n4vsv\" (UID: \"626ec792-5a53-45a3-9054-448c6f918495\") " pod="opendatahub/maas-api-7fff6f9b47-n4vsv" Apr 20 18:00:27.810492 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:27.810460 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/626ec792-5a53-45a3-9054-448c6f918495-maas-api-tls\") pod 
\"maas-api-7fff6f9b47-n4vsv\" (UID: \"626ec792-5a53-45a3-9054-448c6f918495\") " pod="opendatahub/maas-api-7fff6f9b47-n4vsv" Apr 20 18:00:27.810642 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:27.810546 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9knv9\" (UniqueName: \"kubernetes.io/projected/626ec792-5a53-45a3-9054-448c6f918495-kube-api-access-9knv9\") pod \"maas-api-7fff6f9b47-n4vsv\" (UID: \"626ec792-5a53-45a3-9054-448c6f918495\") " pod="opendatahub/maas-api-7fff6f9b47-n4vsv" Apr 20 18:00:27.812863 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:27.812839 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/626ec792-5a53-45a3-9054-448c6f918495-maas-api-tls\") pod \"maas-api-7fff6f9b47-n4vsv\" (UID: \"626ec792-5a53-45a3-9054-448c6f918495\") " pod="opendatahub/maas-api-7fff6f9b47-n4vsv" Apr 20 18:00:27.819746 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:27.819723 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9knv9\" (UniqueName: \"kubernetes.io/projected/626ec792-5a53-45a3-9054-448c6f918495-kube-api-access-9knv9\") pod \"maas-api-7fff6f9b47-n4vsv\" (UID: \"626ec792-5a53-45a3-9054-448c6f918495\") " pod="opendatahub/maas-api-7fff6f9b47-n4vsv" Apr 20 18:00:27.970740 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:27.970711 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7fff6f9b47-n4vsv" Apr 20 18:00:28.092822 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:28.092798 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7fff6f9b47-n4vsv"] Apr 20 18:00:28.095047 ip-10-0-135-49 kubenswrapper[2577]: W0420 18:00:28.095020 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod626ec792_5a53_45a3_9054_448c6f918495.slice/crio-3133c22234238898e45a807f04b59ce985d7bc5ea426fcaa2987ef333d200e4a WatchSource:0}: Error finding container 3133c22234238898e45a807f04b59ce985d7bc5ea426fcaa2987ef333d200e4a: Status 404 returned error can't find the container with id 3133c22234238898e45a807f04b59ce985d7bc5ea426fcaa2987ef333d200e4a Apr 20 18:00:28.350806 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:28.350721 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7fff6f9b47-n4vsv" event={"ID":"626ec792-5a53-45a3-9054-448c6f918495","Type":"ContainerStarted","Data":"3133c22234238898e45a807f04b59ce985d7bc5ea426fcaa2987ef333d200e4a"} Apr 20 18:00:30.359936 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:30.359895 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7fff6f9b47-n4vsv" event={"ID":"626ec792-5a53-45a3-9054-448c6f918495","Type":"ContainerStarted","Data":"c7578d82e6157b1456c4a734c35b98b2a2d14fc40bf2eb47686b36c0d2bcc324"} Apr 20 18:00:30.360325 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:30.359999 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-7fff6f9b47-n4vsv" Apr 20 18:00:30.378746 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:30.378698 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-7fff6f9b47-n4vsv" podStartSLOduration=1.683298876 podStartE2EDuration="3.378682769s" podCreationTimestamp="2026-04-20 18:00:27 +0000 UTC" 
firstStartedPulling="2026-04-20 18:00:28.096836002 +0000 UTC m=+729.882191236" lastFinishedPulling="2026-04-20 18:00:29.792219897 +0000 UTC m=+731.577575129" observedRunningTime="2026-04-20 18:00:30.375301547 +0000 UTC m=+732.160656788" watchObservedRunningTime="2026-04-20 18:00:30.378682769 +0000 UTC m=+732.164037986" Apr 20 18:00:36.369829 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:36.369796 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7fff6f9b47-n4vsv" Apr 20 18:00:38.422790 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.422756 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh"] Apr 20 18:00:38.426244 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.426227 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:38.429784 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.429758 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 18:00:38.429902 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.429873 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 20 18:00:38.429950 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.429910 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-5v8cn\"" Apr 20 18:00:38.430019 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.429946 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 18:00:38.439660 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.439634 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh"] Apr 20 18:00:38.509824 ip-10-0-135-49 kubenswrapper[2577]: I0420 
18:00:38.509789 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9d7f519-6ca5-4425-98e2-7de6f0fc0f50-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh\" (UID: \"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:38.510014 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.509843 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9d7f519-6ca5-4425-98e2-7de6f0fc0f50-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh\" (UID: \"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:38.510014 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.509868 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9d7f519-6ca5-4425-98e2-7de6f0fc0f50-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh\" (UID: \"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:38.510014 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.509898 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9d7f519-6ca5-4425-98e2-7de6f0fc0f50-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh\" (UID: \"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:38.510014 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.509914 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/d9d7f519-6ca5-4425-98e2-7de6f0fc0f50-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh\" (UID: \"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:38.510014 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.509939 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r6kz\" (UniqueName: \"kubernetes.io/projected/d9d7f519-6ca5-4425-98e2-7de6f0fc0f50-kube-api-access-6r6kz\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh\" (UID: \"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:38.610697 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.610658 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9d7f519-6ca5-4425-98e2-7de6f0fc0f50-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh\" (UID: \"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:38.610697 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.610699 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9d7f519-6ca5-4425-98e2-7de6f0fc0f50-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh\" (UID: \"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:38.610969 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.610834 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9d7f519-6ca5-4425-98e2-7de6f0fc0f50-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh\" (UID: \"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 
18:00:38.610969 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.610894 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9d7f519-6ca5-4425-98e2-7de6f0fc0f50-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh\" (UID: \"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:38.610969 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.610920 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6r6kz\" (UniqueName: \"kubernetes.io/projected/d9d7f519-6ca5-4425-98e2-7de6f0fc0f50-kube-api-access-6r6kz\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh\" (UID: \"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:38.611152 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.611087 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9d7f519-6ca5-4425-98e2-7de6f0fc0f50-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh\" (UID: \"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:38.611152 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.611091 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9d7f519-6ca5-4425-98e2-7de6f0fc0f50-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh\" (UID: \"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:38.611312 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.611190 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/d9d7f519-6ca5-4425-98e2-7de6f0fc0f50-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh\" (UID: \"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:38.611400 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.611380 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9d7f519-6ca5-4425-98e2-7de6f0fc0f50-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh\" (UID: \"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:38.613059 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.613042 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9d7f519-6ca5-4425-98e2-7de6f0fc0f50-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh\" (UID: \"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:38.613249 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.613231 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9d7f519-6ca5-4425-98e2-7de6f0fc0f50-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh\" (UID: \"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:38.618943 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.618920 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r6kz\" (UniqueName: \"kubernetes.io/projected/d9d7f519-6ca5-4425-98e2-7de6f0fc0f50-kube-api-access-6r6kz\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh\" (UID: \"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 
18:00:38.736880 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.736853 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:38.865721 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:38.865694 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh"] Apr 20 18:00:38.867465 ip-10-0-135-49 kubenswrapper[2577]: W0420 18:00:38.867430 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9d7f519_6ca5_4425_98e2_7de6f0fc0f50.slice/crio-8020275d12c2a063b008a91db97d93de0b889069a86f875c0ba2356688397f2c WatchSource:0}: Error finding container 8020275d12c2a063b008a91db97d93de0b889069a86f875c0ba2356688397f2c: Status 404 returned error can't find the container with id 8020275d12c2a063b008a91db97d93de0b889069a86f875c0ba2356688397f2c Apr 20 18:00:39.397474 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:39.397428 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" event={"ID":"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50","Type":"ContainerStarted","Data":"8020275d12c2a063b008a91db97d93de0b889069a86f875c0ba2356688397f2c"} Apr 20 18:00:44.421281 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:44.421245 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" event={"ID":"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50","Type":"ContainerStarted","Data":"f172e6630d58d28f7794eac88525be9c670da7d535d05c02cb483dbba578d229"} Apr 20 18:00:49.444148 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:49.444063 2577 generic.go:358] "Generic (PLEG): container finished" podID="d9d7f519-6ca5-4425-98e2-7de6f0fc0f50" containerID="f172e6630d58d28f7794eac88525be9c670da7d535d05c02cb483dbba578d229" exitCode=0 Apr 20 18:00:49.444492 ip-10-0-135-49 kubenswrapper[2577]: I0420 
18:00:49.444137 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" event={"ID":"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50","Type":"ContainerDied","Data":"f172e6630d58d28f7794eac88525be9c670da7d535d05c02cb483dbba578d229"} Apr 20 18:00:54.464860 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:54.464817 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" event={"ID":"d9d7f519-6ca5-4425-98e2-7de6f0fc0f50","Type":"ContainerStarted","Data":"4a28bdccc4330fc0076c328df0c410306ff9de1fa13f36389b87f7aafb39a211"} Apr 20 18:00:54.465264 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:54.465145 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:00:54.486003 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:00:54.485935 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" podStartSLOduration=1.6995186850000001 podStartE2EDuration="16.485921622s" podCreationTimestamp="2026-04-20 18:00:38 +0000 UTC" firstStartedPulling="2026-04-20 18:00:38.869277079 +0000 UTC m=+740.654632297" lastFinishedPulling="2026-04-20 18:00:53.655680006 +0000 UTC m=+755.441035234" observedRunningTime="2026-04-20 18:00:54.482689521 +0000 UTC m=+756.268044761" watchObservedRunningTime="2026-04-20 18:00:54.485921622 +0000 UTC m=+756.271276861" Apr 20 18:01:04.330234 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.330197 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr"] Apr 20 18:01:04.367617 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.367584 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr"] Apr 20 18:01:04.367794 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.367728 2577 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.370496 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.370472 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 20 18:01:04.551262 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.551229 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac24e6f7-f49d-428b-9d83-1d893a5d9142-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8xdxr\" (UID: \"ac24e6f7-f49d-428b-9d83-1d893a5d9142\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.551262 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.551267 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac24e6f7-f49d-428b-9d83-1d893a5d9142-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8xdxr\" (UID: \"ac24e6f7-f49d-428b-9d83-1d893a5d9142\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.551514 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.551291 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ac24e6f7-f49d-428b-9d83-1d893a5d9142-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8xdxr\" (UID: \"ac24e6f7-f49d-428b-9d83-1d893a5d9142\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.551514 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.551313 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac24e6f7-f49d-428b-9d83-1d893a5d9142-kserve-provision-location\") pod 
\"e2e-trlp-test-simulated-kserve-84db68679b-8xdxr\" (UID: \"ac24e6f7-f49d-428b-9d83-1d893a5d9142\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.551514 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.551431 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rccvl\" (UniqueName: \"kubernetes.io/projected/ac24e6f7-f49d-428b-9d83-1d893a5d9142-kube-api-access-rccvl\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8xdxr\" (UID: \"ac24e6f7-f49d-428b-9d83-1d893a5d9142\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.551514 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.551499 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ac24e6f7-f49d-428b-9d83-1d893a5d9142-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8xdxr\" (UID: \"ac24e6f7-f49d-428b-9d83-1d893a5d9142\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.652475 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.652384 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rccvl\" (UniqueName: \"kubernetes.io/projected/ac24e6f7-f49d-428b-9d83-1d893a5d9142-kube-api-access-rccvl\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8xdxr\" (UID: \"ac24e6f7-f49d-428b-9d83-1d893a5d9142\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.652475 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.652445 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ac24e6f7-f49d-428b-9d83-1d893a5d9142-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8xdxr\" (UID: \"ac24e6f7-f49d-428b-9d83-1d893a5d9142\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.652720 ip-10-0-135-49 
kubenswrapper[2577]: I0420 18:01:04.652483 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac24e6f7-f49d-428b-9d83-1d893a5d9142-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8xdxr\" (UID: \"ac24e6f7-f49d-428b-9d83-1d893a5d9142\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.652720 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.652501 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac24e6f7-f49d-428b-9d83-1d893a5d9142-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8xdxr\" (UID: \"ac24e6f7-f49d-428b-9d83-1d893a5d9142\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.652720 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.652525 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ac24e6f7-f49d-428b-9d83-1d893a5d9142-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8xdxr\" (UID: \"ac24e6f7-f49d-428b-9d83-1d893a5d9142\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.652720 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.652553 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac24e6f7-f49d-428b-9d83-1d893a5d9142-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8xdxr\" (UID: \"ac24e6f7-f49d-428b-9d83-1d893a5d9142\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.652981 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.652936 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac24e6f7-f49d-428b-9d83-1d893a5d9142-model-cache\") pod 
\"e2e-trlp-test-simulated-kserve-84db68679b-8xdxr\" (UID: \"ac24e6f7-f49d-428b-9d83-1d893a5d9142\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.652981 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.652954 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ac24e6f7-f49d-428b-9d83-1d893a5d9142-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8xdxr\" (UID: \"ac24e6f7-f49d-428b-9d83-1d893a5d9142\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.653108 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.653030 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac24e6f7-f49d-428b-9d83-1d893a5d9142-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8xdxr\" (UID: \"ac24e6f7-f49d-428b-9d83-1d893a5d9142\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.654951 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.654930 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ac24e6f7-f49d-428b-9d83-1d893a5d9142-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8xdxr\" (UID: \"ac24e6f7-f49d-428b-9d83-1d893a5d9142\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.655077 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.655059 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac24e6f7-f49d-428b-9d83-1d893a5d9142-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8xdxr\" (UID: \"ac24e6f7-f49d-428b-9d83-1d893a5d9142\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.667823 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.667803 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-rccvl\" (UniqueName: \"kubernetes.io/projected/ac24e6f7-f49d-428b-9d83-1d893a5d9142-kube-api-access-rccvl\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8xdxr\" (UID: \"ac24e6f7-f49d-428b-9d83-1d893a5d9142\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.679132 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.679112 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:04.806042 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:04.806014 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr"] Apr 20 18:01:04.807255 ip-10-0-135-49 kubenswrapper[2577]: W0420 18:01:04.807226 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac24e6f7_f49d_428b_9d83_1d893a5d9142.slice/crio-2a9e291afef0da9977f24e65d9b00de9f0880d7a0727ceb97db1e47fd8f73595 WatchSource:0}: Error finding container 2a9e291afef0da9977f24e65d9b00de9f0880d7a0727ceb97db1e47fd8f73595: Status 404 returned error can't find the container with id 2a9e291afef0da9977f24e65d9b00de9f0880d7a0727ceb97db1e47fd8f73595 Apr 20 18:01:05.482445 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:05.482414 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh" Apr 20 18:01:05.508963 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:05.508922 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" event={"ID":"ac24e6f7-f49d-428b-9d83-1d893a5d9142","Type":"ContainerStarted","Data":"005f88e5f83d92267fee57c83c13b0d4347bd6919027e5973f374ec2780a2cb7"} Apr 20 18:01:05.508963 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:05.508967 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" event={"ID":"ac24e6f7-f49d-428b-9d83-1d893a5d9142","Type":"ContainerStarted","Data":"2a9e291afef0da9977f24e65d9b00de9f0880d7a0727ceb97db1e47fd8f73595"} Apr 20 18:01:08.137629 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.137593 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx"] Apr 20 18:01:08.140972 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.140955 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.143416 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.143397 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 20 18:01:08.150641 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.150616 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx"] Apr 20 18:01:08.287403 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.287364 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4345ca71-ef4b-48a8-a77b-3fc556be13e7-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx\" (UID: \"4345ca71-ef4b-48a8-a77b-3fc556be13e7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.287569 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.287416 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4345ca71-ef4b-48a8-a77b-3fc556be13e7-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx\" (UID: \"4345ca71-ef4b-48a8-a77b-3fc556be13e7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.287569 ip-10-0-135-49 
kubenswrapper[2577]: I0420 18:01:08.287497 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4345ca71-ef4b-48a8-a77b-3fc556be13e7-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx\" (UID: \"4345ca71-ef4b-48a8-a77b-3fc556be13e7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.287643 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.287564 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4345ca71-ef4b-48a8-a77b-3fc556be13e7-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx\" (UID: \"4345ca71-ef4b-48a8-a77b-3fc556be13e7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.287643 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.287632 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdck6\" (UniqueName: \"kubernetes.io/projected/4345ca71-ef4b-48a8-a77b-3fc556be13e7-kube-api-access-kdck6\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx\" (UID: \"4345ca71-ef4b-48a8-a77b-3fc556be13e7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.287721 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.287706 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4345ca71-ef4b-48a8-a77b-3fc556be13e7-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx\" (UID: \"4345ca71-ef4b-48a8-a77b-3fc556be13e7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.388450 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.388355 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/4345ca71-ef4b-48a8-a77b-3fc556be13e7-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx\" (UID: \"4345ca71-ef4b-48a8-a77b-3fc556be13e7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.388450 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.388405 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4345ca71-ef4b-48a8-a77b-3fc556be13e7-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx\" (UID: \"4345ca71-ef4b-48a8-a77b-3fc556be13e7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.388450 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.388422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4345ca71-ef4b-48a8-a77b-3fc556be13e7-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx\" (UID: \"4345ca71-ef4b-48a8-a77b-3fc556be13e7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.388732 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.388455 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdck6\" (UniqueName: \"kubernetes.io/projected/4345ca71-ef4b-48a8-a77b-3fc556be13e7-kube-api-access-kdck6\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx\" (UID: \"4345ca71-ef4b-48a8-a77b-3fc556be13e7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.388732 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.388492 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4345ca71-ef4b-48a8-a77b-3fc556be13e7-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx\" (UID: \"4345ca71-ef4b-48a8-a77b-3fc556be13e7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.388732 ip-10-0-135-49 
kubenswrapper[2577]: I0420 18:01:08.388522 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4345ca71-ef4b-48a8-a77b-3fc556be13e7-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx\" (UID: \"4345ca71-ef4b-48a8-a77b-3fc556be13e7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.388912 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.388848 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4345ca71-ef4b-48a8-a77b-3fc556be13e7-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx\" (UID: \"4345ca71-ef4b-48a8-a77b-3fc556be13e7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.388912 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.388897 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4345ca71-ef4b-48a8-a77b-3fc556be13e7-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx\" (UID: \"4345ca71-ef4b-48a8-a77b-3fc556be13e7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.389020 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.388935 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4345ca71-ef4b-48a8-a77b-3fc556be13e7-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx\" (UID: \"4345ca71-ef4b-48a8-a77b-3fc556be13e7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.390869 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.390838 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4345ca71-ef4b-48a8-a77b-3fc556be13e7-dshm\") pod 
\"e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx\" (UID: \"4345ca71-ef4b-48a8-a77b-3fc556be13e7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.391094 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.391077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4345ca71-ef4b-48a8-a77b-3fc556be13e7-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx\" (UID: \"4345ca71-ef4b-48a8-a77b-3fc556be13e7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.396538 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.396514 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdck6\" (UniqueName: \"kubernetes.io/projected/4345ca71-ef4b-48a8-a77b-3fc556be13e7-kube-api-access-kdck6\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx\" (UID: \"4345ca71-ef4b-48a8-a77b-3fc556be13e7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.453304 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.453268 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:08.594785 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:08.594757 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx"] Apr 20 18:01:08.595139 ip-10-0-135-49 kubenswrapper[2577]: W0420 18:01:08.595112 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4345ca71_ef4b_48a8_a77b_3fc556be13e7.slice/crio-bc5a4efb64010551f000d050f01c0916f040bef8bfa80f3c2d54e6f87471216e WatchSource:0}: Error finding container bc5a4efb64010551f000d050f01c0916f040bef8bfa80f3c2d54e6f87471216e: Status 404 returned error can't find the container with id bc5a4efb64010551f000d050f01c0916f040bef8bfa80f3c2d54e6f87471216e Apr 20 18:01:09.528827 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:09.528787 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" event={"ID":"4345ca71-ef4b-48a8-a77b-3fc556be13e7","Type":"ContainerStarted","Data":"618fa0db75a52d243a3c6f5bcd3a90828c4263d879c8c4d6a395cca250950f1f"} Apr 20 18:01:09.528827 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:09.528829 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" event={"ID":"4345ca71-ef4b-48a8-a77b-3fc556be13e7","Type":"ContainerStarted","Data":"bc5a4efb64010551f000d050f01c0916f040bef8bfa80f3c2d54e6f87471216e"} Apr 20 18:01:10.534943 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:10.534911 2577 generic.go:358] "Generic (PLEG): container finished" podID="ac24e6f7-f49d-428b-9d83-1d893a5d9142" containerID="005f88e5f83d92267fee57c83c13b0d4347bd6919027e5973f374ec2780a2cb7" exitCode=0 Apr 20 18:01:10.535401 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:10.535015 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" 
event={"ID":"ac24e6f7-f49d-428b-9d83-1d893a5d9142","Type":"ContainerDied","Data":"005f88e5f83d92267fee57c83c13b0d4347bd6919027e5973f374ec2780a2cb7"} Apr 20 18:01:11.541653 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:11.541616 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" event={"ID":"ac24e6f7-f49d-428b-9d83-1d893a5d9142","Type":"ContainerStarted","Data":"b27d85f85e98a9f524a3ca04c13bfef6c81ef9ab389dea2168a51f392a43bc14"} Apr 20 18:01:11.542061 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:11.541838 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:11.560758 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:11.560704 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" podStartSLOduration=7.257556491 podStartE2EDuration="7.560689327s" podCreationTimestamp="2026-04-20 18:01:04 +0000 UTC" firstStartedPulling="2026-04-20 18:01:10.535818598 +0000 UTC m=+772.321173816" lastFinishedPulling="2026-04-20 18:01:10.838951426 +0000 UTC m=+772.624306652" observedRunningTime="2026-04-20 18:01:11.558396258 +0000 UTC m=+773.343751498" watchObservedRunningTime="2026-04-20 18:01:11.560689327 +0000 UTC m=+773.346044599" Apr 20 18:01:14.555213 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:14.555179 2577 generic.go:358] "Generic (PLEG): container finished" podID="4345ca71-ef4b-48a8-a77b-3fc556be13e7" containerID="618fa0db75a52d243a3c6f5bcd3a90828c4263d879c8c4d6a395cca250950f1f" exitCode=0 Apr 20 18:01:14.555607 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:14.555250 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" event={"ID":"4345ca71-ef4b-48a8-a77b-3fc556be13e7","Type":"ContainerDied","Data":"618fa0db75a52d243a3c6f5bcd3a90828c4263d879c8c4d6a395cca250950f1f"} Apr 20 
18:01:17.567939 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:17.567903 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" event={"ID":"4345ca71-ef4b-48a8-a77b-3fc556be13e7","Type":"ContainerStarted","Data":"a523f2726397b8634fa1670a8f6e7764ec27593c36c8e8c5d9a474c0bf46e8e3"} Apr 20 18:01:17.568604 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:17.568146 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:01:17.588863 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:17.588814 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" podStartSLOduration=7.183955929 podStartE2EDuration="9.58880146s" podCreationTimestamp="2026-04-20 18:01:08 +0000 UTC" firstStartedPulling="2026-04-20 18:01:14.555897262 +0000 UTC m=+776.341252479" lastFinishedPulling="2026-04-20 18:01:16.960742789 +0000 UTC m=+778.746098010" observedRunningTime="2026-04-20 18:01:17.587090673 +0000 UTC m=+779.372445907" watchObservedRunningTime="2026-04-20 18:01:17.58880146 +0000 UTC m=+779.374156703" Apr 20 18:01:22.558300 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:22.558265 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8xdxr" Apr 20 18:01:28.584811 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:01:28.584779 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx" Apr 20 18:02:06.282355 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:06.282317 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-864bb794df-qpc7j"] Apr 20 18:02:06.285699 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:06.285680 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-864bb794df-qpc7j" Apr 20 18:02:06.295688 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:06.295660 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-864bb794df-qpc7j"] Apr 20 18:02:06.310509 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:06.310485 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/7892b523-b102-4f01-80c8-ec1afd3caf8d-tls-cert\") pod \"authorino-864bb794df-qpc7j\" (UID: \"7892b523-b102-4f01-80c8-ec1afd3caf8d\") " pod="kuadrant-system/authorino-864bb794df-qpc7j" Apr 20 18:02:06.310640 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:06.310541 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flg4w\" (UniqueName: \"kubernetes.io/projected/7892b523-b102-4f01-80c8-ec1afd3caf8d-kube-api-access-flg4w\") pod \"authorino-864bb794df-qpc7j\" (UID: \"7892b523-b102-4f01-80c8-ec1afd3caf8d\") " pod="kuadrant-system/authorino-864bb794df-qpc7j" Apr 20 18:02:06.411523 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:06.411489 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/7892b523-b102-4f01-80c8-ec1afd3caf8d-tls-cert\") pod \"authorino-864bb794df-qpc7j\" (UID: \"7892b523-b102-4f01-80c8-ec1afd3caf8d\") " pod="kuadrant-system/authorino-864bb794df-qpc7j" Apr 20 18:02:06.411699 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:06.411548 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flg4w\" (UniqueName: \"kubernetes.io/projected/7892b523-b102-4f01-80c8-ec1afd3caf8d-kube-api-access-flg4w\") pod \"authorino-864bb794df-qpc7j\" (UID: \"7892b523-b102-4f01-80c8-ec1afd3caf8d\") " pod="kuadrant-system/authorino-864bb794df-qpc7j" Apr 20 18:02:06.413922 ip-10-0-135-49 kubenswrapper[2577]: I0420 
18:02:06.413901 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/7892b523-b102-4f01-80c8-ec1afd3caf8d-tls-cert\") pod \"authorino-864bb794df-qpc7j\" (UID: \"7892b523-b102-4f01-80c8-ec1afd3caf8d\") " pod="kuadrant-system/authorino-864bb794df-qpc7j" Apr 20 18:02:06.419526 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:06.419502 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flg4w\" (UniqueName: \"kubernetes.io/projected/7892b523-b102-4f01-80c8-ec1afd3caf8d-kube-api-access-flg4w\") pod \"authorino-864bb794df-qpc7j\" (UID: \"7892b523-b102-4f01-80c8-ec1afd3caf8d\") " pod="kuadrant-system/authorino-864bb794df-qpc7j" Apr 20 18:02:06.595836 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:06.595758 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-864bb794df-qpc7j" Apr 20 18:02:06.725724 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:06.725698 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-864bb794df-qpc7j"] Apr 20 18:02:06.727527 ip-10-0-135-49 kubenswrapper[2577]: W0420 18:02:06.727489 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7892b523_b102_4f01_80c8_ec1afd3caf8d.slice/crio-12b6ff841897a970ba7ec2f7d72f23c64f198215e9599edf3541da42c0e70fbc WatchSource:0}: Error finding container 12b6ff841897a970ba7ec2f7d72f23c64f198215e9599edf3541da42c0e70fbc: Status 404 returned error can't find the container with id 12b6ff841897a970ba7ec2f7d72f23c64f198215e9599edf3541da42c0e70fbc Apr 20 18:02:06.755866 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:06.755834 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-864bb794df-qpc7j" 
event={"ID":"7892b523-b102-4f01-80c8-ec1afd3caf8d","Type":"ContainerStarted","Data":"12b6ff841897a970ba7ec2f7d72f23c64f198215e9599edf3541da42c0e70fbc"} Apr 20 18:02:07.761392 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:07.761352 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-864bb794df-qpc7j" event={"ID":"7892b523-b102-4f01-80c8-ec1afd3caf8d","Type":"ContainerStarted","Data":"2d6bd784d70551d6f48ed5dc21e6f1dcc58bcc1d01744a9897ce832f3c765e35"} Apr 20 18:02:07.782420 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:07.782357 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-864bb794df-qpc7j" podStartSLOduration=1.396628691 podStartE2EDuration="1.782338726s" podCreationTimestamp="2026-04-20 18:02:06 +0000 UTC" firstStartedPulling="2026-04-20 18:02:06.728643594 +0000 UTC m=+828.513998810" lastFinishedPulling="2026-04-20 18:02:07.11435362 +0000 UTC m=+828.899708845" observedRunningTime="2026-04-20 18:02:07.781601094 +0000 UTC m=+829.566956345" watchObservedRunningTime="2026-04-20 18:02:07.782338726 +0000 UTC m=+829.567693980" Apr 20 18:02:07.815931 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:07.815893 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-cbb499d4d-wwhmk"] Apr 20 18:02:07.816224 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:07.816190 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-cbb499d4d-wwhmk" podUID="22a83f7a-9bad-4b2c-bd25-ad0994772b41" containerName="authorino" containerID="cri-o://6ff60c61137ff870ea1a497d7a76bb409ea395170f4f8f187b4ae269f839c5ad" gracePeriod=30 Apr 20 18:02:08.057435 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:08.057413 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-cbb499d4d-wwhmk" Apr 20 18:02:08.127463 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:08.127427 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/22a83f7a-9bad-4b2c-bd25-ad0994772b41-tls-cert\") pod \"22a83f7a-9bad-4b2c-bd25-ad0994772b41\" (UID: \"22a83f7a-9bad-4b2c-bd25-ad0994772b41\") " Apr 20 18:02:08.127645 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:08.127474 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbfjh\" (UniqueName: \"kubernetes.io/projected/22a83f7a-9bad-4b2c-bd25-ad0994772b41-kube-api-access-cbfjh\") pod \"22a83f7a-9bad-4b2c-bd25-ad0994772b41\" (UID: \"22a83f7a-9bad-4b2c-bd25-ad0994772b41\") " Apr 20 18:02:08.129559 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:08.129532 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22a83f7a-9bad-4b2c-bd25-ad0994772b41-kube-api-access-cbfjh" (OuterVolumeSpecName: "kube-api-access-cbfjh") pod "22a83f7a-9bad-4b2c-bd25-ad0994772b41" (UID: "22a83f7a-9bad-4b2c-bd25-ad0994772b41"). InnerVolumeSpecName "kube-api-access-cbfjh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 18:02:08.137727 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:08.137701 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22a83f7a-9bad-4b2c-bd25-ad0994772b41-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "22a83f7a-9bad-4b2c-bd25-ad0994772b41" (UID: "22a83f7a-9bad-4b2c-bd25-ad0994772b41"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 18:02:08.228718 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:08.228684 2577 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/22a83f7a-9bad-4b2c-bd25-ad0994772b41-tls-cert\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 18:02:08.228718 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:08.228712 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cbfjh\" (UniqueName: \"kubernetes.io/projected/22a83f7a-9bad-4b2c-bd25-ad0994772b41-kube-api-access-cbfjh\") on node \"ip-10-0-135-49.ec2.internal\" DevicePath \"\"" Apr 20 18:02:08.766511 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:08.766482 2577 generic.go:358] "Generic (PLEG): container finished" podID="22a83f7a-9bad-4b2c-bd25-ad0994772b41" containerID="6ff60c61137ff870ea1a497d7a76bb409ea395170f4f8f187b4ae269f839c5ad" exitCode=0 Apr 20 18:02:08.767015 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:08.766535 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-cbb499d4d-wwhmk" Apr 20 18:02:08.767015 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:08.766568 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-cbb499d4d-wwhmk" event={"ID":"22a83f7a-9bad-4b2c-bd25-ad0994772b41","Type":"ContainerDied","Data":"6ff60c61137ff870ea1a497d7a76bb409ea395170f4f8f187b4ae269f839c5ad"} Apr 20 18:02:08.767015 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:08.766610 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-cbb499d4d-wwhmk" event={"ID":"22a83f7a-9bad-4b2c-bd25-ad0994772b41","Type":"ContainerDied","Data":"71eac7503aa547305a9f0df504ffbbfb1648ae623dcf6a519359f22943e0a53f"} Apr 20 18:02:08.767015 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:08.766634 2577 scope.go:117] "RemoveContainer" containerID="6ff60c61137ff870ea1a497d7a76bb409ea395170f4f8f187b4ae269f839c5ad" Apr 20 18:02:08.777424 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:08.777399 2577 scope.go:117] "RemoveContainer" containerID="6ff60c61137ff870ea1a497d7a76bb409ea395170f4f8f187b4ae269f839c5ad" Apr 20 18:02:08.777945 ip-10-0-135-49 kubenswrapper[2577]: E0420 18:02:08.777923 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ff60c61137ff870ea1a497d7a76bb409ea395170f4f8f187b4ae269f839c5ad\": container with ID starting with 6ff60c61137ff870ea1a497d7a76bb409ea395170f4f8f187b4ae269f839c5ad not found: ID does not exist" containerID="6ff60c61137ff870ea1a497d7a76bb409ea395170f4f8f187b4ae269f839c5ad" Apr 20 18:02:08.778076 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:08.777953 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ff60c61137ff870ea1a497d7a76bb409ea395170f4f8f187b4ae269f839c5ad"} err="failed to get container status \"6ff60c61137ff870ea1a497d7a76bb409ea395170f4f8f187b4ae269f839c5ad\": rpc error: code = NotFound 
desc = could not find container \"6ff60c61137ff870ea1a497d7a76bb409ea395170f4f8f187b4ae269f839c5ad\": container with ID starting with 6ff60c61137ff870ea1a497d7a76bb409ea395170f4f8f187b4ae269f839c5ad not found: ID does not exist" Apr 20 18:02:08.783359 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:08.783292 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-cbb499d4d-wwhmk"] Apr 20 18:02:08.786730 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:08.786704 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-cbb499d4d-wwhmk"] Apr 20 18:02:10.701060 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:02:10.701017 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22a83f7a-9bad-4b2c-bd25-ad0994772b41" path="/var/lib/kubelet/pods/22a83f7a-9bad-4b2c-bd25-ad0994772b41/volumes" Apr 20 18:03:19.985625 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:19.985599 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/ovn-acl-logging/0.log" Apr 20 18:03:19.987917 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:19.987896 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/ovn-acl-logging/0.log" Apr 20 18:03:35.413699 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:35.413668 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-864bb794df-qpc7j_7892b523-b102-4f01-80c8-ec1afd3caf8d/authorino/0.log" Apr 20 18:03:39.327373 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:39.327338 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-7fff6f9b47-n4vsv_626ec792-5a53-45a3-9054-448c6f918495/maas-api/0.log" Apr 20 18:03:39.826151 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:39.826120 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-b8c4c7886-lhhn7_2d761368-405c-4e31-ab43-47d4afe6b6e2/manager/0.log" Apr 20 18:03:40.831033 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:40.831003 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp_6194d1ad-0fa3-4f55-b3fe-24f471e54b15/util/0.log" Apr 20 18:03:40.837169 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:40.837148 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp_6194d1ad-0fa3-4f55-b3fe-24f471e54b15/pull/0.log" Apr 20 18:03:40.842556 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:40.842523 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp_6194d1ad-0fa3-4f55-b3fe-24f471e54b15/extract/0.log" Apr 20 18:03:40.946327 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:40.946293 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb_5328da85-247b-4778-89e8-12410591e7f3/pull/0.log" Apr 20 18:03:40.952450 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:40.952419 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb_5328da85-247b-4778-89e8-12410591e7f3/extract/0.log" Apr 20 18:03:40.957935 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:40.957918 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb_5328da85-247b-4778-89e8-12410591e7f3/util/0.log" Apr 20 18:03:41.062233 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:41.062202 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r_6c1b2d85-15bd-467f-8571-646e3db81e8a/util/0.log" Apr 20 18:03:41.068015 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:41.067980 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r_6c1b2d85-15bd-467f-8571-646e3db81e8a/pull/0.log" Apr 20 18:03:41.073604 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:41.073585 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r_6c1b2d85-15bd-467f-8571-646e3db81e8a/extract/0.log" Apr 20 18:03:41.175699 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:41.175614 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd_a395cb58-561d-4a66-a684-4e96d4cb3db5/extract/0.log" Apr 20 18:03:41.181323 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:41.181300 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd_a395cb58-561d-4a66-a684-4e96d4cb3db5/util/0.log" Apr 20 18:03:41.187238 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:41.187215 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd_a395cb58-561d-4a66-a684-4e96d4cb3db5/pull/0.log" Apr 20 18:03:41.298663 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:41.298635 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-864bb794df-qpc7j_7892b523-b102-4f01-80c8-ec1afd3caf8d/authorino/0.log" Apr 20 18:03:41.608280 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:41.608252 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-tj2t6_5b245b6c-efe4-4d57-bce9-c26cf49dab9f/kuadrant-console-plugin/0.log" Apr 20 18:03:42.556254 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:42.556220 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-554dd5dd7d-qnpw7_57666fdc-66c3-46aa-b04d-f5251dea0b08/kube-auth-proxy/0.log" Apr 20 18:03:43.181254 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:43.181215 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh_d9d7f519-6ca5-4425-98e2-7de6f0fc0f50/storage-initializer/0.log" Apr 20 18:03:43.188098 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:43.188067 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-j7frh_d9d7f519-6ca5-4425-98e2-7de6f0fc0f50/main/0.log" Apr 20 18:03:43.292820 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:43.292783 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx_4345ca71-ef4b-48a8-a77b-3fc556be13e7/storage-initializer/0.log" Apr 20 18:03:43.299914 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:43.299891 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-2ddxx_4345ca71-ef4b-48a8-a77b-3fc556be13e7/main/0.log" Apr 20 18:03:43.404960 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:43.404932 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-8xdxr_ac24e6f7-f49d-428b-9d83-1d893a5d9142/storage-initializer/0.log" Apr 20 18:03:43.412345 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:43.412311 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-8xdxr_ac24e6f7-f49d-428b-9d83-1d893a5d9142/main/0.log" Apr 20 18:03:50.411955 ip-10-0-135-49 kubenswrapper[2577]: I0420 
18:03:50.411923 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-h7dtn_542dd1d3-eb84-486c-a8ee-46b247e169f8/global-pull-secret-syncer/0.log" Apr 20 18:03:50.565346 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:50.565312 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-l85wm_0c2d21f8-03e3-423b-a4e7-4ab1bd770001/konnectivity-agent/0.log" Apr 20 18:03:50.610777 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:50.610742 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-49.ec2.internal_99727bb875db005dc5ab40f5d2dc2824/haproxy/0.log" Apr 20 18:03:54.521376 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:54.521345 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp_6194d1ad-0fa3-4f55-b3fe-24f471e54b15/extract/0.log" Apr 20 18:03:54.547454 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:54.547425 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp_6194d1ad-0fa3-4f55-b3fe-24f471e54b15/util/0.log" Apr 20 18:03:54.569977 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:54.569955 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kxpmp_6194d1ad-0fa3-4f55-b3fe-24f471e54b15/pull/0.log" Apr 20 18:03:54.597493 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:54.597465 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb_5328da85-247b-4778-89e8-12410591e7f3/extract/0.log" Apr 20 18:03:54.618512 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:54.618490 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb_5328da85-247b-4778-89e8-12410591e7f3/util/0.log" Apr 20 18:03:54.642442 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:54.642419 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0j89tb_5328da85-247b-4778-89e8-12410591e7f3/pull/0.log" Apr 20 18:03:54.671805 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:54.671783 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r_6c1b2d85-15bd-467f-8571-646e3db81e8a/extract/0.log" Apr 20 18:03:54.694081 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:54.694062 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r_6c1b2d85-15bd-467f-8571-646e3db81e8a/util/0.log" Apr 20 18:03:54.714874 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:54.714849 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73rkc6r_6c1b2d85-15bd-467f-8571-646e3db81e8a/pull/0.log" Apr 20 18:03:54.756177 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:54.756153 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd_a395cb58-561d-4a66-a684-4e96d4cb3db5/extract/0.log" Apr 20 18:03:54.784102 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:54.784046 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd_a395cb58-561d-4a66-a684-4e96d4cb3db5/util/0.log" Apr 20 18:03:54.812472 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:54.812446 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cpdhd_a395cb58-561d-4a66-a684-4e96d4cb3db5/pull/0.log" Apr 20 18:03:54.849185 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:54.849162 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-864bb794df-qpc7j_7892b523-b102-4f01-80c8-ec1afd3caf8d/authorino/0.log" Apr 20 18:03:54.930209 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:54.930183 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-tj2t6_5b245b6c-efe4-4d57-bce9-c26cf49dab9f/kuadrant-console-plugin/0.log" Apr 20 18:03:56.295388 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:56.295358 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4303a933-fab3-447a-9e7c-56cb1ac05945/alertmanager/0.log" Apr 20 18:03:56.319080 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:56.319051 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4303a933-fab3-447a-9e7c-56cb1ac05945/config-reloader/0.log" Apr 20 18:03:56.343597 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:56.343573 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4303a933-fab3-447a-9e7c-56cb1ac05945/kube-rbac-proxy-web/0.log" Apr 20 18:03:56.379444 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:56.379418 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4303a933-fab3-447a-9e7c-56cb1ac05945/kube-rbac-proxy/0.log" Apr 20 18:03:56.419699 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:56.419669 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4303a933-fab3-447a-9e7c-56cb1ac05945/kube-rbac-proxy-metric/0.log" Apr 20 18:03:56.474761 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:56.474731 2577 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4303a933-fab3-447a-9e7c-56cb1ac05945/prom-label-proxy/0.log" Apr 20 18:03:56.518396 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:56.518365 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4303a933-fab3-447a-9e7c-56cb1ac05945/init-config-reloader/0.log" Apr 20 18:03:56.563058 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:56.562964 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-wnkt8_50c85d10-1859-45c6-8b15-0c1dfc8e482e/cluster-monitoring-operator/0.log" Apr 20 18:03:56.677366 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:56.677337 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-59bcd769fc-52sq2_8a74d5e0-5721-4316-a286-11a620733ba1/metrics-server/0.log" Apr 20 18:03:56.753480 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:56.753449 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lbj72_2f24fb25-fcc9-42ab-a59b-fc3368b09772/node-exporter/0.log" Apr 20 18:03:56.775356 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:56.775331 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lbj72_2f24fb25-fcc9-42ab-a59b-fc3368b09772/kube-rbac-proxy/0.log" Apr 20 18:03:56.798852 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:56.798828 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lbj72_2f24fb25-fcc9-42ab-a59b-fc3368b09772/init-textfile/0.log" Apr 20 18:03:57.023935 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:57.023912 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8kwzq_1ef3feee-bbc8-4c86-8037-031d564a48f4/kube-rbac-proxy-main/0.log" Apr 20 18:03:57.046262 ip-10-0-135-49 
kubenswrapper[2577]: I0420 18:03:57.046235 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8kwzq_1ef3feee-bbc8-4c86-8037-031d564a48f4/kube-rbac-proxy-self/0.log"
Apr 20 18:03:57.067770 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:57.067745 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8kwzq_1ef3feee-bbc8-4c86-8037-031d564a48f4/openshift-state-metrics/0.log"
Apr 20 18:03:57.279680 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:57.279584 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-ch9xb_574c60f9-5fff-4eb3-9e05-52b18b8d24b5/prometheus-operator/0.log"
Apr 20 18:03:57.296849 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:57.296825 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-ch9xb_574c60f9-5fff-4eb3-9e05-52b18b8d24b5/kube-rbac-proxy/0.log"
Apr 20 18:03:57.321676 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:57.321647 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-7pr62_89e5f4b0-2504-4bc5-bac9-06cc9892666b/prometheus-operator-admission-webhook/0.log"
Apr 20 18:03:57.354911 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:57.354881 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-b54957944-z5ggp_297b207e-fe93-479b-8abb-11c125da9ff9/telemeter-client/0.log"
Apr 20 18:03:57.373657 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:57.373632 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-b54957944-z5ggp_297b207e-fe93-479b-8abb-11c125da9ff9/reload/0.log"
Apr 20 18:03:57.393601 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:57.393583 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-b54957944-z5ggp_297b207e-fe93-479b-8abb-11c125da9ff9/kube-rbac-proxy/0.log"
Apr 20 18:03:59.171829 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.171790 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"]
Apr 20 18:03:59.172488 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.172313 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22a83f7a-9bad-4b2c-bd25-ad0994772b41" containerName="authorino"
Apr 20 18:03:59.172488 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.172332 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="22a83f7a-9bad-4b2c-bd25-ad0994772b41" containerName="authorino"
Apr 20 18:03:59.172488 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.172421 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="22a83f7a-9bad-4b2c-bd25-ad0994772b41" containerName="authorino"
Apr 20 18:03:59.175827 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.175810 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:03:59.178140 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.178121 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dqzrr\"/\"kube-root-ca.crt\""
Apr 20 18:03:59.178334 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.178321 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dqzrr\"/\"openshift-service-ca.crt\""
Apr 20 18:03:59.179137 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.179123 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-dqzrr\"/\"default-dockercfg-mm66t\""
Apr 20 18:03:59.186745 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.186719 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"]
Apr 20 18:03:59.235069 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.235042 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4622858e-5a65-4d8b-baf9-df95358503c7-podres\") pod \"perf-node-gather-daemonset-fr8c9\" (UID: \"4622858e-5a65-4d8b-baf9-df95358503c7\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:03:59.235219 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.235077 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4622858e-5a65-4d8b-baf9-df95358503c7-sys\") pod \"perf-node-gather-daemonset-fr8c9\" (UID: \"4622858e-5a65-4d8b-baf9-df95358503c7\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:03:59.235219 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.235100 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4622858e-5a65-4d8b-baf9-df95358503c7-proc\") pod \"perf-node-gather-daemonset-fr8c9\" (UID: \"4622858e-5a65-4d8b-baf9-df95358503c7\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:03:59.235219 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.235121 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4622858e-5a65-4d8b-baf9-df95358503c7-lib-modules\") pod \"perf-node-gather-daemonset-fr8c9\" (UID: \"4622858e-5a65-4d8b-baf9-df95358503c7\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:03:59.235219 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.235183 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96pcx\" (UniqueName: \"kubernetes.io/projected/4622858e-5a65-4d8b-baf9-df95358503c7-kube-api-access-96pcx\") pod \"perf-node-gather-daemonset-fr8c9\" (UID: \"4622858e-5a65-4d8b-baf9-df95358503c7\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:03:59.336410 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.336377 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4622858e-5a65-4d8b-baf9-df95358503c7-podres\") pod \"perf-node-gather-daemonset-fr8c9\" (UID: \"4622858e-5a65-4d8b-baf9-df95358503c7\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:03:59.336562 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.336419 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4622858e-5a65-4d8b-baf9-df95358503c7-sys\") pod \"perf-node-gather-daemonset-fr8c9\" (UID: \"4622858e-5a65-4d8b-baf9-df95358503c7\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:03:59.336562 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.336442 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4622858e-5a65-4d8b-baf9-df95358503c7-proc\") pod \"perf-node-gather-daemonset-fr8c9\" (UID: \"4622858e-5a65-4d8b-baf9-df95358503c7\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:03:59.336562 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.336488 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4622858e-5a65-4d8b-baf9-df95358503c7-lib-modules\") pod \"perf-node-gather-daemonset-fr8c9\" (UID: \"4622858e-5a65-4d8b-baf9-df95358503c7\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:03:59.336562 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.336505 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4622858e-5a65-4d8b-baf9-df95358503c7-sys\") pod \"perf-node-gather-daemonset-fr8c9\" (UID: \"4622858e-5a65-4d8b-baf9-df95358503c7\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:03:59.336562 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.336533 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4622858e-5a65-4d8b-baf9-df95358503c7-podres\") pod \"perf-node-gather-daemonset-fr8c9\" (UID: \"4622858e-5a65-4d8b-baf9-df95358503c7\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:03:59.336562 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.336535 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96pcx\" (UniqueName: \"kubernetes.io/projected/4622858e-5a65-4d8b-baf9-df95358503c7-kube-api-access-96pcx\") pod \"perf-node-gather-daemonset-fr8c9\" (UID: \"4622858e-5a65-4d8b-baf9-df95358503c7\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:03:59.336760 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.336578 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4622858e-5a65-4d8b-baf9-df95358503c7-proc\") pod \"perf-node-gather-daemonset-fr8c9\" (UID: \"4622858e-5a65-4d8b-baf9-df95358503c7\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:03:59.336760 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.336597 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4622858e-5a65-4d8b-baf9-df95358503c7-lib-modules\") pod \"perf-node-gather-daemonset-fr8c9\" (UID: \"4622858e-5a65-4d8b-baf9-df95358503c7\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:03:59.344905 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.344878 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96pcx\" (UniqueName: \"kubernetes.io/projected/4622858e-5a65-4d8b-baf9-df95358503c7-kube-api-access-96pcx\") pod \"perf-node-gather-daemonset-fr8c9\" (UID: \"4622858e-5a65-4d8b-baf9-df95358503c7\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:03:59.486612 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.486568 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:03:59.615453 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.615423 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"]
Apr 20 18:03:59.617070 ip-10-0-135-49 kubenswrapper[2577]: W0420 18:03:59.617044 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4622858e_5a65_4d8b_baf9_df95358503c7.slice/crio-50db975f573b31902b31136a69dde0494523f10d9d94fa2c6138375f23a3218e WatchSource:0}: Error finding container 50db975f573b31902b31136a69dde0494523f10d9d94fa2c6138375f23a3218e: Status 404 returned error can't find the container with id 50db975f573b31902b31136a69dde0494523f10d9d94fa2c6138375f23a3218e
Apr 20 18:03:59.722828 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.722804 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7464d6c56c-49d7x_7960da98-e4cc-49e6-9e3b-8ccd24ad9f44/console/0.log"
Apr 20 18:03:59.755571 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:03:59.755540 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-qc6vc_6e1318cc-1767-485b-b8cc-a2fbce6fcf9a/download-server/0.log"
Apr 20 18:04:00.206225 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:00.206193 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9" event={"ID":"4622858e-5a65-4d8b-baf9-df95358503c7","Type":"ContainerStarted","Data":"ebba98eee20b95f4938cce86fb00ae46ac581d801949c539267e9ee650b8b150"}
Apr 20 18:04:00.206225 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:00.206230 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9" event={"ID":"4622858e-5a65-4d8b-baf9-df95358503c7","Type":"ContainerStarted","Data":"50db975f573b31902b31136a69dde0494523f10d9d94fa2c6138375f23a3218e"}
Apr 20 18:04:00.206775 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:00.206309 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:04:00.221786 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:00.221729 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9" podStartSLOduration=1.221716947 podStartE2EDuration="1.221716947s" podCreationTimestamp="2026-04-20 18:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 18:04:00.220574126 +0000 UTC m=+942.005929404" watchObservedRunningTime="2026-04-20 18:04:00.221716947 +0000 UTC m=+942.007072220"
Apr 20 18:04:01.090304 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:01.090269 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cxr2b_bbbb6523-e9f1-4c90-a6e2-5288b46c08ad/dns/0.log"
Apr 20 18:04:01.112699 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:01.112671 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cxr2b_bbbb6523-e9f1-4c90-a6e2-5288b46c08ad/kube-rbac-proxy/0.log"
Apr 20 18:04:01.202732 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:01.202700 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6k2dk_bbc16f1f-e425-42a6-9352-b92e465bc2c2/dns-node-resolver/0.log"
Apr 20 18:04:01.696781 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:01.696748 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6dlf2_7b143fe4-9d02-4ed6-a139-f8b9c51e336d/node-ca/0.log"
Apr 20 18:04:02.615259 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:02.615228 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-554dd5dd7d-qnpw7_57666fdc-66c3-46aa-b04d-f5251dea0b08/kube-auth-proxy/0.log"
Apr 20 18:04:03.236118 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:03.236089 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-hxnms_07e93993-3b0e-409c-9665-f091ef7a8e5a/serve-healthcheck-canary/0.log"
Apr 20 18:04:03.817352 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:03.817316 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-5m2k9_f2230508-9373-4dc2-b94b-53c90c805046/insights-operator/1.log"
Apr 20 18:04:03.817809 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:03.817788 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-5m2k9_f2230508-9373-4dc2-b94b-53c90c805046/insights-operator/0.log"
Apr 20 18:04:03.838226 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:03.838196 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2z57x_7ac46c7c-57b5-4d37-a219-aade31435133/kube-rbac-proxy/0.log"
Apr 20 18:04:03.859111 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:03.859085 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2z57x_7ac46c7c-57b5-4d37-a219-aade31435133/exporter/0.log"
Apr 20 18:04:03.882545 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:03.882524 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2z57x_7ac46c7c-57b5-4d37-a219-aade31435133/extractor/0.log"
Apr 20 18:04:05.869843 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:05.869815 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-7fff6f9b47-n4vsv_626ec792-5a53-45a3-9054-448c6f918495/maas-api/0.log"
Apr 20 18:04:06.002753 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:06.002721 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-b8c4c7886-lhhn7_2d761368-405c-4e31-ab43-47d4afe6b6e2/manager/0.log"
Apr 20 18:04:06.219844 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:06.219817 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fr8c9"
Apr 20 18:04:07.159032 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:07.159001 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7589d7b74d-nkcb6_c78dcf0a-c37b-40b4-a0de-c5f4a69890e2/manager/0.log"
Apr 20 18:04:13.103215 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:13.103183 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6tdx8_774bfb19-2861-479a-a336-756a6a8d2926/kube-multus/0.log"
Apr 20 18:04:13.490237 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:13.490209 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m2q9r_cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee/kube-multus-additional-cni-plugins/0.log"
Apr 20 18:04:13.515928 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:13.515898 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m2q9r_cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee/egress-router-binary-copy/0.log"
Apr 20 18:04:13.544464 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:13.544439 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m2q9r_cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee/cni-plugins/0.log"
Apr 20 18:04:13.566538 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:13.566507 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m2q9r_cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee/bond-cni-plugin/0.log"
Apr 20 18:04:13.588708 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:13.588678 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m2q9r_cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee/routeoverride-cni/0.log"
Apr 20 18:04:13.614109 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:13.614076 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m2q9r_cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee/whereabouts-cni-bincopy/0.log"
Apr 20 18:04:13.640261 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:13.640224 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m2q9r_cf66c5a0-9dc4-4b6b-ac5f-ceb4cd1940ee/whereabouts-cni/0.log"
Apr 20 18:04:13.800581 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:13.800490 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-skq27_14ea9252-57e5-4e09-9c9e-33d96e94d56f/network-metrics-daemon/0.log"
Apr 20 18:04:13.825393 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:13.825368 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-skq27_14ea9252-57e5-4e09-9c9e-33d96e94d56f/kube-rbac-proxy/0.log"
Apr 20 18:04:14.906094 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:14.906061 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/ovn-controller/0.log"
Apr 20 18:04:14.922237 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:14.922148 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/ovn-acl-logging/0.log"
Apr 20 18:04:14.931219 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:14.931196 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/ovn-acl-logging/1.log"
Apr 20 18:04:14.955721 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:14.955691 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/kube-rbac-proxy-node/0.log"
Apr 20 18:04:14.977063 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:14.977039 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 18:04:14.994400 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:14.994379 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/northd/0.log"
Apr 20 18:04:15.016769 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:15.016747 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/nbdb/0.log"
Apr 20 18:04:15.037098 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:15.037079 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/sbdb/0.log"
Apr 20 18:04:15.210049 ip-10-0-135-49 kubenswrapper[2577]: I0420 18:04:15.209962 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxb8f_2c38c27a-adb3-46fb-9409-cec659c7a3c1/ovnkube-controller/0.log"